Search for publications in DiVA
Publications (9 of 9)
Elmquist, E. (2025). Sensibly Sound: Human-Centered Integration of Sonification and Visualization. (Doctoral dissertation). Linköping: Linköping University Electronic Press
Sensibly Sound: Human-Centered Integration of Sonification and Visualization
2025 (English)Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

With the increased creation of and access to data, and growing demands placed on humans in analytical and decision-making processes, there is a need to further support perceptual and cognitive abilities in these processes. One approach is to leverage additional sensory systems, such as the auditory system, to increase information retention and build a more comprehensive understanding of a dataset or situation, while also engaging the user through more senses. Audiovisual data interfaces make it possible to distribute data variables across the two senses to reduce the risk of cognitive overload, or to highlight specific data variables by mapping them to both sensory modalities. However, the success of an audiovisual data interface depends on how the two senses are integrated and how this integration is utilized in the resulting interface.

This thesis contributes to sonification research by taking a human-centered approach to integrating sonification and visualization. The human-centered approach involved working with domain experts and users during the design process of the sonification, and creating perceptually motivated designs by drawing on how the auditory and visual systems complement each other and how they are integrated in cognition for sense-making.

The thesis contains six studies that explored the integration of sonification and visualization, ranging from literature surveys to design-oriented studies. A state-of-the-art report surveyed the integration of sonification and visualization to offer an introduction to the field for new and current practitioners. In another study, a conceptual framework for scene analysis was developed to support the design and analysis of audiovisual data representations. The remaining studies were design-oriented, each focusing on a specific aspect of how sonification can complement visualization, depending on the domain and tasks of the study. Beyond providing concrete examples of audiovisual integration, the main contributions of the design-oriented studies lie in their evaluation results, presented as design recommendations. These include the use of redundant mappings for multi-dimensional data analysis, and the consideration of subjective differences among domain experts for situational awareness support in air traffic control. The design-oriented studies also compared sonification designs: one study showed a potential trade-off between informative and pleasant designs for astronomy science communication, and another showed that concrete sonification designs can complement abstract visualizations. Lastly, the thesis provides design considerations by comparing the level of redundancy, indexicality, and complexity of each sonification design through the developed conceptual framework. Overall, this thesis offers motivated recommendations for the integration of sonification and visualization.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2025. p. 76
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 2422
Keywords
Sonification, Visualization, Audiovisual integration, Human-Computer Interaction
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:liu:diva-212494 (URN)10.3384/9789180759144 (DOI)9789180759137 (ISBN)9789180759144 (ISBN)
Public defence
2025-04-25, Dome, Visualiseringscenter C, Campus Norrköping, Norrköping, 13:15 (English)
Opponent
Supervisors
Available from: 2025-03-21 Created: 2025-03-21 Last updated: 2025-03-21Bibliographically approved
Elmquist, E., Ejdbo, M., Bock, A., Thaler, D. S., Ynnerman, A. & Rönnberg, N. (2024). Birdsongification: Contextual and Complementary Sonification for Biology Visualization. Paper presented at 29th International Conference on Auditory Display. Proceedings of the International Conference on Auditory Display, 34-41
Birdsongification: Contextual and Complementary Sonification for Biology Visualization
2024 (English)In: Proceedings of the International Conference on Auditory Display, ISSN 1093-9547, E-ISSN 2168-5126, p. 34-41Article in journal (Refereed) Published
Abstract [en]

Choosing whether to represent data in an abstract or concrete manner through sonification generally depends on the nature of the dataset and the personal preference of the designer. For supporting a visualization with a high level of abstraction, a sonification can purposefully act as a complement by giving concrete contextual cues to the data representation through the use of auditory icons. This paper presents a case study of using bird songs as auditory icons to give context to a biology visualization, and explores how additional information about the bird species can be conveyed together with the auditory icons through parameter mapping sonification. The auditory icons are used as a foundation for conveying additional information about the dataset, either by creating a parametric auditory icon, or by adding a separate sonification that accompanies the auditory icon. A user evaluation was conducted to validate and compare the different sonification mappings. The results show a subjective difference in how participants perceived the sonifications, with participants preferring sonifications that had a concrete mapping design. The sonification approaches explored in this study have the potential to be applied to more general sonification designs.
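The "parametric auditory icon" idea above can be illustrated with a simple parameter mapping: the recorded bird song remains the icon, while a data variable modulates one of its playback parameters. A minimal sketch, assuming illustrative value ranges; the function names and the size-to-rate mapping are hypothetical, not taken from the paper:

```python
def linmap(value, in_min, in_max, out_min, out_max):
    """Linearly map a data value from an input range to a parameter range."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def playback_rate_for_size(size_cm, size_range=(10.0, 100.0),
                           rate_range=(1.5, 0.5)):
    """Hypothetical mapping: larger birds get a slower (lower-pitched)
    playback of their recorded song; values outside the range are clamped."""
    size_cm = min(max(size_cm, size_range[0]), size_range[1])
    return linmap(size_cm, *size_range, *rate_range)
```

The paper's alternative, an accompanying sonification, would instead leave the icon untouched and play a separate synthesized stream whose parameters carry the extra data.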

National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-208608 (URN)10.21785/icad2024.006 (DOI)
Conference
29th International Conference on Auditory Display
Funder
Knut and Alice Wallenberg Foundation, 2019.0024
Available from: 2024-10-17 Created: 2024-10-17 Last updated: 2025-03-21Bibliographically approved
Rind, A., Enge, K., Iber, M., Rönnberg, N., Lenzi, S., Elmquist, E., . . . Aigner, W. (2024). Integrating Sonification and Visualization – But Why? Paper presented at EuroVis 2024.
Integrating Sonification and Visualization – But Why?
2024 (English)Conference paper, Oral presentation with published abstract (Other academic)
Abstract [en]

The research communities studying visualization and sonification share exceptionally similar goals, essentially aiming to make data interpretable to humans. One community uses visual representations, while the other employs auditory (non-speech) representations of data. Although the two communities have much in common, they developed mostly in parallel, with only comparatively few examples of integrated audiovisual data analysis idioms presented so far. This panel brings together researchers from both the fields of visualization and sonification to collectively discuss the question: ‘Integrating Sonification and Visualization – but why?’

In the panel discussion, we will tackle this question along two main hypotheses: Combining the modalities to (1) increase the “bandwidth from data to brain,” or (2) to increase a user’s personal engagement during the data analysis process. On the one hand, designers might aim to communicate more data in less time or gain more and more complex insights from the data by using a multi-modal display. This argument follows an understanding that two senses should be capable of processing more information than “just” one. On the other hand, sometimes, a more engaged analysis of the represented data is desirable. Engagement with data visualization stands as a crucial topic in numerous contexts within our field, encouraging “deeper” thinking by expert analysts, readers of data journalism articles, and students in educational settings. We hypothesize that integrating visualization with sonification holds the potential to enhance user engagement during analysis. Through the panel discussion, we want to delve into the spectrum between aiming for bandwidth and engagement, seeking to understand the opportunities and challenges of integrating sonification and visualization. 

National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-204068 (URN)10.2312/evt.20241100 (DOI)
Conference
EuroVis 2024
Available from: 2024-06-03 Created: 2024-06-03 Last updated: 2024-06-13
Enge, K., Elmquist, E., Caiola, V., Rönnberg, N., Rind, A., Iber, M., . . . Aigner, W. (2024). Open Your Ears and Take a Look: A State‐of‐the‐Art Report on the Integration of Sonification and Visualization. Paper presented at EuroVis 2024. Computer graphics forum (Print), 43(3), Article ID e15114.
Open Your Ears and Take a Look: A State‐of‐the‐Art Report on the Integration of Sonification and Visualization
2024 (English)In: Computer graphics forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 43, no 3, article id e15114Article in journal (Refereed) Published
Abstract [en]

The research communities studying visualization and sonification for data display and analysis share exceptionally similar goals, essentially making data of any kind interpretable to humans. One community does so by using visual representations of data, and the other community employs auditory (non-speech) representations of data. While the two communities have a lot in common, they developed mostly in parallel over the course of the last few decades. With this STAR, we discuss a collection of work that bridges the borders of the two communities, hence a collection of work that aims to integrate the two techniques into one form of audiovisual display, which we argue to be “more than the sum of the two.” We introduce and motivate a classification system applicable to such audiovisual displays and categorize a corpus of 57 academic publications that appeared between 2011 and 2023 in categories such as reading level, dataset type, or evaluation system, to mention a few. The corpus also enables a meta-analysis of the field, including regularly occurring design patterns such as type of visualization and sonification techniques, or the use of visual and auditory channels, showing an overall diverse field with different designs. An analysis of a co-author network of the field shows individual teams without many interconnections. The body of work covered in this STAR also relates to three adjacent topics: audiovisual monitoring, accessibility, and audiovisual data art. These three topics are discussed individually in addition to the systematically conducted part of this research. The findings of this report may be used by researchers from both fields to understand the potentials and challenges of such integrated designs while hopefully inspiring them to collaborate with experts from the respective other field.

Place, publisher, year, edition, pages
Wiley-Blackwell Publishing Inc., 2024
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-204437 (URN)10.1111/cgf.15114 (DOI)001241901600001 ()
Conference
EuroVis 2024
Funder
Knut and Alice Wallenberg Foundation, KAW 2019.0024
Note

Funding Agencies|Knut and Alice Wallenberg Foundation

Available from: 2024-06-11 Created: 2024-06-11 Last updated: 2025-03-21
Elmquist, E., Enge, K., Rind, A., Navarra, C., Höldrich, R., Iber, M., . . . Rönnberg, N. (2024). Parallel Chords: an audio-visual analytics design for parallel coordinates. Personal and Ubiquitous Computing
Parallel Chords: an audio-visual analytics design for parallel coordinates
2024 (English)In: Personal and Ubiquitous Computing, ISSN 1617-4909, E-ISSN 1617-4917Article in journal (Refereed) Epub ahead of print
Abstract [en]

One of the commonly used visualization techniques for multivariate data is the parallel coordinates plot. It provides users with a visual overview of multivariate data and the possibility to explore it interactively. While pattern recognition is a strength of the human visual system, it is also a strength of the auditory system. Inspired by the integration of visual and auditory perception in everyday life, we introduce an audio-visual analytics design named Parallel Chords that combines visual and auditory displays. Parallel Chords lets users explore multivariate data using both visualization and sonification through interaction with the axes of a parallel coordinates plot. To illustrate the potential of the design, we present (1) prototypical data patterns where the sonification helps with the identification of correlations, clusters, and outliers, (2) a usage scenario showing the sonification of data from non-adjacent axes, and (3) a controlled experiment on the sensitivity thresholds of participants when distinguishing the strength of correlations. During this controlled experiment, 35 participants used three different display types (the visualization, the sonification, and the combination of the two) to identify the strongest of three correlations. The results show that all three display types enabled the participants to identify the strongest correlation, with visualization yielding the best sensitivity. The sonification yielded sensitivities that were independent of the type of displayed correlation, and the combination increased enjoyability during usage.
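The underlying idea, pairing two axes and making the strength of their relation audible, can be sketched as follows. This is a minimal illustration, not the authors' implementation; mapping correlation strength to musical interval size is a hypothetical design choice:

```python
def pearson(xs, ys):
    """Pearson correlation between the values of two axes."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def correlation_to_interval(r):
    """Map correlation strength |r| to an interval in semitones: weak
    correlations sound dissonant (near 1 semitone), strong correlations
    consonant (up to a 12-semitone octave)."""
    return 1 + round(11 * abs(r))
```

In a design along these lines, a strong correlation between two selected axes would sound as a consonant octave, while a near-zero correlation would produce a dissonant minor second.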

Place, publisher, year, edition, pages
Springer, 2024
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-203454 (URN)10.1007/s00779-024-01795-8 (DOI)2-s2.0-85191992877 (Scopus ID)
Funder
Knut and Alice Wallenberg Foundation, 2019.0024
Available from: 2024-05-13 Created: 2024-05-13 Last updated: 2025-03-21
Elmquist, E., Bock, A., Ynnerman, A. & Rönnberg, N. (2024). Towards a Systematic Scene Analysis Framework for Audiovisual Data Representations. In: Audiovisual Symposium notes: . Paper presented at Audiovisual Symposium. Falun
Towards a Systematic Scene Analysis Framework for Audiovisual Data Representations
2024 (English)In: Audiovisual Symposium notes, Falun, 2024Conference paper, Oral presentation with published abstract (Refereed)
Place, publisher, year, edition, pages
Falun: , 2024
Keywords
Audiovisual integration, Visualization, Sonification, Scene graph, Scene analysis
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-210422 (URN)
Conference
Audiovisual Symposium
Funder
Knut and Alice Wallenberg Foundation, 2019.0024
Available from: 2024-12-13 Created: 2024-12-13 Last updated: 2025-03-21
Elmquist, E., Bock, A., Lundberg, J., Ynnerman, A. & Rönnberg, N. (2023). SonAir: the design of a sonification of radar data for air traffic control. Journal on Multimodal User Interfaces, 17(3), 137-149
SonAir: the design of a sonification of radar data for air traffic control
2023 (English)In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 17, no 3, p. 137-149Article in journal (Refereed) Published
Abstract [en]

Along with the increase of digitalization and automation, a new kind of working environment is emerging in the field of air traffic control. Instead of situating the control tower at the airport, it is now possible to control the airport remotely from any given location, i.e., from a remote tower center (RTC). However, when controlling the airport remotely, situational awareness and the sense of presence might be compromised. By using directional sound, higher situational awareness could potentially be achieved while also offloading the visual perception that is heavily used in air traffic control. Suitable use cases for sonification in air traffic control were found through workshops with air traffic controllers. A sonification design named SonAir was developed based on the outcome of the workshops and was integrated with an RTC simulator to evaluate to what degree SonAir could support air traffic controllers in their work. The results suggest that certain aspects of SonAir could be useful for air traffic controllers. A continuous sonification that conveyed the spatial positioning of aircraft was experienced as partially useful, but the intrusiveness of SonAir should be further considered to fit the air traffic controllers’ needs. An earcon conveying when an aircraft enters the airspace, and from which direction, was considered useful for supporting situational awareness.
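The directional-sound aspect described above can be sketched as an equal-power stereo pan driven by an aircraft's bearing relative to the listener. This is an illustrative reduction, assuming a simple two-channel setup rather than the spatial audio system used in the study:

```python
import math

def stereo_gains(bearing_deg):
    """Equal-power pan over the frontal half-plane:
    bearing 0 = straight ahead, -90 = hard left, +90 = hard right.
    Returns (left_gain, right_gain) with left**2 + right**2 == 1."""
    b = max(-90.0, min(90.0, bearing_deg))   # clamp for this sketch
    theta = math.radians((b + 90.0) / 2.0)   # 0 .. pi/2
    return math.cos(theta), math.sin(theta)
```

The equal-power law keeps perceived loudness roughly constant as an aircraft's sound sweeps across the stereo field, which matters when the pan position itself carries the information.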

Place, publisher, year, edition, pages
SPRINGER, 2023
Keywords
Sonification; Air traffic control; Situational awareness; User evaluation
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-196256 (URN)10.1007/s12193-023-00404-x (DOI)001021523300001 ()
Note

Funding: Swedish Transport Authority [TRV-2019/53555]; Knut and Alice Wallenberg Foundation [KAW 2019.0024]

Available from: 2023-07-08 Created: 2023-07-08 Last updated: 2025-03-21
Elmquist, E. & Enge, K. (2022). Towards the Combination of Visualization and Sonification for Cylindrical Displays. Paper presented at Advanced Visual Interfaces, Frascati, Rome, Italy, 7 June 2022. Zenodo
Towards the Combination of Visualization and Sonification for Cylindrical Displays
2022 (English)Conference paper, Published paper (Refereed)
Abstract [en]

Immersive environments provide a physical space for audio-visual data analysis. An example of such an environment is the Norrköping Decision Arena, which provides a cylindrical display together with a circular sound system. This paper sets out recommendations on what kinds of visualization would benefit from being displayed in this kind of environment, and how sonification could be used as a complement to enable exploratory data analysis. Three visualizations are presented as potentially interesting for presentation on a cylindrical display: theme rivers, radial visualizations, and parallel coordinates.
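One way to read the pairing of a cylindrical display with a circular sound system: a data item's angular position on the cylinder selects the loudspeaker that plays its sonification. A minimal sketch; the speaker count and indexing convention are assumptions for illustration:

```python
def nearest_speaker(azimuth_deg, n_speakers=12):
    """Map an angular position on the cylindrical display (degrees,
    0 = reference direction) to the index of the nearest loudspeaker
    in an evenly spaced circular array."""
    step = 360.0 / n_speakers
    return int(round((azimuth_deg % 360.0) / step)) % n_speakers
```

Co-locating each sound with the part of the display it refers to is what would let the circular sound system reinforce, rather than merely accompany, the cylindrical visualization.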

Place, publisher, year, edition, pages
Zenodo, 2022
Keywords
Information visualization, Sonification, Audio-Visual Analytics, Interaction, Cylindrical displays
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:liu:diva-192995 (URN)10.5281/zenodo.6553825 (DOI)
Conference
Advanced Visual Interfaces, Frascati, Rome, Italy, 7 June 2022
Available from: 2023-04-11 Created: 2023-04-11 Last updated: 2025-02-18Bibliographically approved
Elmquist, E., Ejdbo, M., Bock, A. & Rönnberg, N. (2021). OpenSpace Sonification: Complementing Visualization of the Solar System with Sound. In: Areti Andreopoulou, Milena Droumeva, Joseph W. Newbold, Kyla McMullen and Paul Vickers (Ed.), Proceedings of the 26th International Conference on Auditory Display (ICAD 2021): . Paper presented at International Conference on Auditory Display, an Online Conference, June 25–28, 2021 (pp. 135-142). The International Community for Auditory Display
OpenSpace Sonification: Complementing Visualization of the Solar System with Sound
2021 (English)In: Proceedings of the 26th International Conference on Auditory Display (ICAD 2021) / [ed] Areti Andreopoulou, Milena Droumeva, Joseph W. Newbold, Kyla McMullen and Paul Vickers, The International Community for Auditory Display , 2021, p. 135-142Conference paper, Published paper (Refereed)
Abstract [en]

Data visualization software is commonly used to explore outer space in a planetarium environment, where the visuals of the software are typically accompanied by a narrator and supplementary background music. By letting sound take a bigger role in these kinds of presentations, a more informative and immersive experience can be achieved. The aim of the present study was to explore how sonification can be used as a complement to the visualization software OpenSpace to convey information about the Solar System, as well as to increase the perceived immersiveness for the audience in a planetarium environment. This was investigated by implementing a sonification that conveyed planetary properties, such as the size and orbital period of a planet, by mapping these data to sonification parameters. With a user-centered approach, the sonification was designed iteratively and evaluated in both an online and a planetarium environment. The results of the evaluations show that the participants found the sonification informative and interesting, which suggests that sonification can be beneficially used as a complement to visualization in a planetarium environment.
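The parameter mapping described, planetary properties driving sonification parameters, can be sketched for one property: orbital period mapped to pitch. The reference period and base frequency below are illustrative assumptions, not the mapping used in the paper:

```python
import math

def period_to_pitch(period_days, ref_days=88.0, base_hz=220.0):
    """Map orbital period to a frequency: each factor of 10 in period
    relative to the reference (roughly Mercury's 88 days) drops the
    pitch by one octave, so outer planets sound lower."""
    octaves = math.log10(period_days / ref_days)
    return base_hz / (2.0 ** octaves)
```

A logarithmic mapping like this compresses the huge spread of orbital periods (days to centuries) into a pitch range the ear can compare.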

Place, publisher, year, edition, pages
The International Community for Auditory Display, 2021
Keywords
Sonification, OpenSpace, Sonifiering
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-183828 (URN)10.21785/icad2021.018 (DOI)0967090474 (ISBN)9780967090474 (ISBN)
Conference
International Conference on Auditory Display, an Online Conference, June 25–28, 2021
Available from: 2022-04-11 Created: 2022-04-11 Last updated: 2025-03-21Bibliographically approved
Organisations
Identifiers
ORCID iD: orcid.org/0000-0001-5874-6356
