liu.se: Search for publications in DiVA
Publications (10 of 49)
Elmquist, E., Ejdbo, M., Bock, A., Thaler, D. S., Ynnerman, A. & Rönnberg, N. (2024). Birdsongification: Contextual and Complementary Sonification for Biology Visualization. Paper presented at 29th International Conference on Auditory Display. Proceedings of the International Conference on Auditory Display, 34-41
2024 (English). In: Proceedings of the International Conference on Auditory Display, ISSN 1093-9547, E-ISSN 2168-5126, p. 34-41. Article in journal (Refereed). Published
Abstract [en]

Choosing whether to represent data in an abstract or concrete manner through sonification generally depends on the nature of the dataset and the personal preference of the designer. To support a visualization with a high level of abstraction, a sonification can purposefully act as a complement, giving concrete contextual cues to the data representation through the use of auditory icons. This paper presents a case study of using bird songs as auditory icons to give context to a biology visualization, and explores how additional information about the bird species can be conveyed together with the auditory icons through parameter mapping sonification. The auditory icons are used as a foundation for conveying additional information about the dataset, either by creating a parametric auditory icon or by adding a sonification that accompanies the auditory icon. A user evaluation was conducted to validate and compare the different sonification mappings. The results show a subjective difference in how participants perceived the sonifications: participants preferred sonifications with a concrete mapping design. The sonification approaches explored in this study have the potential to be applied to more general sonification designs.

National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-208608 (URN)10.21785/icad2024.006 (DOI)
Conference
29th International Conference on Auditory Display
Funder
Knut and Alice Wallenberg Foundation, 2019.0024
Available from: 2024-10-17 Created: 2024-10-17 Last updated: 2025-03-21. Bibliographically approved
Rind, A., Enge, K., Iber, M., Rönnberg, N., Lenzi, S., Elmquist, E., . . . Aigner, W. (2024). Integrating Sonification and Visualization – But Why? Paper presented at EuroVis 2024.
2024 (English). Conference paper, Oral presentation with published abstract (Other academic)
Abstract [en]

The research communities studying visualization and sonification share exceptionally similar goals, essentially aiming to make data interpretable to humans. One community uses visual representations, while the other employs auditory (non-speech) representations of data. Although the two communities have much in common, they developed mostly in parallel, with only comparatively few examples of integrated audiovisual data analysis idioms presented so far. This panel brings together researchers from both the fields of visualization and sonification to collectively discuss the question: 'Integrating Sonification and Visualization – but why?'

In the panel discussion, we will tackle this question along two main hypotheses: Combining the modalities to (1) increase the “bandwidth from data to brain,” or (2) to increase a user’s personal engagement during the data analysis process. On the one hand, designers might aim to communicate more data in less time or gain more and more complex insights from the data by using a multi-modal display. This argument follows an understanding that two senses should be capable of processing more information than “just” one. On the other hand, sometimes, a more engaged analysis of the represented data is desirable. Engagement with data visualization stands as a crucial topic in numerous contexts within our field, encouraging “deeper” thinking by expert analysts, readers of data journalism articles, and students in educational settings. We hypothesize that integrating visualization with sonification holds the potential to enhance user engagement during analysis. Through the panel discussion, we want to delve into the spectrum between aiming for bandwidth and engagement, seeking to understand the opportunities and challenges of integrating sonification and visualization. 

National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-204068 (URN)10.2312/evt.20241100 (DOI)
Conference
EuroVis 2024
Available from: 2024-06-03 Created: 2024-06-03 Last updated: 2024-06-13
Enge, K., Elmquist, E., Caiola, V., Rönnberg, N., Rind, A., Iber, M., . . . Aigner, W. (2024). Open Your Ears and Take a Look: A State‐of‐the‐Art Report on the Integration of Sonification and Visualization. Paper presented at EuroVis 2024. Computer graphics forum (Print), 43(3), Article ID e15114.
2024 (English). In: Computer graphics forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 43, no 3, article id e15114. Article in journal (Refereed). Published
Abstract [en]

The research communities studying visualization and sonification for data display and analysis share exceptionally similar goals, essentially making data of any kind interpretable to humans. One community does so by using visual representations of data, and the other community employs auditory (non-speech) representations of data. While the two communities have a lot in common, they developed mostly in parallel over the course of the last few decades. With this STAR, we discuss a collection of work that bridges the borders of the two communities by aiming to integrate the two techniques into one form of audiovisual display, which we argue to be “more than the sum of the two.” We introduce and motivate a classification system applicable to such audiovisual displays and categorize a corpus of 57 academic publications that appeared between 2011 and 2023 in categories such as reading level, dataset type, or evaluation system, to mention a few. The corpus also enables a meta-analysis of the field, including regularly occurring design patterns such as the type of visualization and sonification techniques, or the use of visual and auditory channels, showing an overall diverse field with different designs. An analysis of a co-author network of the field shows individual teams without many interconnections. The body of work covered in this STAR also relates to three adjacent topics: audiovisual monitoring, accessibility, and audiovisual data art. These three topics are discussed individually in addition to the systematically conducted part of this research. The findings of this report may be used by researchers from both fields to understand the potentials and challenges of such integrated designs while hopefully inspiring them to collaborate with experts from the respective other field.

Place, publisher, year, edition, pages
Wiley-Blackwell Publishing Inc., 2024
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-204437 (URN)10.1111/cgf.15114 (DOI)001241901600001 ()
Conference
EuroVis 2024
Funder
Knut and Alice Wallenberg Foundation, KAW 2019.0024
Note

Funding Agencies: Knut and Alice Wallenberg Foundation

Available from: 2024-06-11 Created: 2024-06-11 Last updated: 2025-03-21
Elmquist, E., Enge, K., Rind, A., Navarra, C., Höldrich, R., Iber, M., . . . Rönnberg, N. (2024). Parallel Chords: an audio-visual analytics design for parallel coordinates. Personal and Ubiquitous Computing
2024 (English). In: Personal and Ubiquitous Computing, ISSN 1617-4909, E-ISSN 1617-4917. Article in journal (Refereed). Epub ahead of print
Abstract [en]

One of the commonly used visualization techniques for multivariate data is the parallel coordinates plot. It provides users with a visual overview of multivariate data and the possibility to explore it interactively. While pattern recognition is a strength of the human visual system, it is also a strength of the auditory system. Inspired by the integration of visual and auditory perception in everyday life, we introduce an audio-visual analytics design named Parallel Chords that combines visual and auditory displays. Parallel Chords lets users explore multivariate data using both visualization and sonification through interaction with the axes of a parallel coordinates plot. To illustrate the potential of the design, we present (1) prototypical data patterns where the sonification helps with the identification of correlations, clusters, and outliers, (2) a usage scenario showing the sonification of data from non-adjacent axes, and (3) a controlled experiment on the sensitivity thresholds of participants when distinguishing the strength of correlations. During this controlled experiment, 35 participants used three different display types, the visualization, the sonification, and the combination of these, to identify the strongest of three correlations. The results show that all three display types enabled the participants to identify the strongest correlation, with visualization resulting in the best sensitivity. The sonification resulted in sensitivities that were independent of the type of displayed correlation, and the combination resulted in increased enjoyability during usage.

Place, publisher, year, edition, pages
Springer, 2024
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-203454 (URN)10.1007/s00779-024-01795-8 (DOI)2-s2.0-85191992877 (Scopus ID)
Funder
Knut and Alice Wallenberg Foundation, 2019.0024
Available from: 2024-05-13 Created: 2024-05-13 Last updated: 2025-03-21
Gorenko, I., Besançon, L., Forsell, C. & Rönnberg, N. (2024). Supporting Astrophysical Visualization with Sonification. Paper presented at EuroVis 2024.
2024 (English). Conference paper, Published paper (Refereed)
Abstract [en]

This poster presents initial design steps exploring how sonification can support visualization for the comprehension of space and time in astronomical data. Radio signals travel at the speed of light, but in a visualization of the universe it is possible to travel faster than light, overtake the radio waves leaving Earth, and thus travel back in time. We propose a sonification consisting of songs representing each year, forming a musical journey through space and time that creates an engaging experience.

Keywords
Human-centered computing, Auditory feedback, Sonification
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-204067 (URN)10.2312/evp.20241091 (DOI)
Conference
EuroVis 2024
Available from: 2024-06-03 Created: 2024-06-03 Last updated: 2024-06-13. Bibliographically approved
Elmquist, E., Bock, A., Ynnerman, A. & Rönnberg, N. (2024). Towards a Systematic Scene Analysis Framework for Audiovisual Data Representations. In: Audiovisual Symposium notes. Paper presented at Audiovisual Symposium, Falun.
2024 (English). In: Audiovisual Symposium notes, Falun, 2024. Conference paper, Oral presentation with published abstract (Refereed)
Place, publisher, year, edition, pages
Falun, 2024
Keywords
Audiovisual integration, Visualization, Sonification, Scene graph, Scene analysis
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-210422 (URN)
Conference
Audiovisual Symposium
Funder
Knut and Alice Wallenberg Foundation, 2019.0024
Available from: 2024-12-13 Created: 2024-12-13 Last updated: 2025-03-21
Rönnberg, N. & Löwgren, J. (2024). Understanding Modal Synergy for Exploration. Paper presented at Audiovisual Symposium 2024, Dalarna University, Falun, Sweden.
2024 (English). Conference paper, Oral presentation with published abstract (Other academic)
Place, publisher, year, edition, pages
Dalarna University, Falun, Sweden, 2024
Keywords
Musical sonification, Modal synergy, Images, Interaction
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-210284 (URN)
Conference
Audiovisual Symposium 2024
Available from: 2024-12-09 Created: 2024-12-09 Last updated: 2024-12-17. Bibliographically approved
Rönnberg, N. & Börütecene, A. (2024). Use of Generative AI for Fictional Field Studies in Design Courses. In: Adjunct Proceedings of the 2024 Nordic Conference on Human-Computer Interaction. Paper presented at the 2024 Nordic Conference on Human-Computer Interaction, Uppsala, Sweden, October 13-16, 2024. New York, NY, USA: Association for Computing Machinery, Article ID 23.
2024 (English). In: Adjunct Proceedings of the 2024 Nordic Conference on Human-Computer Interaction, New York, NY, USA: Association for Computing Machinery, 2024, article id 23. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we present how we used generative AI (GenAI) as a pedagogical tool for students taking a course in tangible interaction design. In this course, the students design different physical-digital objects (PDOs) to learn designing, sketching, and prototyping with code and hardware. However, due to the short course duration, these PDOs are not evaluated or explored in any kind of field or user study. Therefore, we gave the students the exercise of conducting user interviews with GenAI to explore their design ideas further. With this paper, we contribute a description and the outcomes of this approach, and highlight the pedagogical implications for student learning.

Place, publisher, year, edition, pages
New York, NY, USA: Association for Computing Machinery, 2024
Keywords
Design, Education, Field study, Generative AI, User interview
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:liu:diva-208600 (URN)10.1145/3677045.3685439 (DOI)001331863500023 ()2-s2.0-85206591357 (Scopus ID)9798400709654 (ISBN)
Conference
2024 Nordic Conference on Human-Computer Interaction, Uppsala, Sweden, October 13-16, 2024
Available from: 2024-10-17 Created: 2024-10-17 Last updated: 2024-12-13. Bibliographically approved
Rönnberg, N. (2024). Where Visualization Fails, Sonification Speaks. Paper presented at EuroVis 2024.
2024 (English). Conference paper, Published paper (Refereed)
Abstract [en]

Traveling by public transport can be challenging for a visually impaired traveler. However, visual information can be supported by sonification, the use of non-speech sound to convey information about data. This research project aims to explore how sonification can be used to provide information to a traveler at a bus stop. Three situations are described together with different sonification design approaches that will later be further developed and evaluated. 

Keywords
Human-centered computing, Auditory feedback, Sonification
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-204043 (URN)10.2312/evp.20241099 (DOI)
Conference
EuroVis 2024
Available from: 2024-06-03 Created: 2024-06-03 Last updated: 2024-06-13. Bibliographically approved
Ziemer, T., Lenzi, S., Rönnberg, N., Hermann, T. & Bresin, R. (2023). Introduction to the special issue on design and perception of interactive sonification. Journal on Multimodal User Interfaces, 17, 213-214
2023 (English). In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 17, p. 213-214. Article in journal, Editorial material (Other academic). Published
Place, publisher, year, edition, pages
Springer, 2023
Keywords
Sonification
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-199882 (URN)10.1007/s12193-023-00425-6 (DOI)001098004600001 ()
Available from: 2024-01-02 Created: 2024-01-02 Last updated: 2025-02-10
Identifiers
ORCID iD: orcid.org/0000-0002-1334-0624
