Publications (10 of 20)
Yu, P., Nordman, A., Koc-Januchta, M., Schönborn, K., Besançon, L. & Vrotsou, K. (2024). Revealing Interaction Dynamics: Multi-Level Visual Exploration of User Strategies with an Interactive Digital Environment. IEEE Transactions on Visualization and Computer Graphics
2024 (English). In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

We present a visual analytics approach for multi-level visual exploration of users' interaction strategies in an interactive digital environment. The use of interactive touchscreen exhibits in informal learning environments, such as museums and science centers, often incorporates frameworks that classify learning processes, such as Bloom's taxonomy, to achieve better user engagement and knowledge transfer. To analyze user behavior within these digital environments, interaction logs are recorded to capture diverse exploration strategies. However, analysis of such logs is challenging, especially in terms of coupling interactions and cognitive learning processes, and existing work within learning and educational contexts remains limited. To address these gaps, we develop a visual analytics approach for analyzing interaction logs that supports exploration at the individual user level and multi-user comparison. The approach utilizes algorithmic methods to identify similarities in users' interactions and reveal their exploration strategies. We motivate and illustrate our approach through an application scenario, using event sequences derived from interaction log data in an experimental study conducted with science center visitors from diverse backgrounds and demographics. The study involves 14 users completing tasks of increasing complexity, designed to stimulate different levels of cognitive learning processes. We implement our approach in an interactive visual analytics prototype system, named VISID, and together with domain experts, discover a set of task-solving exploration strategies, such as “cascading” and “nested-loop”, which reflect different levels of learning processes from Bloom's taxonomy. Finally, we discuss the generalizability and scalability of the presented system and the need for further research with data acquired in the wild.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Keywords
Visual analytics, Visualization systems and tools, Interaction logs, Visualization techniques, Visual learning
National Category
Interaction Technologies
Identifiers
urn:nbn:se:liu:diva-209035 (URN), 10.1109/tvcg.2024.3456187 (DOI), 39255130 (PubMedID)
Available from: 2024-11-04 Created: 2024-11-04 Last updated: 2024-12-10
Gorenko, I., Besançon, L., Forsell, C. & Rönnberg, N. (2024). Supporting Astrophysical Visualization with Sonification. Paper presented at EuroVis 2024.
2024 (English). Conference paper, Published paper (Refereed).
Abstract [en]

This poster presents initial design steps exploring how sonification can be used to support visualization for comprehension of space and time in astronomical data. Radio signals travel at the speed of light. With a visualization of the universe, it is possible to travel faster than light and pass the radio waves leaving Earth. We can then travel back in time. We propose to use sonification consisting of songs representing each year as a musical journey through space and time to create an engaging experience.

Keywords
Human-centered computing, Auditory feedback, Sonification
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:liu:diva-204067 (URN), 10.2312/evp.20241091 (DOI)
Conference
EuroVis 2024
Available from: 2024-06-03 Created: 2024-06-03 Last updated: 2024-06-13. Bibliographically approved.
Dai, S., Smiley, J., Dwyer, T., Ens, B. & Besançon, L. (2023). RoboHapalytics: A Robot Assisted Haptic Controller for Immersive Analytics. IEEE Transactions on Visualization and Computer Graphics, 29(1), 451-461
2023 (English). In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 29, no 1, p. 451-461. Article in journal (Refereed). Published.
Abstract [en]

Immersive environments offer new possibilities for exploring three-dimensional volumetric or abstract data. However, typical mid-air interaction offers little guidance to the user in interacting with the resulting visuals. Previous work has explored the use of haptic controls to give users tangible affordances for interacting with the data, but these controls have either: been limited in their range and resolution; were spatially fixed; or required users to manually align them with the data space. We explore the use of a robot arm with hand tracking to align tangible controls under the user’s fingers as they reach out to interact with data affordances. We begin with a study evaluating the effectiveness of a robot-extended slider control compared to a large fixed physical slider and a purely virtual mid-air slider. We find that the robot slider has similar accuracy to the physical slider but is significantly more accurate than mid-air interaction. Further, the robot slider can be arbitrarily reoriented, opening up many new possibilities for tangible haptic interaction with immersive visualisations. We demonstrate these possibilities through three use-cases: selection in a time-series chart; interactive slicing of CT scans; and finally exploration of a scatter plot depicting time-varying socio-economic data.

Place, publisher, year, edition, pages
IEEE, 2023
Keywords
Haptic Feedback, Human Centred Interaction, Robotic Arm
National Category
Robotics
Identifiers
urn:nbn:se:liu:diva-194033 (URN), 10.1109/tvcg.2022.3209433 (DOI)
Note

Funding agencies: This research was supported under the Australian Research Council’s Discovery Projects funding scheme (project number DP180100755) and the Knut and Alice Wallenberg Foundation (grant KAW 2019.0024).

Available from: 2023-05-22 Created: 2023-05-22 Last updated: 2023-05-22
Barrière, J., Frank, F., Besançon, L., Samuel, A., Saada, V., Billy, E., . . . Robert, J. (2023). Scientific Integrity Requires Publishing Rebuttals and Retracting Problematic Papers. STEM CELL REVIEWS AND REPORTS, 19, 568-572
2023 (English). In: STEM CELL REVIEWS AND REPORTS, ISSN 2629-3269, Vol. 19, p. 568-572. Article in journal (Refereed). Published.
Abstract [en]

Recently, an article by Seneff et al. entitled "Innate immunosuppression by SARS-CoV-2 mRNA vaccinations: The role of G-quadruplexes, exosomes, and MicroRNAs" was published in Food and Chemical Toxicology (FCT). Here, we describe why this article, which contains unsubstantiated claims and misunderstandings such as "billions of lives are potentially at risk" with COVID-19 mRNA vaccines, is problematic and should be retracted. We report here our request to the editor of FCT to have our rebuttal published, unfortunately rejected after three rounds of reviewing. Fighting the spread of false information requires enormous effort while receiving little or no credit for this necessary work, which often even ends up being threatened. This need for more scientific integrity is at the heart of our advocacy, and we call for large support, especially from editors and publishers, to fight more effectively against deadly disinformation.

Place, publisher, year, edition, pages
SPRINGER, 2023
Keywords
Misinformation; COVID-19; SARS-CoV-2 mRNA Vaccines; Cancer; Retraction; UN SDG3; Public Health; Science Integrity; Pseudoscience
National Category
Information Studies
Identifiers
urn:nbn:se:liu:diva-189760 (URN), 10.1007/s12015-022-10465-2 (DOI), 000873452200001 (), 36287337 (PubMedID)
Available from: 2022-11-07 Created: 2022-11-07 Last updated: 2023-11-02. Bibliographically approved.
Robert, J., Frank, F., Besançon, L., Samuel, A., Saada, V., Billy, E., . . . Barrière, J. (2022). Covidiots et cancer. Y a-t-il une ânerie qu’ILS n’ont pas proférée ? [Covidiots and cancer: is there any nonsense THEY have not uttered?]. Innovations & Thérapeutiques en Oncologie, 8(4), 179-181
2022 (French). In: Innovations & Thérapeutiques en Oncologie, E-ISSN 2431-3203, Vol. 8, no 4, p. 179-181. Article in journal (Refereed). Published.
Place, publisher, year, edition, pages
John Libbey Publishing, 2022
National Category
Clinical Medicine
Identifiers
urn:nbn:se:liu:diva-194034 (URN), 10.1684/ito.2022.0324 (DOI)
Available from: 2023-05-22 Created: 2023-05-22 Last updated: 2023-05-22
Besançon, L., Schönborn, K., Sundén, E., Yin, H., Rising, S., Westerdahl, P., . . . Ynnerman, A. (2022). Exploring and Explaining Climate Change: Exploranation as a Visualization Pedagogy for Societal Action. Paper presented at VIS4GOOD, a workshop on Visualization for Social Good, held as part of IEEE VIS 2022. Institute of Electrical and Electronics Engineers (IEEE)
2022 (English). Conference paper, Oral presentation only (Other academic).
Abstract [en]

Engaging mass audiences with crucial societal issues, such as climate change, can be provided through interactive exhibits designed around the paradigm of exploranation. We present example interactive installations in the newly founded Wadströms Exploranation Laboratory that explain various aspects of climate change while allowing public participants to explore the real scientific data. We describe how effects and causes of climate change can be communicated by two of the installations that allow for interactive opportunities to explore the underlying data while gaining insight into climate change sources and effects. We close with implications for future work on exploranation as an emerging visualization pedagogy in public spaces.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2022
National Category
Climate Research
Identifiers
urn:nbn:se:liu:diva-194046 (URN)
Conference
VIS4GOOD, a workshop on Visualization for Social Good, held as part of IEEE VIS 2022
Available from: 2023-05-22 Created: 2023-05-22 Last updated: 2023-10-10. Bibliographically approved.
Sereno, M., Gosset, S., Besançon, L. & Isenberg, T. (2022). Hybrid Touch/Tangible Spatial Selection in Augmented Reality. Paper presented at 24th Eurographics/IEEE VGTC Conference on Visualization (EuroVis), Rome, Italy, June 12-17, 2022. Computer graphics forum (Print), 41(3), 403-415
2022 (English). In: Computer graphics forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 41, no 3, p. 403-415. Article in journal (Refereed). Published.
Abstract [en]

We study tangible touch tablets combined with Augmented Reality Head-Mounted Displays (AR-HMDs) to perform spatial 3D selections. We are primarily interested in the exploration of 3D unstructured datasets such as point clouds or volumetric datasets. AR-HMDs immerse users by showing datasets stereoscopically, and tablets provide a set of 2D exploration tools. Because AR-HMDs merge the visualization, interaction, and the users' physical spaces, users can also use the tablets as tangible objects in their 3D space. Nonetheless, the tablets' touch displays provide their own visualization and interaction spaces, separated from those of the AR-HMD. This raises several research questions compared to traditional setups. In this paper, we theorize, discuss, and study different available mappings for manual spatial selections using a tangible tablet within an AR-HMD space. We then study the use of this tablet within a 3D AR environment, compared to its use with a 2D external screen.

Place, publisher, year, edition, pages
John Wiley & Sons, 2022
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:liu:diva-187979 (URN), 10.1111/cgf.14550 (DOI), 000842261500035 ()
Conference
24th Eurographics/IEEE VGTC Conference on Visualization (EuroVis), Rome, Italy, June 12-17, 2022
Available from: 2022-09-01 Created: 2022-09-01 Last updated: 2022-09-27
Barrière, J., Frank, F., Besançon, L., Samuel, A., Saada, V., Billy, E., . . . Robert, J. (2022). La lutte contre la fraude scientifique : une tâche ingrate mais nécessaire [The fight against scientific fraud: a thankless but necessary task]. Bulletin du Cancer, 109(10), 996-998
2022 (French). In: Bulletin du Cancer, ISSN 0007-4551, E-ISSN 1769-6917, Vol. 109, no 10, p. 996-998. Article in journal, Editorial material (Other academic). Published.
Place, publisher, year, edition, pages
ELSEVIER MASSON, CORP OFF, 2022
National Category
Cancer and Oncology
Identifiers
urn:nbn:se:liu:diva-187980 (URN), 10.1016/j.bulcan.2022.06.013 (DOI), 000879083800006 (), 36055805 (PubMedID)
Available from: 2022-09-01 Created: 2022-09-01 Last updated: 2022-11-23
Besançon, L., Flahault, A. & Meyerowitz-Katz, G. (2022). Mobility during the pandemic: how did our movements shape the course of COVID-19? Journal of Travel Medicine, 29(3), Article ID taac055.
2022 (English). In: Journal of Travel Medicine, ISSN 1195-1982, E-ISSN 1708-8305, Vol. 29, no 3, article id taac055. Article in journal, Editorial material (Other academic). Published.
Abstract [en]

In this manuscript, we critically assess the evidence around various methods of reducing mobility, and how these have impacted the course of the coronavirus disease 2019 (COVID-19) pandemic. We further highlight the difficulty in assessing the effectiveness of such measures before giving directions for future research.

Place, publisher, year, edition, pages
OXFORD UNIV PRESS INC, 2022
National Category
Public Health, Global Health, Social Medicine and Epidemiology
Identifiers
urn:nbn:se:liu:diva-185606 (URN), 10.1093/jtm/taac055 (DOI), 000798303400001 (), 35511717 (PubMedID)
Available from: 2022-06-09 Created: 2022-06-09 Last updated: 2024-07-04. Bibliographically approved.
Sereno, M., Besançon, L. & Isenberg, T. (2022). Point specification in collaborative visualization for 3D scalar fields using augmented reality. Virtual Reality, 26, 1317-1334
2022 (English). In: Virtual Reality, ISSN 1359-4338, E-ISSN 1434-9957, Vol. 26, p. 1317-1334. Article in journal (Refereed). Published.
Abstract [en]

We compared three techniques to specify 3D positions for collaborative augmented reality (AR) visualization. AR head-mounted displays allow multiple users to share the same physical space, while keeping seamless social interactions. Interactions being key parts of exploratory visualization tasks, we adapted from the virtual reality literature three distinct techniques to specify points in 3D space, such as for placing annotations when users cannot rely on existing data objects. We evaluated these techniques on their accuracy and speed, the users' subjective workload and preferences, as well as their co-presence, mutual understanding, and behavior in collaborative tasks. Our results suggest that all three techniques provide good mutual understanding and co-presence among collaborators. They differ, however, in the way users behave, their accuracy, and their speed.

Place, publisher, year, edition, pages
SPRINGER LONDON LTD, 2022
Keywords
AR; CSCW; 3D visualization; 3D point specification
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:liu:diva-183409 (URN), 10.1007/s10055-021-00614-2 (DOI), 000757751600001 ()
Available from: 2022-03-10 Created: 2022-03-10 Last updated: 2023-04-04. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0002-7207-1276
