Search publications in DiVA
1 - 35 of 35
  • 1.
    Elmquist, Elias
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Enge, Kajetan
    St. Pölten University of Applied Sciences, St. Pölten, Austria; University of Music and Performing Arts Graz, Graz, Austria.
    Rind, Alexander
    St. Pölten University of Applied Sciences, St. Pölten, Austria.
    Navarra, Carlo
    Linköpings universitet, Institutionen för tema, Tema Miljöförändring. Linköpings universitet, Filosofiska fakulteten.
    Höldrich, Robert
    University of Music and Performing Arts Graz, Graz, Austria.
    Iber, Michael
    St. Pölten University of Applied Sciences, St. Pölten, Austria.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Aigner, Wolfgang
    St. Pölten University of Applied Sciences, St. Pölten, Austria.
    Rönnberg, Niklas
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Parallel Chords: an audio-visual analytics design for parallel coordinates, 2024. In: Personal and Ubiquitous Computing, ISSN 1617-4909, E-ISSN 1617-4917. Article in journal (Refereed)
    Abstract [en]

    One of the commonly used visualization techniques for multivariate data is the parallel coordinates plot. It provides users with a visual overview of multivariate data and the possibility to interactively explore it. While pattern recognition is a strength of the human visual system, it is also a strength of the auditory system. Inspired by the integration of the visual and auditory perception in everyday life, we introduce an audio-visual analytics design named Parallel Chords combining both visual and auditory displays. Parallel Chords lets users explore multivariate data using both visualization and sonification through the interaction with the axes of a parallel coordinates plot. To illustrate the potential of the design, we present (1) prototypical data patterns where the sonification helps with the identification of correlations, clusters, and outliers, (2) a usage scenario showing the sonification of data from non-adjacent axes, and (3) a controlled experiment on the sensitivity thresholds of participants when distinguishing the strength of correlations. During this controlled experiment, 35 participants used three different display types, the visualization, the sonification, and the combination of these, to identify the strongest out of three correlations. The results show that all three display types enabled the participants to identify the strongest correlation — with visualization resulting in the best sensitivity. The sonification resulted in sensitivities that were independent from the type of displayed correlation, and the combination resulted in increased enjoyability during usage.

    Download full text (pdf)
    fulltext
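    The Parallel Chords abstract above describes letting the correlation strength between two axes become audible. As a minimal sketch of that general idea (not the published design; the mapping, parameter names, and constants below are assumptions), one could map the absolute Pearson correlation between two axes to the pitch stability of a short synthesized tone:

```python
# Minimal sketch (an assumption, not the published Parallel Chords design):
# sonify the correlation between two parallel-coordinate axes by mapping
# |Pearson r| to the pitch stability of a short synthesized tone.
import numpy as np

def correlation_to_tone(x, y, sample_rate=44100, duration=1.0, base_freq=440.0):
    """Return a mono audio buffer whose pitch jitter shrinks as |r| grows."""
    r = np.corrcoef(x, y)[0, 1]              # Pearson correlation of the two axes
    jitter_depth = (1.0 - abs(r)) * 30.0     # Hz of random detuning; 0 when |r| = 1
    n = int(sample_rate * duration)
    t = np.arange(n) / sample_rate
    # Slowly varying random detuning curve (illustrative mapping choice).
    detune = np.interp(t, np.linspace(0.0, duration, 32),
                       np.random.uniform(-jitter_depth, jitter_depth, 32))
    phase = 2.0 * np.pi * np.cumsum(base_freq + detune) / sample_rate
    return 0.5 * np.sin(phase)

# Example: a strongly correlated axis pair yields a nearly steady 440 Hz tone.
x = np.random.randn(200)
tone = correlation_to_tone(x, 0.9 * x + 0.1 * np.random.randn(200))
```

    With such a mapping, strongly correlated axes sound like a steady tone while uncorrelated axes produce audible pitch jitter.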
  • 2.
    Falk, Martin
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Swedish e-Science Research Centre (SeRC), Sweden.
    Tobiasson, Victor
    Science for Life Laboratory, Department of Biochemistry and Biophysics, Stockholm University, Solna, Sweden.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Swedish e-Science Research Centre (SeRC), Sweden.
    Hansen, Charles
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Kahlert School of Computing, University of Utah.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Swedish e-Science Research Centre (SeRC), Sweden.
    A Visual Environment for Data Driven Protein Modeling and Validation, 2023. In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506. Article in journal (Refereed)
    Abstract [en]

    In structural biology, validation and verification of new atomic models are crucial and necessary steps which limit the production of reliable molecular models for publications and databases. An atomic model is the result of meticulous modeling and matching and is evaluated using a variety of metrics that provide clues to improve and refine the model so it fits our understanding of molecules and physical constraints. In cryo electron microscopy (cryo-EM) the validation is also part of an iterative modeling process in which there is a need to judge the quality of the model during the creation phase. A shortcoming is that the process and results of the validation are rarely communicated using visual metaphors.

    This work presents a visual framework for molecular validation. The framework was developed in close collaboration with domain experts in a participatory design process. Its core is a novel visual representation based on 2D heatmaps that shows all available validation metrics in a linear fashion, presenting a global overview of the atomic model and providing domain experts with interactive analysis tools. Additional information stemming from the underlying data, such as a variety of local quality measures, is used to guide the user's attention toward regions of higher relevance. Linked with the heatmap is a three-dimensional molecular visualization providing the spatial context of the structures and chosen metrics. Additional views of statistical properties of the structure are included in the visual framework. We demonstrate the utility of the framework and its visual guidance with examples from cryo-EM.

    Download full text (pdf)
    fulltext
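    The framework's core, as described above, is a linear 2D heatmap that lines up all validation metrics for the whole model. A minimal illustrative sketch of such an overview, with placeholder metric names and random values (none of them taken from the paper), could be drawn as follows:

```python
# Illustrative sketch only: a per-residue validation-metric heatmap in the
# spirit of the linear overview described above. Metric names and values are
# placeholders, not data or terminology from the paper.
import numpy as np
import matplotlib.pyplot as plt

metrics = ["local resolution", "map-model fit", "clash score", "geometry outliers"]
n_residues = 120
values = np.random.rand(len(metrics), n_residues)   # placeholder metric values

fig, ax = plt.subplots(figsize=(10, 2.5))
im = ax.imshow(values, aspect="auto", cmap="viridis")
ax.set_yticks(range(len(metrics)))
ax.set_yticklabels(metrics)
ax.set_xlabel("residue index")
fig.colorbar(im, ax=ax, label="normalized metric value")
fig.tight_layout()
plt.show()
```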
  • 3.
    Broman, Emma
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Faherty, Jacqueline
    Amer Museum Nat Hist, NY USA.
    Kreidberg, Laura
    Max Planck Inst Astron, Germany.
    Zieba, Sebastian
    Max Planck Inst Astron, Germany.
    Hansen, Charles
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Univ Utah, UT USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    ExoplanetExplorer: Contextual Visualization of Exoplanet Systems, 2023. In: 2023 IEEE VISUALIZATION AND VISUAL ANALYTICS, VIS, IEEE COMPUTER SOC, 2023, pp. 81-85. Conference paper (Refereed)
    Abstract [en]

    An exoplanet is a planet outside of our solar system. Researchers study known exoplanets and gather data about them through observations and derived data. Ongoing efforts involve finding planets with an environment that supports life, which likely exists in what is known as the habitable zone around a star. Through a participatory design process, we developed a tool that enables the exploration of exoplanet attribute data and provides contextual visual information in a 3D spatial view that seamlessly presents

  • 4.
    Brossier, Mathis
    et al.
    Univ Paris Saclay, France.
    Skånberg, Robin
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Besançon, Lonni
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Linares, Mathieu
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Isenberg, Tobias
    Univ Paris Saclay, France.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Moliverse: Contextually embedding the microcosm into the universe, 2023. In: Computers & graphics, ISSN 0097-8493, E-ISSN 1873-7684, Vol. 112, pp. 22-30. Article in journal (Refereed)
    Abstract [en]

    We present Moliverse, an integration of the molecular visualization framework VIAMD into the astronomical visualization software OpenSpace, allowing us to bridge the two extreme ends of the scale spectrum to show, for example, the gas composition in a planet's atmosphere or molecular structures in comet trails, which can empower the creation of educational exhibitions. For that purpose we do not use a linear scale traversal but break the scale continuity and show molecular simulations as focus in the context of celestial bodies. We demonstrate the application of our concept in two storytelling scenarios and envision the application both for science presentations to lay audiences and for dedicated exploration, potentially also in a molecule-only environment.

    Download full text (pdf)
    fulltext
  • 5.
    Elmquist, Elias
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Lundberg, Jonas
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    Rönnberg, Niklas
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    SonAir: the design of a sonification of radar data for air traffic control, 2023. In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 17, no. 3, pp. 137-149. Article in journal (Refereed)
    Abstract [en]

    Along with the increase of digitalization and automation, a new kind of working environment is emerging in the field of air traffic control. Instead of situating the control tower at the airport, it is now possible to remotely control the airport at any given location, i.e. in a remote tower center (RTC). However, by controlling the airport remotely, the situational awareness and sense of presence might be compromised. By using directional sound, a higher situational awareness could potentially be achieved while also offloading the visual perception which is heavily used in air traffic control. Suitable use cases for sonification in air traffic control were found through workshops with air traffic controllers. A sonification design named SonAir was developed based on the outcome of the workshops, and was integrated with an RTC simulator for evaluating to what degree SonAir could support air traffic controllers in their work. The results suggest that certain aspects of SonAir could be useful for air traffic controllers. A continuous sonification where the spatial positioning of aircraft were conveyed was experienced to be partially useful, but the intrusiveness of SonAir should be further considered to fit the air traffic controllers’ needs. An earcon that conveyed when an aircraft enters the airspace and from which direction was considered useful to support situational awareness.

    Download full text (pdf)
    fulltext
  • 6.
    Dieckmann, Mark E
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Folini, D.
    Univ Lyon, France.
    Falk, Martin
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Steneteg, Peter
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Walder, R.
    Univ Lyon, France.
    Three-dimensional structure and stability of discontinuities between unmagnetized pair plasma and magnetized electron-proton plasma, 2023. In: New Journal of Physics, E-ISSN 1367-2630, Vol. 25, no. 6, article id 063017. Article in journal (Refereed)
    Abstract [en]

    We study with a 3D particle-in-cell simulation discontinuities between an electron-positron pair plasma and magnetized electrons and protons. A pair plasma is injected at one simulation boundary with a speed 0.6c along its normal. It expands into an electron-proton plasma and a magnetic field that points orthogonally to the injection direction. Diamagnetic currents expel the magnetic field from within the pair plasma and pile it up in front of it. It pushes electrons, which induces an electric field pulse ahead of the magnetic one. This initial electromagnetic pulse (EMP) confines the pair plasma magnetically and accelerates protons electrically. The fast flow of the injected pair plasma across the protons behind the initial EMP triggers the filamentation instability. Some electrons and positrons cross the injection boundary and build up a second EMP. Electron-cyclotron drift instabilities perturb the plasma ahead of both EMPs seeding a Rayleigh-Taylor (RT)-type instability. Despite equally strong perturbations ahead of both EMPs, the second EMP is much more stable than the initial one. We attribute the rapid collapse of the initial EMP to the filamentation instability, which perturbed the plasma behind it. The RT-type instability transforms the planar EMPs into transition layers, in which magnetic flux ropes and electrostatic forces due to uneven numbers of electrons and positrons slow down and compress the pair plasma and accelerate protons. In our simulation, the expansion speed of the pair cloud decreased by about an order of magnitude and its density increased by the same factor. Its small thickness implies that it is capable of separating a relativistic pair outflow from an electron-proton plasma, which is essential for collimating relativistic jets of pair plasma in collisionless astrophysical plasma.

    Download full text (pdf)
    fulltext
  • 7.
    Costa, Jonathas
    et al.
    NYU, NY 10003 USA.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Univ Utah, UT 84112 USA.
    Emmart, Carter
    Amer Museum Nat Hist, NY 10024 USA.
    Hansen, Charles
    Univ Utah, UT 84112 USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV. Univ Utah, UT 84112 USA.
    Silva, Claudio
    NYU, NY 10003 USA.
    Interactive Visualization of Atmospheric Effects for Celestial Bodies, 2021. In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 27, no. 2, pp. 785-795. Article in journal (Refereed)
    Abstract [en]

    We present an atmospheric model tailored for the interactive visualization of planetary surfaces. As the exploration of the solar system is progressing with increasingly accurate missions and instruments, the faithful visualization of planetary environments is gaining increasing interest in space research, mission planning, and science communication and education. Atmospheric effects are crucial in data analysis and to provide contextual information for planetary data. Our model correctly accounts for the non-linear path of the light inside the atmosphere (in Earth's case), the light absorption effects by molecules and dust particles, such as the ozone layer and the Martian dust, and a wavelength-dependent phase function for Mie scattering. The model focuses on interactivity, versatility, and customization, and a comprehensive set of interactive controls make it possible to adapt its appearance dynamically. We demonstrate our results using Earth and Mars as examples. However, it can be readily adapted for the exploration of other atmospheres found on, for example, exoplanets. For Earth's atmosphere, we visually compare our results with pictures taken from the International Space Station and against the CIE clear sky model. The Martian atmosphere is reproduced based on available scientific data, feedback from domain experts, and is compared to images taken by the Curiosity rover. The work presented here has been implemented in the OpenSpace system, which enables interactive parameter setting and real-time feedback visualization targeting presentations in a wide range of environments, from immersive dome theaters to virtual reality headsets.

    Download full text (pdf)
    fulltext
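    The atmosphere paper above accounts for wavelength-dependent absorption and a wavelength-dependent phase function for Mie scattering. The published model is considerably more elaborate; as a hedged sketch of two standard building blocks that such renderers commonly use, the Henyey-Greenstein phase approximation and the classic lambda^-4 Rayleigh dependence look like this (the constants are assumptions, not values from the paper):

```python
# Standard building blocks often used in atmospheric rendering; the published
# model is more elaborate (ozone and Martian-dust absorption, a wavelength-
# dependent Mie phase function, non-linear light paths).
import numpy as np

def henyey_greenstein(cos_theta, g=0.76):
    """Common approximation of the Mie phase function; g is the mean scattering cosine."""
    return (1.0 - g * g) / (4.0 * np.pi * (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5)

def rayleigh_coefficient(wavelength_nm, beta_550=5.8e-6):
    """Rayleigh scattering coefficient scaled by the classic lambda^-4 law.
    beta_550 is an assumed sea-level value at 550 nm, per meter."""
    return beta_550 * (550.0 / wavelength_nm) ** 4

# Example: blue light (440 nm) scatters several times more strongly than red (680 nm),
# which is what tints the sky blue and sunsets red.
print(rayleigh_coefficient(440.0) / rayleigh_coefficient(680.0))
```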
  • 8.
    Elmquist, Elias
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Ejdbo, Malin
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Rönnberg, Niklas
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    OpenSpace Sonification: Complementing Visualization of the Solar System with Sound, 2021. In: Proceedings of the 26th International Conference on Auditory Display (ICAD 2021) / [ed] Areti Andreopoulou, Milena Droumeva, Joseph W. Newbold, Kyla McMullen and Paul Vickers, The International Community for Auditory Display, 2021, pp. 135-142. Conference paper (Refereed)
    Abstract [en]

    Data visualization software is commonly used to explore outer space in a planetarium environment, where the visuals of the software are typically accompanied by a narrator and supplementary background music. By letting sound take a bigger role in these kinds of presentations, a more informative and immersive experience can be achieved. The aim of the present study was to explore how sonification can be used as a complement to the visualization software OpenSpace to convey information about the Solar System, as well as increasing the perceived immersiveness for the audience in a planetarium environment. This was investigated by implementing a sonification that conveyed planetary properties, such as the size and orbital period of a planet, by mapping this data to sonification parameters. With a user-centered approach, the sonification was designed iteratively and evaluated in both an online and planetarium environment. The results of the evaluations show that the participants found the sonification informative and interesting, which suggests that sonification can be beneficially used as a complement to visualization in a planetarium environment.

    Download full text (pdf)
    fulltext
  • 9.
    Lan, Fangfei
    et al.
    Univ Utah, UT 84112 USA.
    Young, Michael
    Univ Utah, UT 84112 USA.
    Anderson, Lauren
    Carnegie Inst Sci, DC 20005 USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Univ Utah, UT 84112 USA.
    Borkin, Michelle A.
    Northeastern Univ, MA 02115 USA.
    Forbes, Angus G.
    Univ Calif Santa Cruz, CA 95064 USA.
    Kollmeier, Juna A.
    Carnegie Inst Sci, DC 20005 USA.
    Wang, Bei
    Univ Utah, UT 84112 USA; Univ Utah, UT 84112 USA.
    Visualization in Astrophysics: Developing New Methods, Discovering Our Universe, and Educating the Earth, 2021. In: Computer graphics forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 40, no. 3, pp. 635-663. Article in journal (Refereed)
    Abstract [en]

    We present a state-of-the-art report on visualization in astrophysics. We survey representative papers from both astrophysics and visualization and provide a taxonomy of existing approaches based on data analysis tasks. The approaches are classified based on five categories: data wrangling, data exploration, feature identification, object reconstruction, as well as education and outreach. Our unique contribution is to combine the diverse viewpoints from both astronomers and visualization experts to identify challenges and opportunities for visualization in astrophysics. The main goal is to provide a reference point to bring modern data analysis and visualization techniques to the rich datasets in astrophysics.

    Download full text (pdf)
    fulltext
  • 10.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. University of Utah, USA.
    Axelsson, Emil
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Costa, Jonathas
    New York University, USA.
    Payne, Gene
    University of Utah, USA.
    Acinapura, Micah
    American Museum of Natural History, USA.
    Trakinski, Vivian
    American Museum of Natural History, USA.
    Emmart, Carter
    American Museum of Natural History, USA.
    Silva, Cláudio
    New York University, USA.
    Hansen, Charles
    University of Utah, USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV. University of Utah, USA.
    OpenSpace: A System for Astrographics, 2020. In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 26, no. 1, pp. 633-642. Article in journal (Refereed)
    Abstract [en]

    Human knowledge about the cosmos is rapidly increasing as instruments and simulations are generating new data supporting the formation of theory and understanding of the vastness and complexity of the universe. OpenSpace is a software system that takes on the mission of providing an integrated view of all these sources of data and supports interactive exploration of the known universe from the millimeter scale showing instruments on spacecraft to billions of light years when visualizing the early universe. The ambition is to support research in astronomy and space exploration, science communication at museums and in planetariums as well as bringing exploratory astrographics to the classroom. There is a multitude of challenges that need to be met in reaching this goal such as the data variety, multiple spatio-temporal scales, collaboration capabilities, etc. Furthermore, the system has to be flexible and modular to enable rapid prototyping and inclusion of new research results or space mission data and thereby shorten the time from discovery to dissemination. To support the different use cases the system has to be hardware agnostic and support a range of platforms and interaction paradigms. In this paper we describe how OpenSpace meets these challenges in an open source effort that is paving the path for the next generation of interactive astrographics.

    Download full text (pdf)
    OpenSpace: A System for Astrographics
  • 11.
    Williams, Francis
    et al.
    NYU, NY 10003 USA.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Doraiswamy, Harish
    NYU, NY 10003 USA.
    Donatelli, Cassandra
    Tufts Univ, MA 02155 USA.
    Hall, Kayla
    Univ Washington, WA 98195 USA.
    Summers, Adam
    Univ Washington, WA 98195 USA.
    Panozzo, Daniele
    NYU, NY 10003 USA.
    Silva, Claudio T.
    NYU, NY 10003 USA.
    Unwind: Interactive Fish Straightening, 2020. In: PROCEEDINGS OF THE 2020 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI20), ASSOC COMPUTING MACHINERY, 2020. Conference paper (Refereed)
    Abstract [en]

    The ScanAllFish project is a large-scale effort to scan all the world's 33,100 known species of fishes. It has already generated thousands of volumetric CT scans of fish species which are available on open access platforms such as the Open Science Framework. To achieve a scanning rate required for a project of this magnitude, many specimens are grouped together into a single tube and scanned all at once. The resulting data contain many fish which are often bent and twisted to fit into the scanner. Our system, Unwind, is a novel interactive visualization and processing tool which extracts, unbends, and untwists volumetric images of fish with minimal user interaction. Our approach enables scientists to interactively unwarp these volumes to remove the undesired torque and bending using a piecewise-linear skeleton extracted by averaging iso-surfaces of a harmonic function connecting the head and tail of each fish. The result is a volumetric dataset of an individual, straight fish in a canonical pose defined by the marine biologist expert user. We have developed Unwind in collaboration with a team of marine biologists: Our system has been deployed in their labs, and is presently being used for dataset construction, biomechanical analysis, and the generation of figures for scientific publication.
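    The Unwind abstract describes extracting a piecewise-linear skeleton by averaging iso-surfaces of a harmonic function connecting the head and tail of each fish. A heavily simplified voxel-grid sketch of that idea, using plain Jacobi relaxation and iso-level band centroids (grid setup, seeds, and iteration counts are illustrative assumptions, not the authors' implementation), could look as follows:

```python
# Simplified sketch of the skeleton idea described above: solve for a harmonic
# function u (Laplace's equation) with u = 0 at "head" voxels and u = 1 at
# "tail" voxels, then use centroids of iso-level bands as skeleton joints.
import numpy as np

def harmonic_skeleton(mask, head, tail, n_joints=20, iters=500):
    """mask: boolean 3D array of fish voxels; head/tail: boolean seed arrays."""
    u = np.full(mask.shape, 0.5)
    u[head], u[tail] = 0.0, 1.0
    interior = mask & ~head & ~tail
    for _ in range(iters):                         # Jacobi relaxation
        avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) +
               np.roll(u, 1, 2) + np.roll(u, -1, 2)) / 6.0
        u = np.where(interior, avg, u)             # keep seed values fixed
    coords = np.argwhere(mask).astype(float)
    values = u[mask]
    levels = np.linspace(0.0, 1.0, n_joints + 1)
    joints = []
    for lo, hi in zip(levels[:-1], levels[1:]):
        band = (values >= lo) & (values < hi)
        if band.any():
            joints.append(coords[band].mean(axis=0))   # centroid of one iso-band
    return np.array(joints)                            # piecewise-linear skeleton
```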

  • 12.
    Bladin, Kalle
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Axelsson, Emil
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Broberg, Erik
    Linköpings universitet, Tekniska fakulteten.
    Emmart, Carter
    Amer Museum Nat Hist, NY 10024 USA.
    Ljung, Patric
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap. Linköpings universitet, Tekniska fakulteten. NYU, NY 10003 USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Globe Browsing: Contextualized Spatio-Temporal Planetary Surface Visualization, 2018. In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 24, no. 1, pp. 802-811. Article in journal (Refereed)
    Abstract [en]

    Results of planetary mapping are often shared openly for use in scientific research and mission planning. In its raw format, however, the data is not accessible to non-experts due to the difficulty in grasping the context and the intricate acquisition process. We present work on tailoring and integration of multiple data processing and visualization methods to interactively contextualize geospatial surface data of celestial bodies for use in science communication. As our approach handles dynamic data sources, streamed from online repositories, we are significantly shortening the time between discovery and dissemination of data and results. We describe the image acquisition pipeline, the pre-processing steps to derive a 2.5D terrain, and a chunked level-of-detail, out-of-core rendering approach to enable interactive exploration of global maps and high-resolution digital terrain models. The results are demonstrated for three different celestial bodies. The first case addresses high-resolution map data on the surface of Mars. A second case shows dynamic processes, such as concurrent weather conditions on Earth, that require temporal datasets. As a final example we use data from the New Horizons spacecraft which acquired images during a single flyby of Pluto. We visualize the acquisition process as well as the resulting surface data. Our work has been implemented in the OpenSpace software [8], which enables interactive presentations in a range of environments such as immersive dome theaters, interactive touch tables, and virtual reality headsets.

    Download full text (pdf)
    fulltext
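    The Globe Browsing abstract mentions a chunked level-of-detail, out-of-core rendering approach for planetary terrain. One common way to drive such a scheme is to refine quadtree chunks until their projected geometric error drops below a pixel tolerance; the sketch below illustrates that selection step only, and the error heuristic and constants are assumptions rather than the paper's exact criterion:

```python
# Minimal quadtree chunk-selection sketch in the spirit of the chunked LOD
# approach described above; the error heuristic and constants are assumptions.
import math

def split(chunk):
    """Create the four child chunks of a quadtree node (flat-terrain sketch)."""
    x, y, z = chunk["center"]
    s = chunk["size"] / 2.0
    offsets = [(-s / 2, -s / 2), (-s / 2, s / 2), (s / 2, -s / 2), (s / 2, s / 2)]
    return [{"center": (x + ox, y + oy, z), "size": s, "level": chunk["level"] + 1}
            for ox, oy in offsets]

def select_chunks(chunk, camera_pos, fov_y, viewport_h,
                  max_pixel_error=2.0, max_level=18, out=None):
    """Recursively refine chunks whose projected geometric error is too large.

    chunk: dict with 'center' (x, y, z), 'size' (meters), and 'level'.
    """
    if out is None:
        out = []
    dx, dy, dz = (c - p for c, p in zip(chunk["center"], camera_pos))
    distance = max(math.sqrt(dx * dx + dy * dy + dz * dz), 1.0)
    geometric_error = chunk["size"] / 128.0        # assumed per-chunk error in meters
    # Projected size of that error in pixels for a vertical field of view fov_y.
    pixels = geometric_error * viewport_h / (2.0 * distance * math.tan(fov_y / 2.0))
    if pixels <= max_pixel_error or chunk["level"] >= max_level:
        out.append(chunk)                          # coarse enough: render as-is
    else:
        for child in split(chunk):                 # otherwise refine into four children
            select_chunks(child, camera_pos, fov_y, viewport_h,
                          max_pixel_error, max_level, out)
    return out

# Example: select chunks for an Earth-sized root tile seen from 7,000 km altitude.
root = {"center": (0.0, 0.0, 0.0), "size": 4.0e7, "level": 0}
chunks = select_chunks(root, camera_pos=(0.0, 0.0, 7.0e6),
                       fov_y=math.radians(60.0), viewport_h=1080)
print(len(chunks))
```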
  • 13.
    Bock, Alexander
    et al.
    NYU, USA.
    Hansen, Charles
    Univ Utah, UT 84112 USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    OpenSpace: Bringing NASA Missions to the Public, 2018. In: IEEE Computer Graphics and Applications, ISSN 0272-1716, E-ISSN 1558-1756, Vol. 38, no. 5, pp. 112-118. Article in journal (Refereed)
    Abstract [en]

    This viewpoint presents OpenSpace, an open-source astrovisualization software project designed to bridge the gap between scientific discoveries and their public dissemination. A wealth of data exists for space missions from NASA and other sources. OpenSpace brings together this data and combines it in a range of immersive settings. Through non-linear storytelling and guided exploration, interactive immersive experiences help the public to engage with advanced space mission data and models, and thus be better informed and educated about NASA missions, the solar system and outer space. We demonstrate this capability by exploring the OSIRIS-Rex mission.

    Download full text (pdf)
    fulltext
  • 14.
    Bock, Alexander
    et al.
    NYU, USA.
    Axelsson, Emil
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Emmart, Carter
    Amer Museum Nat Hist, NY 10024 USA.
    Kuznetsova, Masha
    Community Coordinated Modeling Ctr, MD USA.
    Hansen, Charles
    Univ Utah, UT 84112 USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Univ Utah, UT 84112 USA.
    OpenSpace: Changing the Narrative of Public Dissemination in Astronomical Visualization from What to How, 2018. In: IEEE Computer Graphics and Applications, ISSN 0272-1716, E-ISSN 1558-1756, Vol. 38, no. 3, pp. 44-57. Article in journal (Refereed)
    Abstract [en]

    This article presents the development of open-source software called OpenSpace that bridges the gap between scientific discoveries and public dissemination and thus paves the way for the next generation of science communication and data exploration. The article describes how the platform enables interactive presentations of dynamic and time-varying processes by domain experts to the general public. The concepts are demonstrated through four cases: Image acquisitions of the New Horizons and Rosetta spacecraft, the dissemination of space weather phenomena, and the display of high-resolution planetary images. Each case has been presented at public events with great success. These cases highlight the details of data acquisition, rather than presenting the final results, showing the audience the value of supporting the efforts of the scientific discovery.

  • 15.
    Bock, Alexander
    Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik.
    Tailoring visualization applications for tasks and users, 2018. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Exponential increases in available computational resources over the recent decades have fueled an information explosion in almost every scientific field. This has led to a societal change shifting from an information-poor research environment to an over-abundance of information. As many of these cases involve too much information to directly comprehend, visualization proves to be an effective tool to gain insight into these large datasets. While visualization has been used since the beginning of mankind, its importance is only increasing as the exponential information growth widens the difference between the amount of gathered data and the relatively constant human ability to ingest information. Visualization, as a methodology and tool of transforming complex data into an intuitive visual representation can leverage the combined computational resources and the human cognitive capabilities in order to mitigate this growing discrepancy.

    A large portion of visualization research, directly or indirectly, targets users in an application domain, such as medicine, biology, physics, or others. Applied research is aimed at the creation of visualization applications or systems that solve a specific problem within the domain. Combining prior research and applying it to a concrete problem makes it possible to compare and determine the usability and usefulness of existing visualization techniques. These applications can only be effective when the domain experts are closely involved in the design process, leading to an iterative workflow that informs their form and function. These visualization solutions can be separated into three categories: Exploration, in which users perform an initial study of data, Analysis, in which an established technique is repeatedly applied to a large number of datasets, and Communication, in which findings are published to a wider public audience.

    This thesis presents five examples of application development in finite element modeling, medicine, urban search & rescue, and astronomy and astrophysics. For the finite element modeling, an exploration tool for simulations of stress tensors in a human heart uses a compression method to achieve interactive frame rates. In the medical domain, an analysis system aimed at guiding surgeons during Deep Brain Stimulation interventions fuses multiple modalities in order to improve their outcome. A second analysis application is targeted at the Urban Search & Rescue community, supporting the extraction of injured victims and enabling a more sophisticated decision making strategy. For the astronomical domain, first, an exploration application enables the analysis of time-varying volumetric plasma simulations to improve these simulations and thus better predict space weather. A final system focuses on combining all three categories into a single application that enables the same tools to be used for Exploration, Analysis, and Communication, thus requiring the handling of large coordinate systems, and high-fidelity rendering of planetary surfaces and spacecraft operations.

    List of papers
    1. Coherency-Based Curve Compression for High-Order Finite Element Model Visualization
    2012 (English). In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 18, no. 12, pp. 2315-2324. Article in journal (Refereed), Published
    Abstract [en]

    Finite element (FE) models are frequently used in engineering and life sciences within time-consuming simulations. In contrast with the regular grid structure facilitated by volumetric data sets, as used in medicine or geosciences, FE models are defined over a non-uniform grid. Elements can have curved faces and their interior can be defined through high-order basis functions, which pose additional challenges when visualizing these models. During ray-casting, the uniformly distributed sample points along each viewing ray must be transformed into the material space defined within each element. The computational complexity of this transformation makes a straightforward approach inadequate for interactive data exploration. In this paper, we introduce a novel coherency-based method which supports the interactive exploration of FE models by decoupling the expensive world-to-material space transformation from the rendering stage, thereby allowing it to be performed within a precomputation stage. Therefore, our approach computes view-independent proxy rays in material space, which are clustered to facilitate data reduction. During rendering, these proxy rays are accessed, and it becomes possible to visually analyze high-order FE models at interactive frame rates, even when they are time-varying or consist of multiple modalities. Within this paper, we provide the necessary background about the FE data, describe our decoupling method, and introduce our interactive rendering algorithm. Furthermore, we provide visual results and analyze the error introduced by the presented approach.

    Place, publisher, year, edition, pages
    Institute of Electrical and Electronics Engineers (IEEE), 2012
    Keywords
    Finite element visualization, GPU-based ray-casting
    National subject category
    Engineering and Technology
    Identifiers
    urn:nbn:se:liu:diva-86633 (URN) 10.1109/TVCG.2012.206 (DOI) 000310143100035 ()
    Note

    Funding agencies: Swedish Research Council (VR) 2011-4113; Excellence Center at Linkoping and Lund in Information Technology (ELLIIT); Swedish e-Science Research Centre (SeRC)

    Available from: 2012-12-20 Created: 2012-12-20 Last updated: 2018-05-21
    2. Guiding Deep Brain Stimulation Interventions by Fusing Multimodal Uncertainty Regions
    2013 (English). Conference paper, Published paper (Other academic)
    Abstract [en]

    Deep Brain Stimulation (DBS) is a surgical intervention that is known to reduce or eliminate the symptoms of common movement disorders, such as Parkinson's disease, dystonia, or tremor. During the intervention the surgeon places electrodes inside of the patient's brain to stimulate specific regions. Since these regions span only a couple of millimeters, and electrode misplacement has severe consequences, reliable and accurate navigation is of great importance. Usually the surgeon relies on fused CT and MRI data sets, as well as direct feedback from the patient. More recently Microelectrode Recordings (MER), which support navigation by measuring the electric field of the patient's brain, are also used. We propose a visualization system that fuses the different modalities: imaging data, MER and patient checks, as well as the related uncertainties, in an intuitive way to present placement-related information in a consistent view with the goal of supporting the surgeon in the final placement of the stimulating electrode. We will describe the design considerations for our system, the technical realization, present the outcome of the proposed system, and provide an evaluation.

    Place, publisher, year, edition, pages
    IEEE conference proceedings, 2013
    National subject category
    Computer Sciences
    Identifiers
    urn:nbn:se:liu:diva-92857 (URN) 10.1109/PacificVis.2013.6596133 (DOI) 000333746600013 () 9781467347976 (ISBN)
    Conference
    IEEE Pacific Visualization, 26 February - 1 March 2013, Sydney, Australia
    Research funder
    ELLIIT - The Linköping-Lund Initiative on IT and Mobile Communications; Swedish e-Science Research Center; Vetenskapsrådet, 2011-4113
    Available from: 2013-05-27 Created: 2013-05-27 Last updated: 2018-05-21
    3. Supporting Urban Search & Rescue Mission Planning through Visualization-Based Analysis
    2014 (English). In: Proceedings of the Vision, Modeling, and Visualization Conference 2014, Eurographics - European Association for Computer Graphics, 2014. Conference paper, Published paper (Refereed)
    Abstract [en]

    We propose a visualization system for incident commanders in urban search & rescue scenarios that supports access path planning for post-disaster structures. Utilizing point cloud data acquired from unmanned robots, we provide methods for assessment of automatically generated paths. As data uncertainty and a priori unknown information make fully automated systems impractical, we present a set of viable access paths, based on varying risk factors, in a 3D environment combined with the visual analysis tools enabling informed decisions and trade-offs. Based on these decisions, a responder is guided along the path by the incident commander, who can interactively annotate and reevaluate the acquired point cloud to react to the dynamics of the situation. We describe design considerations for our system, technical realizations, and discuss the results of an expert evaluation.

    Place, publisher, year, edition, pages
    Eurographics - European Association for Computer Graphics, 2014
    National subject category
    Computer Sciences
    Identifiers
    urn:nbn:se:liu:diva-117772 (URN) 10.2312/vmv.20141275 (DOI) 978-3-905674-74-3 (ISBN)
    Conference
    Vision, Modeling, and Visualization
    Project
    ELLIIT; VR; SeRC
    Research funder
    ELLIIT - The Linköping-Lund Initiative on IT and Mobile Communications; Swedish e-Science Research Center; Vetenskapsrådet, 2011-4113
    Available from: 2015-05-08 Created: 2015-05-08 Last updated: 2018-05-21 Bibliographically approved
    4. An interactive visualization system for urban search & rescue mission planning
    2014 (English). In: 12th IEEE International Symposium on Safety, Security and Rescue Robotics, SSRR 2014 - Symposium Proceedings, Institute of Electrical and Electronics Engineers Inc., 2014, no. 7017652. Conference paper, Published paper (Refereed)
    Abstract [en]

    We present a visualization system for incident commanders in urban search and rescue scenarios that supports the inspection and access path planning in post-disaster structures. Utilizing point cloud data acquired from unmanned robots, the system allows for assessment of automatically generated paths, whose computation is based on varying risk factors, in an interactive 3D environment increasing immersion. The incident commander interactively annotates and reevaluates the acquired point cloud based on live feedback. We describe design considerations, technical realization, and discuss the results of an expert evaluation that we conducted to assess our system.

    Place, publisher, year, edition, pages
    Institute of Electrical and Electronics Engineers Inc., 2014
    Series
    12th IEEE International Symposium on Safety, Security and Rescue Robotics, SSRR 2014 - Symposium Proceedings
    National subject category
    Electrical Engineering, Electronic Engineering
    Identifiers
    urn:nbn:se:liu:diva-116761 (URN) 10.1109/SSRR.2014.7017652 (DOI) 2-s2.0-84923174457 (Scopus ID) 9781479941995 (ISBN)
    Conference
    12th IEEE International Symposium on Safety, Security and Rescue Robotics, SSRR 2014
    Available from: 2015-04-02 Created: 2015-04-02 Last updated: 2018-05-21
    5. A Visualization-Based Analysis System for Urban Search & Rescue Mission Planning Support
    2017 (English). In: Computer graphics forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 36, no. 6, pp. 148-159. Article in journal (Refereed), Published
    Abstract [en]

    We propose a visualization system for incident commanders (ICs) in urban search and rescue scenarios that supports path planning in post-disaster structures. Utilizing point cloud data acquired from unmanned robots, we provide methods for the assessment of automatically generated paths. As data uncertainty and a priori unknown information make fully automated systems impractical, we present the IC with a set of viable access paths, based on varying risk factors, in a 3D environment combined with visual analysis tools enabling informed decision making and trade-offs. Based on these decisions, a responder is guided along the path by the IC, who can interactively annotate and reevaluate the acquired point cloud and generated paths to react to the dynamics of the situation. We describe visualization design considerations for our system and decision support systems in general, technical realizations of the visualization components, and discuss the results of two qualitative expert evaluations: one online study with nine search and rescue experts and an eye-tracking study in which four experts used the system on an application case.

    Place, publisher, year, edition, pages
    WILEY, 2017
    Keywords
    urban search and rescue decision support application
    National subject category
    Computer Sciences
    Identifiers
    urn:nbn:se:liu:diva-140952 (URN) 10.1111/cgf.12869 (DOI) 000408634200009 ()
    Note

    Funding agencies: Excellence Center at Linkoping and Lund in Information Technology; Swedish e-Science Research Centre; VR grant [2011-4113]

    Available from: 2017-09-19 Created: 2017-09-19 Last updated: 2020-12-22
    6. Visual Verification of Space Weather Ensemble Simulations
    2015 (English). In: 2015 IEEE Scientific Visualization Conference (SciVis), IEEE, 2015, pp. 17-24. Conference paper, Published paper (Refereed)
    Abstract [en]

    We propose a system to analyze and contextualize simulations of coronal mass ejections. As current simulation techniques require manual input, uncertainty is introduced into the simulation pipeline leading to inaccurate predictions that can be mitigated through ensemble simulations. We provide the space weather analyst with a multi-view system providing visualizations to: 1. compare ensemble members against ground truth measurements, 2. inspect time-dependent information derived from optical flow analysis of satellite images, and 3. combine satellite images with a volumetric rendering of the simulations. This three-tier workflow provides experts with tools to discover correlations between errors in predictions and simulation parameters, thus increasing knowledge about the evolution and propagation of coronal mass ejections that pose a danger to Earth and interplanetary travel

    Place, publisher, year, edition, pages
    IEEE, 2015
    National subject category
    Computer Sciences; Computer Vision and Robotics (Autonomous Systems)
    Identifiers
    urn:nbn:se:liu:diva-128037 (URN) 10.1109/SciVis.2015.7429487 (DOI) 000380564400003 () 978-1-4673-9785-8 (ISBN)
    Conference
    2015 IEEE Scientific Visualization Conference
    Available from: 2016-05-16 Created: 2016-05-16 Last updated: 2018-07-19
    7. Dynamic Scene Graph: Enabling Scaling, Positioning, and Navigation in the Universe
    2017 (English). In: Computer graphics forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 36, no. 3, pp. 459-468. Article in journal (Refereed), Published
    Abstract [en]

    In this work, we address the challenge of seamlessly visualizing astronomical data exhibiting huge scale differences in distance, size, and resolution. One of the difficulties is accurate, fast, and dynamic positioning and navigation to enable scaling over orders of magnitude, far beyond the precision of floating point arithmetic. To this end we propose a method that utilizes a dynamically assigned frame of reference to provide the highest possible numerical precision for all salient objects in a scene graph. This makes it possible to smoothly navigate and interactively render, for example, surface structures on Mars and the Milky Way simultaneously. Our work is based on an analysis of tracking and quantification of the propagation of precision errors through the computer graphics pipeline using interval arithmetic. Furthermore, we identify sources of precision degradation, leading to incorrect object positions in screen-space and z-fighting. Our proposed method operates without near and far planes while maintaining high depth precision through the use of floating point depth buffers. By providing interoperability with order-independent transparency algorithms, direct volume rendering, and stereoscopy, our approach is well suited for scientific visualization. We provide the mathematical background, a thorough description of the method, and a reference implementation.

    Place, publisher, year, edition, pages
    WILEY, 2017
    National subject category
    Computer Vision and Robotics (Autonomous Systems)
    Identifiers
    urn:nbn:se:liu:diva-139628 (URN) 10.1111/cgf.13202 (DOI) 000404881200042 ()
    Conference
    19th Eurographics/IEEE VGTC Conference on Visualization (EuroVis)
    Note

    Funding agencies: Swedish e-Science Research Center (SeRC); NASA [NNX16AB93A]; Moore-Sloan Data Science Environment at NYU; NSF [CNS-1229185, CCF-1533564, CNS-1544753]

    Available from: 2017-08-16 Created: 2017-08-16 Last updated: 2018-05-21
    8. Globe Browsing: Contextualized Spatio-Temporal Planetary Surface Visualization
    2018 (English). In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 24, no. 1, pp. 802-811. Article in journal (Refereed), Published
    Abstract [en]

    Results of planetary mapping are often shared openly for use in scientific research and mission planning. In its raw format, however, the data is not accessible to non-experts due to the difficulty in grasping the context and the intricate acquisition process. We present work on tailoring and integration of multiple data processing and visualization methods to interactively contextualize geospatial surface data of celestial bodies for use in science communication. As our approach handles dynamic data sources, streamed from online repositories, we are significantly shortening the time between discovery and dissemination of data and results. We describe the image acquisition pipeline, the pre-processing steps to derive a 2.5D terrain, and a chunked level-of-detail, out-of-core rendering approach to enable interactive exploration of global maps and high-resolution digital terrain models. The results are demonstrated for three different celestial bodies. The first case addresses high-resolution map data on the surface of Mars. A second case shows dynamic processes, such as concurrent weather conditions on Earth, that require temporal datasets. As a final example we use data from the New Horizons spacecraft which acquired images during a single flyby of Pluto. We visualize the acquisition process as well as the resulting surface data. Our work has been implemented in the OpenSpace software [8], which enables interactive presentations in a range of environments such as immersive dome theaters, interactive touch tables, and virtual reality headsets.

    Place, publisher, year, edition, pages
    Institute of Electrical and Electronics Engineers (IEEE), 2018
    Keywords
    Astronomical visualization; globe rendering; public dissemination; science communication; space mission visualization
    National subject category
    Other Computer and Information Science
    Identifiers
    urn:nbn:se:liu:diva-144142 (URN) 10.1109/TVCG.2017.2743958 (DOI) 000418038400079 () 28866505 (PubMedID) 2-s2.0-85028711409 (Scopus ID)
    Conference
    IEEE VIS Conference
    Note

    Funding agencies: Knut and Alice Wallenberg Foundation; Swedish e-Science Research Center (SeRC); ELLIIT; Vetenskapsrådet [VR-2015-05462]; NASA [NNX16AB93A]; Moore-Sloan Data Science Environment at New York University; NSF [CNS-1229185, CCF-1533564, CNS-1544753, CNS-1730396]

    Available from: 2018-01-10 Created: 2018-01-10 Last updated: 2018-05-21 Bibliographically approved
    Download full text (pdf)
    Tailoring visualization applications for tasks and users
    Download (pdf)
    cover
    Download (png)
    presentation image
  • 16.
    Bock, Alexander
    et al.
    New York University, USA.
    Doraiswamy, Harish
    New York University, USA.
    Silva, Claudio
    New York University, USA.
    Summers, Adam
    University of Washington, USA.
    TopoAngler: Interactive Topology-Based Extraction of Fishes, 2018. In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 24, no. 1, pp. 812-821. Article in journal (Refereed)
    Abstract [en]

    We present TopoAngler, a visualization framework that enables an interactive user-guided segmentation of fishes contained in a micro-CT scan. The inherent noise in the CT scan coupled with the often disconnected (and sometimes broken) skeletal structure of fishes makes an automatic segmentation of the volume impractical. To overcome this, our framework combines techniques from computational topology with an interactive visual interface, enabling the human-in-the-loop to effectively extract fishes from the volume. In the first step, the join tree of the input is used to create a hierarchical segmentation of the volume. Through the use of linked views, the visual interface then allows users to interactively explore this hierarchy, and gather parts of individual fishes into a coherent sub-volume, thus reconstructing entire fishes. Our framework was primarily developed for its application to CT scans of fishes, generated as part of the ScanAllFish project, through close collaboration with their lead scientist. However, we expect it to also be applicable in other biological applications where a single dataset contains multiple specimens, a common routine that is now widely followed in laboratories to increase throughput of expensive CT scanners.

  • 17.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap. Linköpings universitet, Tekniska fakulteten.
    Svensson, Åsa
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Kleiner, Alexander
    iRobot, CA USA.
    Lundberg, Jonas
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Ulm University, Germany.
    A Visualization-Based Analysis System for Urban Search & Rescue Mission Planning Support, 2017. In: Computer Graphics Forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 36, no. 6, pp. 148-159. Article in journal (Refereed)
    Abstract [en]

    We propose a visualization system for incident commanders (ICs) in urban search and rescue scenarios that supports path planning in post-disaster structures. Utilizing point cloud data acquired from unmanned robots, we provide methods for the assessment of automatically generated paths. As data uncertainty and a priori unknown information make fully automated systems impractical, we present the IC with a set of viable access paths, based on varying risk factors, in a 3D environment combined with visual analysis tools that enable informed decision making and trade-offs. Based on these decisions, a responder is guided along the path by the IC, who can interactively annotate and reevaluate the acquired point cloud and generated paths to react to the dynamics of the situation. We describe visualization design considerations for our system and decision support systems in general, technical realizations of the visualization components, and discuss the results of two qualitative expert evaluations: an online study with nine search and rescue experts and an eye-tracking study in which four experts used the system on an application case.
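    One way to make the assessment of paths under varying risk factors tangible is to combine per-segment hazard scores into a single length-weighted cost per candidate path, as in the sketch below. The factor names, weights, and data are invented for illustration and do not reproduce the paper's risk model.

        # Illustrative ranking of candidate access paths by a length-weighted sum of
        # per-segment risk factors; factor names, weights, and data are assumptions.
        def path_cost(segments, weights):
            total = 0.0
            for seg in segments:
                risk = sum(weights[f] * seg[f] for f in weights)   # combined hazard in [0, 1]
                total += seg["length_m"] * risk
            return total

        weights = {"collapse": 0.5, "fire": 0.3, "visibility": 0.2}    # assumed factors
        candidate_paths = {
            "path_A": [{"length_m": 12.0, "collapse": 0.2, "fire": 0.1, "visibility": 0.4},
                       {"length_m": 8.0,  "collapse": 0.6, "fire": 0.0, "visibility": 0.2}],
            "path_B": [{"length_m": 25.0, "collapse": 0.1, "fire": 0.1, "visibility": 0.1}],
        }
        ranked = sorted(candidate_paths, key=lambda p: path_cost(candidate_paths[p], weights))
        print(ranked)    # candidate paths ordered from lowest to highest combined risk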

  • 18.
    Axelsson, Emil
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Costa, Jonathas
    NYU, NY 10003 USA.
    Silva, Claudio
    NYU, NY 10003 USA.
    Emmart, Carter
    Amer Museum Nat Hist, NY 10024 USA.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap. Linköpings universitet, Tekniska fakulteten.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    Dynamic Scene Graph: Enabling Scaling, Positioning, and Navigation in the Universe, 2017. In: Computer Graphics Forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 36, no. 3, pp. 459-468. Article in journal (Refereed)
    Abstract [en]

    In this work, we address the challenge of seamlessly visualizing astronomical data exhibiting huge scale differences in distance, size, and resolution. One of the difficulties is accurate, fast, and dynamic positioning and navigation to enable scaling over orders of magnitude, far beyond the precision of floating point arithmetic. To this end we propose a method that utilizes a dynamically assigned frame of reference to provide the highest possible numerical precision for all salient objects in a scene graph. This makes it possible to smoothly navigate and interactively render, for example, surface structures on Mars and the Milky Way simultaneously. Our work is based on an analysis of tracking and quantification of the propagation of precision errors through the computer graphics pipeline using interval arithmetic. Furthermore, we identify sources of precision degradation, leading to incorrect object positions in screen-space and z-fighting. Our proposed method operates without near and far planes while maintaining high depth precision through the use of floating point depth buffers. By providing interoperability with order-independent transparency algorithms, direct volume rendering, and stereoscopy, our approach is well suited for scientific visualization. We provide the mathematical background, a thorough description of the method, and a reference implementation.
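    The underlying idea of a dynamically assigned frame of reference can be sketched as follows: world positions are accumulated in double precision, and everything handed to the renderer is expressed relative to the scene-graph node closest to the camera, so that single-precision offsets stay small where precision matters. The Node class and function names below are illustrative assumptions, not the paper's implementation.

        # Sketch: express render positions relative to the scene-graph node nearest
        # to the camera; class and function names are illustrative assumptions.
        import numpy as np

        class Node:
            def __init__(self, name, parent=None, offset=(0.0, 0.0, 0.0)):
                self.name, self.parent = name, parent
                self.offset = np.asarray(offset, dtype=np.float64)   # relative to parent

            def world_position(self):
                pos = self.offset.copy()
                node = self.parent
                while node is not None:              # accumulate in double precision
                    pos = pos + node.offset
                    node = node.parent
                return pos

        def render_positions(nodes, camera_world_pos):
            # Pick the node closest to the camera as the dynamic origin.
            origin = min(nodes, key=lambda n: np.linalg.norm(n.world_position() - camera_world_pos))
            origin_pos = origin.world_position()
            # Subtract in double precision, then hand small float32 offsets to the GPU.
            return {n.name: (n.world_position() - origin_pos).astype(np.float32) for n in nodes}

        sun   = Node("Sun")
        earth = Node("Earth", sun, (1.496e11, 0.0, 0.0))
        moon  = Node("Moon", earth, (3.84e8, 0.0, 0.0))
        camera = earth.world_position() + np.array([7.0e6, 0.0, 0.0])   # low Earth orbit
        print(render_positions([sun, earth, moon], camera)["Moon"])     # small, precise values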

  • 19.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. New York University, USA.
    Axelsson, Emil
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bladin, Karl
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Costa, Jonathas
    New York University, USA.
    Payne, Gene
    University of Utah, USA.
    Territo, Matthew
    University of Utah, USA.
    Kilby, Joakim
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Kuznetsova, Masha
    Community Coordinated Modeling Center, NASA, USA.
    Emmart, Carter
    American Museum of Natural History, USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    OpenSpace: An open-source astrovisualization framework, 2017. In: Journal of Open Source Software, E-ISSN 2475-9066, Vol. 2, no. 15, article id 281. Article in journal (Refereed)
    Abstract [en]

    OpenSpace (2017; Bock et al. 2017) is open-source interactive data visualization software designed to visualize the entire known universe and portray our ongoing efforts to investigate the cosmos (Bladin et al. 2017; Bock, Pembroke, et al. 2015). Bringing the latest techniques from data visualization research to the general public and scientists (Bock, Marcinkowski, et al. 2015), OpenSpace supports interactive presentation of dynamic data from observations, simulations, and space mission planning and operations over a large span of sizes (Axelsson et al. 2017). The software supports multiple operating systems with an extensible architecture powering high-resolution tiled displays, planetarium domes, as well as desktop computers. In addition, OpenSpace enables simultaneous connections across the globe, creating opportunities for shared experiences among audiences worldwide.

  • 20.
    Lindholm, Stefan
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Falk, Martin
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Sundén, Erik
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV. Linköpings universitet, Tekniska högskolan.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Hybrid Data Visualization Based On Depth Complexity Histogram Analysis, 2015. In: Computer Graphics Forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 34, no. 1, pp. 74-85. Article in journal (Refereed)
    Abstract [en]

    When visualizing large and complex data, a single phenomenon under observation can in many cases only be described by a combination of geometric and volumetric data sets. When semi-transparent geometry is present, correct rendering results require sorting of transparent structures. Additional complexity is introduced as the contributions from volumetric data have to be partitioned according to the geometric objects in the scene. The A-buffer, an enhanced framebuffer with additional per-pixel information, has previously been introduced to deal with the complexity caused by transparent objects. In this paper, we present an optimized rendering algorithm for hybrid volume-geometry data based on the A-buffer concept. We propose two novel components for modern GPUs that tailor memory utilization to the depth complexity of individual pixels. The proposed components are compatible with modern A-buffer implementations and yield performance gains of up to eight times compared to existing approaches through reduced allocation and reuse of fast cache memory. We demonstrate the applicability of our approach and its performance with several examples from molecular biology, space weather, and medical visualization containing both volumetric data and geometric structures.
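    For readers unfamiliar with the A-buffer concept that the paper builds on, the CPU-side sketch below mimics its essence: each pixel keeps an unordered list of transparent fragments that is depth-sorted and composited at resolve time. This is a didactic illustration only; the paper's GPU components and memory-tailoring scheme are not reproduced here.

        # Didactic CPU sketch of an A-buffer: per-pixel fragment lists that are
        # depth-sorted and composited front to back at resolve time.
        from collections import defaultdict

        fragments = defaultdict(list)                  # (x, y) -> [(depth, rgba), ...]

        def store_fragment(x, y, depth, rgba):
            fragments[(x, y)].append((depth, rgba))    # unordered insertion, as on the GPU

        def resolve_pixel(x, y, background=(0.0, 0.0, 0.0)):
            color, alpha_acc = [0.0, 0.0, 0.0], 0.0
            for depth, (r, g, b, a) in sorted(fragments[(x, y)]):     # sort front to back
                weight = (1.0 - alpha_acc) * a
                color = [c + weight * s for c, s in zip(color, (r, g, b))]
                alpha_acc += weight
            return [c + (1.0 - alpha_acc) * bg for c, bg in zip(color, background)]

        store_fragment(0, 0, 0.7, (0.0, 0.0, 1.0, 0.5))   # blue fragment, farther away
        store_fragment(0, 0, 0.3, (1.0, 0.0, 0.0, 0.5))   # red fragment, closer
        print(resolve_pixel(0, 0))                         # red dominates after depth sorting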

  • 21.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Pembroke, Asher
    NASA Goddard Space Flight Center, Greenbelt, MD, United States.
    Mays, M. Leila
    Catholic University of America, Washington, DC, United States.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    OpenSpace: An Open-Source Framework for Data Visualization and Contextualization, 2015. Conference paper (Refereed)
    Abstract [en]

    We present an open-source software development effort called OpenSpace that is tailored for the dissemination of space-related data visualization. In the current stage of the project, we have focused on the public dissemination of space missions (Rosetta and New Horizons) as well as the support of space weather forecasting. The presented work focuses on the latter and elaborates on the efforts that have gone into developing a system that allows the user to assess the accuracy and validity of ENLIL ensemble simulations. It becomes possible to compare the results of ENLIL CME simulations with STEREO and SOHO images using an optical flow algorithm. This allows the user to compare velocities in the volumetric rendering of ENLIL data with the movement of CMEs through the fields of view of various instruments onboard the spacecraft. By giving the user access to these comparisons, new information about the time evolution of CMEs through the interplanetary medium becomes available. Additionally, contextualizing this information in a three-dimensional rendering scene allows this data to be disseminated to both analysts and the public. This dissemination is further improved by the ability to connect multiple instances of the software and, thus, reach a broader audience. In a second step, we plan to combine the two foci of the project to enable the visualization of the SWAP instrument onboard New Horizons in context with a far-reaching ENLIL simulation, thus providing additional information about the solar wind dynamics of the outer solar system. The initial work regarding this plan will be presented.
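    A heavily simplified stand-in for the image-based velocity comparison is sketched below: radial front positions of a CME, assumed to have been extracted from consecutive coronagraph frames (by optical flow in the paper), are fitted to obtain a plane-of-sky speed that can be held against the simulated speed. All numbers are invented placeholders.

        # Simplified comparison of an image-derived CME front speed with a simulated
        # speed; front positions, times, and the simulated value are placeholders.
        import numpy as np

        times_s    = np.array([0.0, 900.0, 1800.0, 2700.0])   # coronagraph frame times
        front_rsun = np.array([3.0, 4.1, 5.3, 6.4])            # tracked front distance (solar radii)

        RSUN_KM = 6.957e5
        observed_speed_kms = np.polyfit(times_s, front_rsun * RSUN_KM, 1)[0]   # linear-fit slope

        simulated_speed_kms = 880.0                             # value read from the model run
        print(f"observed ~{observed_speed_kms:.0f} km/s vs simulated {simulated_speed_kms:.0f} km/s,"
              f" difference {abs(observed_speed_kms - simulated_speed_kms):.0f} km/s")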

  • 22.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Marcinkowski, Michal
    Linköpings universitet, Institutionen för teknik och naturvetenskap. Linköpings universitet, Tekniska fakulteten. American Museum of Natural History, New York, USA.
    Kilby, Joakim
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Emmart, Carter
    American Museum of Natural History, New York, USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    OpenSpace: Public Dissemination of Space Mission Profiles, 2015. In: 2015 IEEE Scientific Visualization Conference (SciVis): Proceedings / [ed] James Ahrens; Huamin Qu; Jos Roerdink, Institute of Electrical and Electronics Engineers (IEEE), 2015, pp. 141-142. Conference paper (Refereed)
    Abstract [en]

    This work presents a visualization system and its application to space missions. The system allows the scientific findings of spacecraft to be disseminated to the public, fostering a greater understanding thereof. Instruments' fields of view and their measurements are embedded in an accurate three-dimensional rendering of the solar system to provide context to past measurements or the planning of future events. We tested our system with NASA's New Horizons at the PlutoPalooza event in New York and will expose it to the greater public during the upcoming July 14th Pluto flyby.

  • 23.
    Dieckmann, Mark Eric
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Ahmed, Hamad
    Centre for Plasma Physics (CPP), Queen's University Belfast, BT7 1NN, Belfast, UK.
    Doria, Domenico
    Centre for Plasma Physics (CPP), Queen's University Belfast, BT7 1NN, Belfast, UK.
    Sarri, Gianluca
    Centre for Plasma Physics (CPP), Queen's University Belfast, BT7 1NN, Belfast, UK.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Borghesi, Marco
    Centre for Plasma Physics (CPP), Queen's University Belfast, BT7 1NN, Belfast, UK.
    Shocks in unmagnetized plasma with a shear flow: Stability and magnetic field generation, 2015. In: Physics of Plasmas, ISSN 1070-664X, E-ISSN 1089-7674, Vol. 22, no. 7, pp. 1-9, article id 072104. Article in journal (Refereed)
    Abstract [en]

    A pair of curved shocks in a collisionless plasma is examined with a two-dimensional particle-in-cell simulation. The shocks are created by the collision of two electron-ion clouds at a speed that everywhere exceeds the threshold speed for shock formation. A variation of the collision speed along the initially planar collision boundary, which is comparable to the ion acoustic speed, yields a curvature of the shock that increases with time. The spatially varying Mach number of the shocks results in a variation of the downstream density in the direction along the shock boundary. This variation is eventually equilibrated by the thermal diffusion of ions. The pair of shocks is stable for tens of inverse ion plasma frequencies. The angle between the mean flow velocity vector of the inflowing upstream plasma and the shock's electrostatic field increases steadily during this time. The misalignment of the two vectors gives rise to a rotational electron flow, which yields the growth of magnetic field patches that are coherent over tens of electron skin depths.
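    For reference, the ion acoustic speed mentioned above and the resulting shock Mach number are commonly written as below (standard two-fluid form; the paper's exact definitions and normalization may differ):

        % Ion acoustic speed (standard two-fluid form) and shock Mach number;
        % gamma_e, gamma_i are adiabatic indices, Z the charge state, T_e, T_i the
        % electron and ion temperatures, m_i the ion mass.
        c_s = \sqrt{\frac{\gamma_e Z k_B T_e + \gamma_i k_B T_i}{m_i}},
        \qquad
        M = \frac{v_\mathrm{shock}}{c_s}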

  • 24.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Pembroke, Asher
    NASA Goddard Space Flight Center, USA.
    Mays, M. Leila
    NASA Goddard Space Flight Center, USA.
    Rastaetter, Lutz
    NASA Goddard Space Flight Center, USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    Ropinski, Timo
    Ulm University, Germany.
    Visual Verification of Space Weather Ensemble Simulations, 2015. In: 2015 IEEE Scientific Visualization Conference (SciVis), IEEE, 2015, pp. 17-24. Conference paper (Refereed)
    Abstract [en]

    We propose a system to analyze and contextualize simulations of coronal mass ejections. As current simulation techniques require manual input, uncertainty is introduced into the simulation pipeline, leading to inaccurate predictions that can be mitigated through ensemble simulations. We provide the space weather analyst with a multi-view system offering visualizations to: (1) compare ensemble members against ground-truth measurements, (2) inspect time-dependent information derived from optical flow analysis of satellite images, and (3) combine satellite images with a volumetric rendering of the simulations. This three-tier workflow provides experts with tools to discover correlations between errors in predictions and simulation parameters, thus increasing knowledge about the evolution and propagation of coronal mass ejections that pose a danger to Earth and to interplanetary travel.
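    A concrete instance of comparing ensemble members against ground truth is to score each member by the error of its predicted arrival time at Earth, as in the sketch below; member names and timestamps are invented placeholders.

        # Score ensemble members by the error of their predicted CME arrival time
        # against the observed arrival; member names and timestamps are placeholders.
        from datetime import datetime

        observed_arrival = datetime(2014, 1, 9, 19, 30)        # hypothetical ground truth

        predicted_arrival = {                                   # hypothetical ensemble members
            "member_01": datetime(2014, 1, 9, 14, 10),
            "member_02": datetime(2014, 1, 9, 21, 5),
            "member_03": datetime(2014, 1, 10, 3, 40),
        }

        errors_h = {m: abs((t - observed_arrival).total_seconds()) / 3600.0
                    for m, t in predicted_arrival.items()}
        for member, err in sorted(errors_h.items(), key=lambda kv: kv[1]):
            print(f"{member}: arrival-time error {err:.1f} h")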

  • 25.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Kleiner, A.
    iRobot, Pasadena, CA, United States.
    Lundberg, Jonas
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    An interactive visualization system for urban search & rescue mission planning, 2014. In: 12th IEEE International Symposium on Safety, Security and Rescue Robotics, SSRR 2014 - Symposium Proceedings, Institute of Electrical and Electronics Engineers Inc., 2014, no. 7017652. Conference paper (Refereed)
    Abstract [en]

    We present a visualization system for incident commanders in urban search and rescue scenarios that supports inspection and access path planning in post-disaster structures. Utilizing point cloud data acquired from unmanned robots, the system allows for the assessment of automatically generated paths, whose computation is based on varying risk factors, in an immersive, interactive 3D environment. The incident commander interactively annotates and reevaluates the acquired point cloud based on live feedback. We describe design considerations, the technical realization, and discuss the results of an expert evaluation that we conducted to assess our system.

  • 26.
    Sundén, Erik
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Jönsson, Daniel
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Interaction Techniques as a Communication Channel when Presenting 3D Visualizations, 2014. Conference paper (Refereed)
    Abstract [en]

    In this position paper, we discuss the usage of various interaction technologies with a focus on presentations of 3D visualizations involving a presenter and an audience. While an interaction technique is commonly evaluated from a user perspective, we want to shift the focus from a sole analysis of the naturalness and ease of use for the user to how expressive and understandable the interaction technique is when witnessed by the audience. The interaction process itself can be considered a communication channel, and a more expressive interaction technique might make it easier for the audience to comprehend the presentation. Thus, while some natural interaction techniques for interactive visualization are easy to perform by the presenter, they may be less beneficial when interacting with the visualization in front of (and for) an audience. Our observations indicate that the suitability of an interaction technique as a communication channel is highly dependent on the setting in which the interaction takes place. Therefore, we analyze different presentation scenarios in an exemplary fashion and discuss how beneficial and comprehensible the involved techniques are for the audience. We argue that interaction techniques complement the visualization in an interactive presentation scenario as they also serve as an important communication channel, and should therefore also be observed from an audience perspective rather than exclusively a user perspective.

  • 27.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Kleiner, Alexander
    Linköpings universitet, Institutionen för datavetenskap, Artificiell intelligens och integrerade datorsystem. Linköpings universitet, Tekniska högskolan.
    Lundberg, Jonas
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Supporting Urban Search & Rescue Mission Planning through Visualization-Based Analysis, 2014. In: Proceedings of the Vision, Modeling, and Visualization Conference 2014, Eurographics - European Association for Computer Graphics, 2014. Conference paper (Refereed)
    Abstract [en]

    We propose a visualization system for incident commanders in urban search & rescue scenarios that supports access path planning for post-disaster structures. Utilizing point cloud data acquired from unmanned robots, we provide methods for the assessment of automatically generated paths. As data uncertainty and a priori unknown information make fully automated systems impractical, we present a set of viable access paths, based on varying risk factors, in a 3D environment combined with visual analysis tools enabling informed decisions and trade-offs. Based on these decisions, a responder is guided along the path by the incident commander, who can interactively annotate and reevaluate the acquired point cloud to react to the dynamics of the situation. We describe design considerations for our system, technical realizations, and discuss the results of an expert evaluation.

  • 28.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Mays, M. Leila
    NASA Goddard Space Flight Center, Greenbelt, MD, USA.
    Rastaetter, Lutz
    NASA Goddard Space Flight Center, Greenbelt, MD, USA.
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten. Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    VCMass: A Framework for Verification of Coronal Mass Ejection Ensemble Simulations, 2014. Conference paper (Refereed)
    Abstract [en]

    Supporting the growing field of space weather forecasting, we propose a framework to analyze ensemble simulations of coronal mass ejections. As the current simulation technique requires manual input, uncertainty is introduced into the simulation pipeline, which leads to inaccurate predictions. Using our system, the analyst can compare ensemble members against ground truth data (arrival time and geo-effectivity) as well as against information derived from satellite imagery. The simulations can be compared on a global basis, based on time-resolved quality measures, and as a 3D volumetric rendering with embedded satellite imagery in a multi-view setup. This flexible framework provides the expert with tools to increase knowledge about the as yet not fully understood principles behind the formation of coronal mass ejections.

  • 29.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Lang, Norbert
    St. Barbara Hospital, Hamm, Germany.
    Evangelista, Gianpaolo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Lehrke, Ralph
    St. Barbara Hospital, Hamm, Germany.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Guiding Deep Brain Stimulation Interventions by Fusing Multimodal Uncertainty Regions, 2013. Conference paper (Other academic)
    Abstract [en]

    Deep Brain Stimulation (DBS) is a surgical intervention that is known to reduce or eliminate the symptoms of common movement disorders, such as Parkinson's disease, dystonia, or tremor. During the intervention the surgeon places electrodes inside of the patient's brain to stimulate specific regions. Since these regions span only a couple of millimeters, and electrode misplacement has severe consequences, reliable and accurate navigation is of great importance. Usually the surgeon relies on fused CT and MRI data sets, as well as direct feedback from the patient. More recently Microelectrode Recordings (MER), which support navigation by measuring the electric field of the patient's brain, are also used. We propose a visualization system that fuses the different modalities: imaging data, MER and patient checks, as well as the related uncertainties, in an intuitive way to present placement-related information in a consistent view with the goal of supporting the surgeon in the final placement of the stimulating electrode. We will describe the design considerations for our system, the technical realization, present the outcome of the proposed system, and provide an evaluation.

  • 30.
    Lindholm, Stefan
    et al.
    Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV. Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Poor Man’s Rendering Of Segmented Data, 2013. In: Proceedings of SIGRAD 2013; Visual Computing, June 13-14, 2013, Norrköping, Sweden / [ed] Timo Ropinski and Jonas Unger, Linköping: Linköping University Electronic Press, 2013, pp. 49-54. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a set of techniques for fast and efficient rendering of segmented data. Our approach utilizes the expected difference between two co-located texture lookups of a label volume, taken with different interpolation filters, as a feature boundary indicator. This allows us to achieve smooth class boundaries without needing to explicitly sample all eight neighbors in the label volume as is the case with previous methods. We also present a data encoding scheme that greatly simplifies transfer function construction.
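    The central trick, comparing a nearest-neighbour lookup of the label volume with a linearly interpolated lookup at the same position and treating any disagreement as a class-boundary indicator, can be illustrated in 1D as follows; the label values are placeholders.

        # 1D illustration: where nearest-neighbour and linear interpolation of a
        # label field disagree, a class boundary passes between the two samples.
        import numpy as np

        labels = np.array([0, 0, 0, 2, 2, 2], dtype=float)      # voxel labels along one axis

        def lookup_nearest(x):
            return labels[int(round(x))]

        def lookup_linear(x):
            i = int(np.floor(x))
            t = x - i
            return (1.0 - t) * labels[i] + t * labels[min(i + 1, len(labels) - 1)]

        for x in np.linspace(1.0, 4.0, 7):
            near, lin = lookup_nearest(x), lookup_linear(x)
            boundary = abs(near - lin) > 1e-6                    # lookups differ near a boundary
            print(f"x={x:.1f}  nearest={near:.1f}  linear={lin:.2f}  boundary={boundary}")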

  • 31.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Sundén, Erik
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Liu, Bingchen
    University of Auckland, New Zealand.
    Wuensche, Burkhard
    University of Auckland, New Zealand.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Coherency-Based Curve Compression for High-Order Finite Element Model Visualization, 2012. In: IEEE Transactions on Visualization and Computer Graphics, ISSN 1077-2626, E-ISSN 1941-0506, Vol. 18, no. 12, pp. 2315-2324. Article in journal (Refereed)
    Abstract [en]

    Finite element (FE) models are frequently used in engineering and life sciences within time-consuming simulations. In contrast with the regular grid structure facilitated by volumetric data sets, as used in medicine or geosciences, FE models are defined over a non-uniform grid. Elements can have curved faces and their interior can be defined through high-order basis functions, which pose additional challenges when visualizing these models. During ray-casting, the uniformly distributed sample points along each viewing ray must be transformed into the material space defined within each element. The computational complexity of this transformation makes a straightforward approach inadequate for interactive data exploration. In this paper, we introduce a novel coherency-based method which supports the interactive exploration of FE models by decoupling the expensive world-to-material space transformation from the rendering stage, thereby allowing it to be performed within a precomputation stage. Therefore, our approach computes view-independent proxy rays in material space, which are clustered to facilitate data reduction. During rendering, these proxy rays are accessed, and it becomes possible to visually analyze high-order FE models at interactive frame rates, even when they are time-varying or consist of multiple modalities. Within this paper, we provide the necessary background about the FE data, describe our decoupling method, and introduce our interactive rendering algorithm. Furthermore, we provide visual results and analyze the error introduced by the presented approach.
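    A highly simplified view of the precomputation stage, in which view-independent proxy rays in material space are clustered to reduce storage, might look like the sketch below. Representing each proxy ray as a flattened sequence of material-space samples and clustering with k-means are assumptions made for illustration.

        # Sketch: cluster precomputed material-space proxy rays (fixed-length sample
        # sequences) with a small k-means to reduce storage; all data is synthetic.
        import numpy as np

        rng = np.random.default_rng(1)
        n_rays, samples_per_ray = 500, 16
        # Each proxy ray = samples_per_ray material-space coordinates in [0, 1]^3,
        # flattened into one feature vector (placeholder data).
        rays = rng.random((n_rays, samples_per_ray * 3))

        def kmeans(data, k, iters=20):
            centers = data[rng.choice(len(data), k, replace=False)]
            for _ in range(iters):
                assign = np.argmin(((data[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
                for c in range(k):
                    members = data[assign == c]
                    if len(members):
                        centers[c] = members.mean(axis=0)
            return centers, assign

        centers, assign = kmeans(rays, k=32)
        # At render time a ray reuses its cluster representative instead of storing
        # its own samples, trading a bounded error for a large memory reduction.
        print("stored sample vectors reduced from", n_rays, "to", len(centers))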

  • 32.
    Nguyen, Khoa
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Ynnerman, Anders
    Linköpings universitet, Centrum för medicinsk bildvetenskap och visualisering, CMIV. Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Deriving and Visualizing Uncertainty in Kinetic PET Modeling, 2012. In: Eurographics Workshop on Visual Computing for Biology and Medicine, 2012 / [ed] Timo Ropinski and Anders Ynnerman and Charl Botha and Jos Roerdink, The Eurographics Association, 2012, pp. 107-114. Conference paper (Refereed)
    Abstract [en]

    Kinetic modeling is the tool of choice when developing new positron emission tomography (PET) tracers for quantitative functional analysis. Several approaches are widely used to facilitate this process. While all these approaches are inherently different, they are still subject to uncertainty arising from various stages of the modeling process. In this paper we propose a novel approach for deriving and visualizing uncertainty in kinetic PET modeling. We distinguish between intra- and inter-model uncertainties. While intra-model uncertainty allows us to derive uncertainty based on a single modeling approach, inter-model uncertainty arises from the differences of the results of different approaches. To derive intra-model uncertainty we exploit the covariance matrix analysis. The inter-model uncertainty is derived by comparing the outcome of three standard kinetic PET modeling approaches. We derive and visualize this uncertainty to exploit it as a basis for changing model input parameters with the ultimate goal to reduce the modeling uncertainty and thus obtain a more realistic model of the tracer under investigation. To support this uncertainty reduction process, we visually link abstract and spatial data by introducing a novel visualization approach based on the ThemeRiver metaphor, which has been modified to support the uncertainty-aware visualization of parameter changes between spatial locations. We have investigated the benefits of the presented concepts by conducting an evaluation with domain experts.
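    As a concrete example of deriving intra-model uncertainty from a covariance matrix, the sketch below fits a simple uptake model to a synthetic time-activity curve and reads parameter standard deviations off the diagonal of the estimated covariance; the model, data, and parameter names are simplified placeholders rather than the kinetic models used in the paper.

        # Intra-model uncertainty from the covariance of a least-squares fit to a
        # time-activity curve; the single-exponential uptake model is a placeholder.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, k1, k2):
            return k1 * (1.0 - np.exp(-k2 * t))       # simplified uptake model

        t = np.linspace(0.0, 60.0, 30)                 # minutes
        rng = np.random.default_rng(7)
        measured = model(t, 0.8, 0.15) + rng.normal(0.0, 0.02, t.size)

        popt, pcov = curve_fit(model, t, measured, p0=(1.0, 0.1))
        sigma = np.sqrt(np.diag(pcov))                 # 1-sigma parameter uncertainties
        for name, value, err in zip(("k1", "k2"), popt, sigma):
            print(f"{name} = {value:.3f} +/- {err:.3f}")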

  • 33.
    Liu, Bingchen
    et al.
    University of Auckland, New Zealand.
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Nash, Martyn
    University of Auckland, New Zealand.
    Nielsen, Poul
    University of Auckland, New Zealand.
    Wünsche, Burkhard
    University of Auckland, New Zealand.
    GPU-Accelerated Direct Volume Rendering of Finite Element Data Sets, 2012. Conference paper (Other academic)
    Abstract [en]

    Direct Volume Rendering of Finite Element models is challenging since the visualisation process is performed in world coordinates, whereas data fields are usually defined over the elements' material coordinate system. In this paper we present a framework for Direct Volume Rendering of Finite Element models. We present several novel implementations visualising Finite Element data directly without requiring resampling into world coordinates. We evaluate the methods using several biomedical Finite Element models. Our GPU implementation of ray-casting in material coordinates using depth peeling is several orders of magnitude faster than the corresponding CPU approach, and our new ray interpolation approach achieves near-interactive frame rates for high-order finite element models at high resolutions.
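    The world-to-material transformation whose cost motivates this work is, per element, the inversion of the element mapping x(ξ); a textbook Newton iteration for a bilinear quadrilateral is sketched below purely to illustrate what has to happen at every ray sample. The element, shape functions, and tolerances are illustrative assumptions, not the paper's implementation.

        # Newton inversion of a bilinear quad's mapping x(xi) to recover the material
        # coordinate of a world-space sample point; textbook version for illustration.
        import numpy as np

        corners = np.array([[0.0, 0.0], [2.0, 0.2], [2.2, 1.8], [0.1, 2.0]])   # element nodes

        def shape_functions(xi, eta):
            return np.array([(1 - xi) * (1 - eta), xi * (1 - eta), xi * eta, (1 - xi) * eta])

        def mapping(xi, eta):                                   # material -> world
            return shape_functions(xi, eta) @ corners

        def jacobian(xi, eta):
            dN_dxi  = np.array([-(1 - eta), (1 - eta), eta, -eta])
            dN_deta = np.array([-(1 - xi), -xi, xi, (1 - xi)])
            return np.stack([dN_dxi @ corners, dN_deta @ corners], axis=0).T   # 2x2

        def world_to_material(x_world, tol=1e-10, max_iter=20):
            xi = np.array([0.5, 0.5])                           # start at the element center
            for _ in range(max_iter):
                residual = mapping(*xi) - x_world
                if np.linalg.norm(residual) < tol:
                    break
                xi = xi - np.linalg.solve(jacobian(*xi), residual)   # Newton step
            return xi

        print(world_to_material(np.array([1.0, 1.0])))          # material coordinates in [0, 1]^2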

  • 34.
    Törnros, Martin
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Berrios, David
    Bock, Alexander
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Emmart, Carter
    Harberts, Robert
    Ynnerman, Anders
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska högskolan.
    Interactive Visualization of Space Weather Data, 2012. Conference paper (Other academic)
  • 35.
    Bock, Alexander
    et al.
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Lang, Norbert
    St. Barbara Hospital, Hamm, Germany.
    Evangelista, Gianpaolo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Lehrke, Ralph
    St. Barbara Hospital, Hamm, Germany.
    Ropinski, Timo
    Linköpings universitet, Institutionen för teknik och naturvetenskap, Medie- och Informationsteknik. Linköpings universitet, Tekniska fakulteten.
    Supporting Deep Brain Stimulation Interventions by Fusing Microelectrode Recordings with Imaging Data, 2012. Conference paper (Refereed)