A Crowdsourcing System for Integrated and Reproducible Evaluation in Scientific Visualization
Englund, Rickard
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. (Scientific Visualization Group) ORCID iD: 0000-0002-5693-2830
Kottravel, Sathish
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering. (Scientific Visualization Group)
Visual Computing Research Group, Ulm University.
2016 (English). In: 2016 IEEE Pacific Visualization Symposium (PacificVis), IEEE Computer Society, 2016, pp. 40-47. Conference paper, Published paper (Refereed).
Abstract [en]

User evaluations have become increasingly important in visualization research in recent years, as in many cases they are the only way to support the claims made by visualization researchers. Unfortunately, recent literature reviews show that, in comparison to algorithmic performance evaluations, the number of user evaluations is still very low. Reasons for this include the time required to conduct such studies and the difficulties involved in participant recruitment and result reporting. While it has been shown that the quality of evaluation results and the simplified participant recruitment make crowdsourcing platforms a viable alternative to lab experiments for evaluating visualizations, the time needed to conduct and report such evaluations is still very high. In this paper, we propose a software system that integrates the conduct, analysis, and reporting of crowdsourced user evaluations directly into the scientific visualization development process. With the proposed system, researchers can conduct and analyze quantitative evaluations on a large scale through an evaluation-centric user interface with only a few mouse clicks. It thus becomes possible to perform iterative evaluations during algorithm design, which potentially leads to better results than the time-consuming user evaluations traditionally conducted at the end of the design process. Furthermore, the system is built around a centralized database that supports easy reuse of old evaluation designs and reproduction of old evaluations with new or additional stimuli, both driving challenges in scientific visualization research. We describe the system's design and the considerations made during the design process, and demonstrate the system by conducting three user evaluations, all of which have previously been published in the visualization literature.
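The centralized database described in the abstract is what makes reuse and reproduction cheap: an evaluation design is stored once, and every later run only references it. As a rough illustrative sketch of that idea (the paper's actual schema and API are not reproduced here; every table, column, and function name below is a hypothetical assumption), one might model it in Python with SQLite as follows:

    import sqlite3

    # Hypothetical illustration of the reuse/reproduction idea from the
    # abstract; the schema and names are NOT taken from the paper.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE designs (
            id   INTEGER PRIMARY KEY,
            name TEXT,
            task TEXT          -- e.g. the question shown to participants
        );
        CREATE TABLE evaluations (
            id        INTEGER PRIMARY KEY,
            design_id INTEGER REFERENCES designs(id),
            created   TEXT DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE stimuli (
            evaluation_id INTEGER REFERENCES evaluations(id),
            image_path    TEXT   -- rendered image shown to crowd workers
        );
    """)

    # An evaluation design is stored once ...
    design_id = conn.execute(
        "INSERT INTO designs (name, task) VALUES (?, ?)",
        ("2AFC depth judgment", "Which of the two marked points appears closer?"),
    ).lastrowid

    def run_evaluation(design_id, stimulus_paths):
        """Create a new evaluation run that reuses an existing design."""
        eval_id = conn.execute(
            "INSERT INTO evaluations (design_id) VALUES (?)", (design_id,)
        ).lastrowid
        conn.executemany(
            "INSERT INTO stimuli (evaluation_id, image_path) VALUES (?, ?)",
            [(eval_id, p) for p in stimulus_paths],
        )
        return eval_id

    # ... and later reproduced with new or additional stimuli.
    first_run = run_evaluation(design_id, ["renderA.png", "renderB.png"])
    second_run = run_evaluation(design_id,
                                ["renderA.png", "renderB.png", "renderC.png"])
    print(f"runs {first_run} and {second_run} share design {design_id}")

Because every run merely references the stored design, reproducing an old evaluation with a new algorithm's output reduces to inserting new stimulus rows, which is the property the abstract highlights.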

Place, publisher, year, edition, pages
IEEE Computer Society, 2016, pp. 40-47.
National Category
Other Computer and Information Science
Identifiers
URN: urn:nbn:se:liu:diva-128702
DOI: 10.1109/PACIFICVIS.2016.7465249
ISI: 000386185000006
ISBN: 9781509014514 (print)
OAI: oai:DiVA.org:liu-128702
DiVA: diva2:931644
Conference
Pacific Visualization Symposium (PacificVis), 19-22 April 2016, Taipei, Taiwan
Available from: 2016-05-30. Created: 2016-05-30. Last updated: 2016-11-20. Bibliographically approved.

Open Access in DiVA

fulltext (5472 kB), 100 downloads
File name: FULLTEXT01.pdf
File size: 5472 kB
Checksum (SHA-512): 4dd2f8688b74cfecfaed3a85d056ac36e0333f6e1607b66ac2b72010a355c8b391f3781fbd3d6f5431171f2d99523564ed989b0a4d0db3910ea5ec4b9287aa26
Type: fulltext
Mimetype: application/pdf
Appendix (2373 kB), 16 downloads
File name: IMAGE01.pdf
File size: 2373 kB
Checksum (SHA-512): 11d43690d63a13f94f1860662a3c45290015b74830c031b7513ef20963d916f8818a41d26f1d0095671e5b6ef33f1368c74cc5194190f2d01bab6b0eeeda0976
Type: image
Mimetype: application/pdf
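The SHA-512 checksums above allow a downloaded file to be verified for integrity. A minimal Python sketch, assuming the full text has been saved locally under the listed file name FULLTEXT01.pdf (the expected digest is copied verbatim from the record):

    import hashlib

    # SHA-512 checksum of FULLTEXT01.pdf as listed in the DiVA record above.
    EXPECTED = ("4dd2f8688b74cfecfaed3a85d056ac36e0333f6e1607b66ac2b72010a355c8b3"
                "91f3781fbd3d6f5431171f2d99523564ed989b0a4d0db3910ea5ec4b9287aa26")

    def sha512_of(path):
        """Hash the file in 1 MiB chunks so large PDFs need not fit in memory."""
        h = hashlib.sha512()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    print("checksum OK" if sha512_of("FULLTEXT01.pdf") == EXPECTED else "MISMATCH")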
