Visual Alignment Accuracy in Head Mounted Optical See-Through AR Displays: Distribution of Head Orientation Noise
Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
NASA Ames Research Center. (Human Systems Interface Division)
2009 (English). In: Proceedings of the Human Factors and Ergonomics Society 53rd Annual Meeting 2009, San Antonio (TX), USA: Human Factors and Ergonomics Society, 2009, 2024-2028 p. Conference paper, Published paper (Refereed)
Abstract [en]

The mitigation of registration errors is a central challenge for improving the usability of Augmented Reality systems. While the technical achievements within tracking and display technology continue to improve the conditions for good registration, little research is directed towards understanding the user’s visual alignment performance during the calibration process. This paper reports 12 standing subjects’ visual alignment performance using an optical see-through head mounted display for viewing directions varied in azimuth (0°, ±30°, ±60°) and elevation (0°, ±10°). Although viewing direction has a statistically significant effect on the shape of the distribution, the effect is small and negligible for practical purposes, and the distribution can be approximated by a circular distribution with a standard deviation of 0.2° for all viewing directions studied in this paper. In addition to quantifying head aiming accuracy with a head-fixed cursor and illustrating the deteriorating accuracy of boresight calibration with increasing viewing direction extremity, the results are applicable for filter design determining the onset and end of head rotation.
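
As a rough illustration of how the reported figure might be used, the sketch below (Python/NumPy) models head orientation noise as an isotropic Gaussian with a 0.2° standard deviation, as stated in the abstract, and applies a simple threshold rule to flag the onset of an intentional head rotation. The 3-sigma threshold, the run length, and the synthetic motion profile are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' implementation): treat head-aiming noise as
# an isotropic ("circular") Gaussian with sigma = 0.2 deg, as reported, and use
# that figure to separate aiming noise from an intentional head rotation.
# The 3-sigma threshold and the synthetic motion profile are assumptions.
import numpy as np

rng = np.random.default_rng(0)

SIGMA_DEG = 0.2                  # reported std. dev. of head orientation noise
THRESHOLD_DEG = 3 * SIGMA_DEG    # assumed onset threshold (~3 sigma)

def simulate_head_orientation(n_samples=500, onset=250, drift_deg_per_sample=0.05):
    """Synthetic azimuth/elevation trace: stationary aiming noise, then a slow rotation."""
    noise = rng.normal(0.0, SIGMA_DEG, size=(n_samples, 2))   # isotropic aiming noise
    motion = np.zeros((n_samples, 2))
    motion[onset:, 0] = drift_deg_per_sample * np.arange(n_samples - onset)  # azimuth ramp
    return noise + motion

def detect_rotation_onset(trace, threshold_deg=THRESHOLD_DEG, min_run=5):
    """Return the first sample starting a run of `min_run` samples whose angular
    distance from the initial aiming direction exceeds the threshold."""
    baseline = trace[:50].mean(axis=0)            # aiming direction from early samples
    distance = np.linalg.norm(trace - baseline, axis=1)
    run = 0
    for i, above in enumerate(distance > threshold_deg):
        run = run + 1 if above else 0
        if run >= min_run:
            return i - min_run + 1
    return None

trace = simulate_head_orientation()
print("estimated rotation onset at sample:", detect_rotation_onset(trace))
```

Because the noise model is isotropic, the same threshold can be applied for every viewing direction, which is the practical simplification the abstract points to.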

Place, publisher, year, edition, pages
San Antonio (TX), USA: Human Factors and Ergonomics Society, 2009. 2024-2028 p.
National Category
Computer Engineering
Identifiers
URN: urn:nbn:se:liu:diva-52854
DOI: 10.1177/154193120905302710
ISBN: 978-161567623-1 (print)
OAI: oai:DiVA.org:liu-52854
DiVA: diva2:285663
Conference
53rd Human Factors and Ergonomics Society Annual Meeting 2009, HFES 2009; San Antonio, TX; United States
Available from: 2010-01-12 Created: 2010-01-12 Last updated: 2014-09-04. Bibliographically approved
In thesis
1. Pinhole Camera Calibration in the Presence of Human Noise
2011 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The research work presented in this thesis is concerned with the analysis of the human body as a calibration platform for estimation of a pinhole camera model used in Augmented Reality environments mediated through an Optical See-Through Head-Mounted Display. Since the quality of the calibration ultimately depends on a subject’s ability to construct visual alignments, the research effort is initially centered on user studies investigating human-induced noise, such as postural sway and head aiming precision. Knowledge about subject behavior is then applied to a sensitivity analysis in which simulations are used to determine the impact of user noise on camera parameter estimation.

Quantitative evaluation of the calibration procedure is challenging since the current state of the technology does not permit access to the user’s view and measurements in the image plane as seen by the user. In an attempt to circumvent this problem, researchers have previously placed a camera in the eye socket of a mannequin, and performed both calibration and evaluation using the auxiliary signal from the camera. However, such a method does not reflect the impact of human noise during the calibration stage, and the calibration is not transferable to a human as the eyepoint of the mannequin and the intended user may not coincide. The experiments performed in this thesis use human subjects for all stages of calibration and evaluation. Moreover, some of the measurable camera parameters are verified with an external reference, addressing not only calibration precision, but also accuracy.
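
The sensitivity analysis described above can, in broad strokes, be pictured as a Monte Carlo perturbation study. The sketch below is not the thesis' actual procedure: it assumes a synthetic single-view scene, nominal intrinsics (focal length 800 px), and a plain DLT estimator, and it reuses the 0.2° head-aiming noise figure from the related conference paper to perturb the 2D alignment points and observe the spread of the recovered focal length.

```python
# Illustrative Monte Carlo sketch (not the thesis' procedure): perturb the 2D
# alignment points of a synthetic single-view pinhole calibration with noise
# equivalent to ~0.2 deg of head-aiming error and observe the spread of the
# recovered focal length. Scene geometry, nominal intrinsics, and the use of a
# plain DLT estimator are assumptions made for this sketch.
import numpy as np

rng = np.random.default_rng(1)

F_TRUE, CX, CY = 800.0, 320.0, 240.0          # assumed nominal intrinsics (pixels)
K_TRUE = np.array([[F_TRUE, 0, CX], [0, F_TRUE, CY], [0, 0, 1.0]])
SIGMA_PIX = np.tan(np.radians(0.2)) * F_TRUE  # ~0.2 deg of aiming noise in pixels

def project(points_3d, K):
    """Project camera-frame 3D points with a pinhole model (R = I, t = 0)."""
    uvw = (K @ points_3d.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def dlt_projection_matrix(points_3d, points_2d):
    """Estimate the 3x4 projection matrix from 3D-2D correspondences via DLT."""
    X = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    rows = []
    for Xi, (u, v) in zip(X, points_2d):
        rows.append(np.hstack([Xi, np.zeros(4), -u * Xi]))
        rows.append(np.hstack([np.zeros(4), Xi, -v * Xi]))
    _, _, vt = np.linalg.svd(np.asarray(rows))
    return vt[-1].reshape(3, 4)

def rq(M):
    """RQ decomposition of a 3x3 matrix (upper-triangular R, orthonormal Q)."""
    P = np.fliplr(np.eye(3))
    q, r = np.linalg.qr((P @ M).T)
    R, Q = P @ r.T @ P, P @ q.T
    signs = np.sign(np.diag(R))                # make the diagonal of R positive
    return R * signs, (Q.T * signs).T

# Non-coplanar calibration targets in front of the camera (assumed geometry).
points_3d = rng.uniform([-1, -1, 2], [1, 1, 5], size=(20, 3))
points_2d_clean = project(points_3d, K_TRUE)

focal_estimates = []
for _ in range(1000):
    noisy_2d = points_2d_clean + rng.normal(0.0, SIGMA_PIX, points_2d_clean.shape)
    P_est = dlt_projection_matrix(points_3d, noisy_2d)
    K_est, _ = rq(P_est[:, :3])
    K_est /= K_est[2, 2]                       # remove the arbitrary DLT scale
    focal_estimates.append(K_est[0, 0])

print(f"focal length: true {F_TRUE:.1f}, "
      f"mean {np.mean(focal_estimates):.1f} +/- {np.std(focal_estimates):.1f} px")
```

Repeating the estimation over many noise draws yields a distribution for each recovered camera parameter, which is the kind of quantity a sensitivity analysis of user-induced noise would report.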

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. 113 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1402
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-72055 (URN)
978-91-7393-053-6 (ISBN)
Public defence
2011-11-04, Domteatern, Visualiseringscenter C, Kungsgatan 54, Norrköping, 09:30 (English)
Available from: 2011-11-14 Created: 2011-11-14 Last updated: 2015-09-22. Bibliographically approved

Open Access in DiVA

No full text

Other links

Publisher's full text

Authority records

Axholt, Magnus; Peterson, Stephen D.
