Visual Alignment Precision in Optical See-Through AR Displays: Implications for Potential Accuracy
Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
NASA Ames Research Center. (Human Systems Interface Division)
2009 (English). In: Proceedings of the ACM/IEEE Virtual Reality International Conference, Association for Computing Machinery (ACM), 2009. Conference paper, Published paper (Other academic)
Abstract [en]

The quality of visual registration achievable with an optical see-through head mounted display (HMD) ultimately depends on the user’s targeting precision. This paper presents design guidelines for calibration procedures based on measurements of users’ head stability during visual alignment with reference targets. Targeting data was collected from 12 standing subjects who aligned a head-fixed cursor presented in a see-through HMD with background targets that varied in azimuth (0°, ±30°, ±60°) and elevation (0°, ±10°). Their data showed that: 1) Both position and orientation data will need to be used to establish calibrations based on nearby reference targets, since eliminating body sway effects can improve calibration precision by a factor of 16 and eliminate apparent angular anisotropies. 2) Compensation for body sway can speed the calibration by removing the need to wait for the body sway to abate, and 3) calibration precision can be less than 2 arcmin even for head directions rotated up to 60° with respect to the user’s torso, provided body sway is corrected. Users of Augmented Reality (AR) applications overlooking large distances may avoid the need to correct for body sway by boresighting on markers at relatively long distances, >> 10 m. These recommendations contrast with those for head-up displays using real images as discussed in previous papers.
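The geometry behind the final recommendation can be illustrated directly: a lateral head displacement of s metres shifts the apparent alignment by roughly atan(s / d) for a boresight target d metres away, so sway-induced angular error shrinks with target distance. The following sketch is not part of the paper; the sway magnitude and distances are illustrative assumptions chosen only to show the trend.

import math

def sway_error_arcmin(sway_m, target_distance_m):
    """Angular alignment error (in arcmin) caused by a lateral head
    displacement of sway_m metres when boresighting a target
    target_distance_m metres away: atan(sway / distance)."""
    return math.degrees(math.atan2(sway_m, target_distance_m)) * 60.0

# A centimetre of lateral postural sway (illustrative value, not a
# figure reported in the paper).
sway = 0.01  # metres

for d in (1.0, 3.0, 10.0, 50.0):
    print(f"target at {d:5.1f} m -> {sway_error_arcmin(sway, d):6.2f} arcmin of error")

At arm's-length distances the sway term alone dwarfs the 2 arcmin precision reported above, while beyond a few tens of metres it drops below it, which is why distant boresight markers can substitute for explicit sway correction.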

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2009.
National Category
Computer Engineering
Identifiers
URN: urn:nbn:se:liu:diva-52848
OAI: oai:DiVA.org:liu-52848
DiVA: diva2:285651
Conference
ACM/IEEE Virtual Reality International Conference, 2009
Available from: 2010-01-12 Created: 2010-01-12 Last updated: 2013-11-05 Bibliographically approved
In thesis
1. Pinhole Camera Calibration in the Presence of Human Noise
2011 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The research work presented in this thesis is concerned with the analysis of the human body as a calibration platform for estimation of a pinhole camera model used in Augmented Reality environments mediated through an Optical See-Through Head-Mounted Display. Since the quality of the calibration ultimately depends on a subject’s ability to construct visual alignments, the research effort is initially centered around user studies investigating human-induced noise, such as postural sway and head-aiming precision. Knowledge about subject behavior is then applied to a sensitivity analysis in which simulations are used to determine the impact of user noise on camera parameter estimation.

Quantitative evaluation of the calibration procedure is challenging since the current state of the technology does not permit access to the user’s view and measurements in the image plane as seen by the user. In an attempt to circumvent this problem, researchers have previously placed a camera in the eye socket of a mannequin, and performed both calibration and evaluation using the auxiliary signal from the camera. However, such a method does not reflect the impact of human noise during the calibration stage, and the calibration is not transferable to a human as the eyepoint of the mannequin and the intended user may not coincide. The experiments performed in this thesis use human subjects for all stages of calibration and evaluation. Moreover, some of the measurable camera parameters are verified with an external reference, addressing not only calibration precision, but also accuracy.
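A minimal sketch of the kind of simulation-based sensitivity analysis described above, assuming a synthetic pinhole camera, a basic DLT estimator, and isotropic Gaussian pixel noise as a stand-in for head-aiming imprecision; all numeric values and the choice of estimator are illustrative and not taken from the thesis.

import numpy as np

def project(P, X):
    """Project Nx3 world points through a 3x4 projection matrix; return Nx2 pixels."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    x = (P @ Xh.T).T
    return x[:, :2] / x[:, 2:3]

def dlt(X, x):
    """Estimate a 3x4 projection matrix from 3D-2D correspondences (basic DLT)."""
    rows = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = [Xw, Yw, Zw, 1.0]
        rows.append([0.0] * 4 + [-c for c in Xh] + [v * c for c in Xh])
        rows.append(Xh + [0.0] * 4 + [-u * c for c in Xh])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)

rng = np.random.default_rng(0)

# Ground-truth camera: ~1000 px focal length, principal point near the centre
# of a 1280 x 1024 image, small eye offset (all values are illustrative).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 512.0],
              [0.0, 0.0, 1.0]])
Rt = np.hstack([np.eye(3), np.array([[0.02], [0.0], [0.0]])])
P_true = K @ Rt

# Reference targets spread 1-4 m in front of the user.
X = rng.uniform([-1.0, -1.0, 1.0], [1.0, 1.0, 4.0], size=(20, 3))
x_clean = project(P_true, X)

# With a ~1000 px focal length, one pixel subtends roughly 3.4 arcmin,
# so 0.5 px corresponds to roughly 1.7 arcmin of aiming error.
for sigma_px in (0.0, 0.5, 2.0):
    x_noisy = x_clean + rng.normal(0.0, sigma_px, x_clean.shape)
    P_est = dlt(X, x_noisy)
    rms = np.sqrt(np.mean(np.sum((project(P_est, X) - x_clean) ** 2, axis=1)))
    print(f"alignment noise {sigma_px:.1f} px -> reprojection RMS {rms:.3f} px")

Raising the alignment noise or shrinking the depth spread of the reference targets makes the recovered projection degrade faster; quantifying such relationships is what the sensitivity analysis is for.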

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. 113 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1402
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-72055 (URN)
978-91-7393-053-6 (ISBN)
Public defence
2011-11-04, Domteatern, Visualiseringscenter C, Kungsgatan 54, Norrköping, 09:30 (English)
Available from: 2011-11-14 Created: 2011-11-14 Last updated: 2015-09-22 Bibliographically approved

Open Access in DiVA

No full text
