Accuracy of Eyepoint Estimation in Optical See-Through Head-Mounted Displays Using the Single Point Active Alignment Method
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
Swedish Air Force Combat Simulation Center at the Swedish Defence Research Agency.
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
2011 (English). Conference paper, Published paper (Other academic)
Abstract [en]

This paper studies the accuracy of the estimated eyepoint of an Optical See-Through Head-Mounted Display (OST HMD) calibrated using the Single Point Active Alignment Method (SPAAM). Quantitative evaluation of calibration procedures for OST HMDs is complicated as it is currently not possible to share the subject’s view. Temporarily replacing the subject’s eye with a camera during the calibration or evaluation stage has been proposed, but the uncertainty of a correct eyepoint estimation remains. In the experiment reported in this paper, subjects were used for all stages of calibration, and the results were verified with a 3D measurement device. The nine participants constructed 25 visual alignments per calibration, after which the estimated pinhole camera model was decomposed into its intrinsic and extrinsic parameters using two common methods. Unique to this experiment, compared to previous evaluations, is the measurement device used to cup the subject’s eyeball. It measures the eyepoint location relative to the head tracker, thereby establishing the calibration accuracy of the estimated eyepoint location. As the results on accuracy are expressed as individual pinhole camera parameters, rather than as a compounded registration error, this paper complements previously published work on parameter variance: the former denotes bias, while the latter represents noise. Results indicate that the calibrated eyepoint is on average 5 cm away from its measured location and exhibits a vertical bias which potentially causes dipvergence for stereoscopic vision of objects located further away than 5.6 m. The paper closes with a discussion on the suitability of the traditional pinhole camera model for OST HMD calibration.
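The estimation pipeline the abstract describes — visual alignments yielding 2D–3D correspondences, a pinhole projection matrix fitted to them, and a decomposition into intrinsic and extrinsic parameters (the eyepoint being the camera centre) — can be sketched with the standard Direct Linear Transform and an RQ decomposition. This is a generic illustration of the technique, not the authors' implementation; the function names are ours, and NumPy is assumed:

```python
import numpy as np

def spaam_dlt(points_3d, points_2d):
    """Fit a 3x4 pinhole projection matrix to 2D-3D alignment pairs
    (the linear estimation step used in SPAAM-style calibration)."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # Least-squares solution: right singular vector of the smallest
    # singular value of the stacked constraint matrix.
    return np.linalg.svd(np.asarray(A, float))[2][-1].reshape(3, 4)

def decompose(P):
    """Split P into intrinsics K, rotation R, and camera centre C
    (the eyepoint, i.e. the quantity measured in the experiment)."""
    M = P[:, :3]
    # RQ decomposition of M via the QR "flip" trick: M = K @ R.
    Q, U = np.linalg.qr(np.flipud(M).T)
    K = np.flipud(np.fliplr(U.T))
    R = np.flipud(Q.T)
    # Force a positive diagonal on K, absorbing the signs into R.
    D = np.diag(np.sign(np.diag(K)))
    K, R = K @ D, D @ R
    if np.linalg.det(R) < 0:          # P is only defined up to sign
        R = -R
    K = K / K[2, 2]                   # conventional normalisation
    C = -np.linalg.inv(M) @ P[:, 3]   # camera centre: null space of P
    return K, R, C
```

With noise-free synthetic correspondences the camera centre is recovered exactly (up to numerical precision); the experiment in the paper measures how far the centre estimated from real, noisy human alignments deviates from the physically measured eyepoint.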

Place, publisher, year, edition, pages
2011.
Keyword [en]
Accuracy, Single Point Active Alignment Method, Visual Alignment, Calibration, Augmented Reality
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-72054
OAI: oai:DiVA.org:liu-72054
DiVA: diva2:456340
Conference
IEEE Virtual Reality Conference 2012, Orange County (CA), USA
Available from: 2011-11-14. Created: 2011-11-14. Last updated: 2015-09-22. Bibliographically approved.
In thesis
1. Pinhole Camera Calibration in the Presence of Human Noise
2011 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The research work presented in this thesis is concerned with the analysis of the human body as a calibration platform for the estimation of a pinhole camera model used in Augmented Reality environments mediated through an Optical See-Through Head-Mounted Display. Since the quality of the calibration ultimately depends on a subject’s ability to construct visual alignments, the research effort is initially centered around user studies investigating human-induced noise, such as postural sway and head-aiming precision. Knowledge about subject behavior is then applied in a sensitivity analysis in which simulations are used to determine the impact of user noise on camera parameter estimation.

Quantitative evaluation of the calibration procedure is challenging since the current state of the technology does not permit access to the user’s view and measurements in the image plane as seen by the user. In an attempt to circumvent this problem, researchers have previously placed a camera in the eye socket of a mannequin, and performed both calibration and evaluation using the auxiliary signal from the camera. However, such a method does not reflect the impact of human noise during the calibration stage, and the calibration is not transferable to a human as the eyepoint of the mannequin and the intended user may not coincide. The experiments performed in this thesis use human subjects for all stages of calibration and evaluation. Moreover, some of the measurable camera parameters are verified with an external reference, addressing not only calibration precision, but also accuracy.
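The simulation-based sensitivity analysis mentioned above can be illustrated with a small self-contained Monte Carlo sketch: perturb the 2D alignment points with Gaussian "head-aiming" noise, re-estimate the projection matrix each time, and observe the scatter of the recovered eyepoint. The camera parameters, noise level, and point layout below are invented for illustration only and do not reproduce the thesis experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth pinhole camera: 800 px focal length,
# principal point (320, 240), eyepoint at the origin looking down +z.
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
P_true = K @ np.hstack([np.eye(3), np.zeros((3, 1))])

# Non-coplanar 3D alignment targets a few metres in front of the user.
pts = np.array([[0, 0, 5], [1, 0, 5], [0, 1, 6], [1, 1, 7],
                [-1, 0, 6], [0, -1, 5], [2, 1, 8], [-1, -1, 7]], float)

def project(P, X):
    """Project 3D points to pixel coordinates with matrix P."""
    x = (P @ np.hstack([X, np.ones((len(X), 1))]).T).T
    return x[:, :2] / x[:, 2:]

def dlt(X, x):
    """Direct Linear Transform: 3x4 projection matrix from 2D-3D pairs."""
    A = []
    for (Xi, Yi, Zi), (u, v) in zip(X, x):
        A.append([Xi, Yi, Zi, 1, 0, 0, 0, 0, -u*Xi, -u*Yi, -u*Zi, -u])
        A.append([0, 0, 0, 0, Xi, Yi, Zi, 1, -v*Xi, -v*Yi, -v*Zi, -v])
    return np.linalg.svd(np.asarray(A, float))[2][-1].reshape(3, 4)

# Monte Carlo: jitter each 2D alignment by ~2 px of aiming noise and
# record how the recovered eyepoint (camera centre) scatters.
centres = []
for _ in range(500):
    noisy = project(P_true, pts) + rng.normal(0.0, 2.0, (len(pts), 2))
    P = dlt(pts, noisy)
    centres.append(-np.linalg.inv(P[:, :3]) @ P[:, 3])
centres = np.array(centres)
print("eyepoint std per axis (scene units):", centres.std(axis=0))
```

The per-axis spread of the recovered centre is a direct, if simplified, analogue of the parameter variance the thesis studies; depth (the z axis) is typically the worst-conditioned direction.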

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. 113 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1402
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-72055
ISBN: 978-91-7393-053-6
Public defence
2011-11-04, Domteatern, Visualiseringscenter C, Kungsgatan 54, Norrköping, 09:30 (English)
Available from: 2011-11-14. Created: 2011-11-14. Last updated: 2015-09-22. Bibliographically approved.

Open Access in DiVA

No full text

Authority records

Axholt, Magnus; Skoglund, Martin A.; Cooper, Matthew D.; Ynnerman, Anders

