Parameter Estimation Variance of the Single Point Active Alignment Method in Optical See-Through Head Mounted Display Calibration
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology. (Visual Interactive Data Analysis)
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
Swedish Defence Research Agency. (Swedish Air Force Combat Simulation Center)
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology. (Visual Interactive Data Analysis)
2011 (English) In: Proceedings of the IEEE Virtual Reality Conference / [ed] Michitaka Hirose, Benjamin Lok, Aditi Majumder and Dieter Schmalstieg, Piscataway, NJ, USA: IEEE, 2011, 27-34 p. Conference paper, Published paper (Refereed)
Abstract [en]

The parameter estimation variance of the Single Point Active Alignment Method (SPAAM) is studied through an experiment where 11 subjects are instructed to create alignments using an Optical See-Through Head Mounted Display (OSTHMD) such that three separate correspondence point distributions are acquired. Modeling the OSTHMD and the subject's dominant eye as a pinhole camera, findings show that a correspondence point distribution well distributed along the user's line of sight yields less variant parameter estimates. The estimated eye point location is studied in particular detail. The findings of the experiment are complemented with simulated data which show that image plane orientation is sensitive to the number of correspondence points. The simulated data also illustrate some interesting properties of the numerical stability of the calibration problem as a function of alignment noise, number of correspondence points, and correspondence point distribution.
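SPAAM reduces OSTHMD calibration to camera resectioning: each alignment made by the user pairs one 3D world point with one 2D screen point, and a 3x4 pinhole projection matrix is then recovered linearly from these correspondences. A minimal sketch of that resectioning step is given below (a plain Direct Linear Transformation in Python/NumPy; function and variable names are illustrative and not the authors' implementation):

```python
import numpy as np

def estimate_projection_dlt(world_pts, image_pts):
    """Estimate a 3x4 pinhole projection matrix P (up to scale) from
    3D world / 2D screen correspondences via the Direct Linear
    Transformation; SPAAM-style alignments provide exactly such pairs."""
    world_pts = np.asarray(world_pts, float)
    image_pts = np.asarray(image_pts, float)
    assert len(world_pts) == len(image_pts) >= 6, "DLT needs at least 6 points"
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The solution is the right singular vector of the stacked system
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)
```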

Place, publisher, year, edition, pages
Piscataway, NJ, USA: IEEE, 2011. 27-34 p.
Series
IEEE Virtual Reality Conference, ISSN 1087-8270
Keyword [en]
single point active alignment method, camera resectioning, calibration, optical see-through head mounted display, augmented reality
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-67233
DOI: 10.1109/VR.2011.5759432
ISI: 000297260400004
ISBN: 978-1-4577-0037-8 (online), 978-1-4577-0039-2 (print)
OAI: oai:DiVA.org:liu-67233
DiVA: diva2:408445
Conference
IEEE Virtual Reality Conference, pages 27–34, Singapore, Republic of Singapore
Available from: 2011-04-04. Created: 2011-04-04. Last updated: 2015-09-22. Bibliographically approved.
In thesis
1. Visual Inertial Navigation and Calibration
2011 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Processing and interpretation of visual content is essential to many systems and applications. This requires knowledge of how the content is sensed and also of what is sensed. Such knowledge is captured in models which, depending on the application, can be very advanced or simple. An application example is scene reconstruction using a camera: if a suitable model of the camera is known, then a model of the scene can be estimated from images acquired at different, unknown, locations; the quality of the scene model, however, depends on the quality of the camera model. The opposite problem is to estimate the camera model and the unknown locations using a known scene model. In this work, two such problems are treated in two rather different applications.

There is an increasing need for navigation solutions less dependent on external navigation systems such as the Global Positioning System (GPS). Simultaneous Localisation and Mapping (SLAM) provides a solution to this by estimating both navigation states and some properties of the environment without relying on any external navigation system.

The first problem considers visual inertial navigation and mapping using a monocular camera and inertial measurements, which is a SLAM problem. Our aim is to provide improved estimates of the navigation states and a landmark map, given a SLAM solution. To do this, the measurements are fused in an Extended Kalman Filter (EKF) and the filtered estimates are then used as a starting solution in a nonlinear least-squares problem, which is solved using the Gauss-Newton method. This approach is evaluated on experimental data with accurate ground truth for reference.
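The refinement described above is a standard nonlinear least-squares smoothing step seeded by the filter. The sketch below shows the generic Gauss-Newton iteration it relies on; the state parameterisation, residual, and Jacobian of the actual visual-inertial SLAM problem are not reproduced here, so `residual` and `jacobian` are placeholders the caller would supply.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, max_iters=20, tol=1e-9):
    """Refine a state vector by minimising 0.5 * ||residual(x)||^2,
    starting from x0 (here: the EKF estimate used as initial solution)."""
    x = np.asarray(x0, float)
    for _ in range(max_iters):
        r = residual(x)                  # stacked measurement residuals
        J = jacobian(x)                  # Jacobian of the residuals w.r.t. x
        # Gauss-Newton step: solve J dx ~= -r in the least-squares sense,
        # equivalent to the normal equations J^T J dx = -J^T r.
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < tol:     # converged
            break
    return x
```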

In Augmented Reality (AR), additional information is superimposed onto the surrounding environment in real time to reinforce our impressions. For this to be a pleasant experience, it is necessary to have good models of the AR system and the environment.

The second problem considers calibration of an Optical See-Through Head Mounted Display (OSTHMD) system, which is a wearable AR system. We show and motivate how the pinhole camera model can be used to represent the OSTHMD and the user's eye position. The pinhole camera model is estimated using the Direct Linear Transformation algorithm. Results are evaluated in experiments which also compare different data acquisition methods.
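Under the pinhole model, the user's eye position is the camera centre of the estimated projection matrix, which can be read off as the matrix's right null space. A small sketch, assuming a 3x4 matrix P such as the DLT estimate sketched after the paper abstract above:

```python
import numpy as np

def camera_centre(P):
    """Recover the eye point (camera centre) from a 3x4 projection
    matrix: the centre is the right null vector of P, since P @ C = 0."""
    _, _, Vt = np.linalg.svd(P)
    C = Vt[-1]              # homogeneous camera centre
    return C[:3] / C[3]     # Euclidean eye-point coordinates
```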

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. 39 p.
Series
Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971; 1500
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-68858
Local ID: LiU-TEK-LIC-2011:39
Archive number: LiU-TEK-LIC-2011:39
OAI: LiU-TEK-LIC-2011:39
Presentation
2011-06-16, Visionen, Hus B, Campus Valla, Linköpings universitet, Linköping, 14:30 (English)
Available from: 2011-06-08. Created: 2011-06-08. Last updated: 2011-06-13. Bibliographically approved.
2. Pinhole Camera Calibration in the Presence of Human Noise
2011 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The research work presented in this thesis is concerned with the analysis of the human body as a calibration platform for estimation of a pinhole camera model used in Augmented Reality environments mediated through an Optical See-Through Head-Mounted Display. Since the quality of the calibration ultimately depends on a subject's ability to construct visual alignments, the research effort is initially centered on user studies investigating human-induced noise, such as postural sway and head-aiming precision. Knowledge about subject behavior is then applied to a sensitivity analysis in which simulations are used to determine the impact of user noise on camera parameter estimation.
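The sensitivity analysis mentioned above can be pictured as a Monte Carlo experiment: add synthetic alignment noise, re-run the calibration, and observe how the estimated parameters spread. The sketch below does this for the eye point, reusing the hypothetical `estimate_projection_dlt` and `camera_centre` helpers from the earlier sketches; the noise level, point configuration, and trial count are illustrative, not the thesis's actual simulation setup.

```python
import numpy as np

# Assumes estimate_projection_dlt() and camera_centre() from the sketches above.

def eye_point_spread(P_true, world_pts, noise_px=2.0, trials=500, seed=0):
    """Monte Carlo sketch of noise sensitivity: perturb the 2D alignments
    with Gaussian 'user noise', re-estimate the camera, and report the
    per-axis standard deviation of the recovered eye point."""
    rng = np.random.default_rng(seed)
    world_pts = np.asarray(world_pts, float)
    Xh = np.hstack([world_pts, np.ones((len(world_pts), 1))])
    proj = (P_true @ Xh.T).T
    image_pts = proj[:, :2] / proj[:, 2:3]          # noise-free projections
    centres = []
    for _ in range(trials):
        noisy = image_pts + rng.normal(0.0, noise_px, image_pts.shape)
        P_hat = estimate_projection_dlt(world_pts, noisy)
        centres.append(camera_centre(P_hat))
    return np.asarray(centres).std(axis=0)
```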

Quantitative evaluation of the calibration procedure is challenging since the current state of the technology does not permit access to the user’s view and measurements in the image plane as seen by the user. In an attempt to circumvent this problem, researchers have previously placed a camera in the eye socket of a mannequin, and performed both calibration and evaluation using the auxiliary signal from the camera. However, such a method does not reflect the impact of human noise during the calibration stage, and the calibration is not transferable to a human as the eyepoint of the mannequin and the intended user may not coincide. The experiments performed in this thesis use human subjects for all stages of calibration and evaluation. Moreover, some of the measurable camera parameters are verified with an external reference, addressing not only calibration precision, but also accuracy.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. 113 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524; 1402
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-72055
ISBN: 978-91-7393-053-6
Public defence
2011-11-04, Domteatern, Visualiseringscenter C, Kungsgatan 54, Norrköping, 09:30 (English)
Available from: 2011-11-14. Created: 2011-11-14. Last updated: 2015-09-22. Bibliographically approved.

Open Access in DiVA

No full text

Authority records

Axholt, Magnus; Skoglund, Martin; Cooper, Matthew; Ynnerman, Anders