Device Registration for 3D Geometry-Based User-Perspective Rendering in Hand-Held Video See-Through Augmented Reality
Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
2015 (English). In: Augmented and Virtual Reality, AVR 2015, Springer-Verlag Berlin, 2015, Vol. 9254, pp. 151-167. Conference paper, Published paper (Refereed)
Abstract [en]

User-perspective rendering in Video See-Through Augmented Reality (V-AR) creates a view that always shows what is behind the screen from the user's point of view. It is used to improve registration between the real and the virtual world, in contrast to traditional device-perspective rendering, which displays what the camera sees. A small number of approaches to user-perspective rendering exist that, overall, improve the registration between the real world, the video of the real world displayed on the screen, and the augmentations. Some registration errors remain, however, and cause misalignment in the user-perspective rendering. One source of error is device registration: depending on the tracking method used, this can be the misalignment between the camera and the screen, or between the camera and the tracked frame of reference to which the screen and the camera are attached. In this paper we first describe a method for user-perspective V-AR based on 3D projective geometry. We then address the device registration problem in user-perspective rendering with two methods: one for estimating the misalignment between the camera and the screen, and one for estimating the misalignment between the camera and the tracked frame.
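The dynamic, user-anchored frustum that user-perspective rendering relies on can be sketched with a standard off-axis (asymmetric) projection. This is a minimal illustration of the general technique, not the paper's implementation: it assumes the screen lies in the z = 0 plane of a screen-centred frame with the tracked eye at positive z, and the function name and parameters are hypothetical.

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Compute asymmetric near-plane extents for user-perspective rendering.

    The screen is assumed to span [-screen_w/2, screen_w/2] x
    [-screen_h/2, screen_h/2] in the z = 0 plane of a screen-centred
    frame; `eye` = (ex, ey, ez) with ez > 0 is the tracked eye position
    in that frame.  Returns (left, right, bottom, top) extents of the
    near clipping plane, in the style expected by an OpenGL glFrustum
    call.
    """
    ex, ey, ez = eye
    # Similar triangles: project the screen edges, as seen from the eye,
    # onto the near plane at distance `near`.
    scale = near / ez
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top
```

With the eye centred in front of the screen the frustum is symmetric; as the eye (or the hand-held device) moves, the frustum shears so the rendered view stays aligned with what lies behind the screen, which is why accurate camera-to-screen and camera-to-tracker registration matters.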

Place, publisher, year, edition, pages
Springer-Verlag Berlin, 2015. Vol. 9254, pp. 151-167.
Series
Lecture Notes in Computer Science, ISSN 0302-9743 (print), 1611-3349 (online) ; 9254
Keyword [en]
Augmented Reality; Video see-through; Dynamic frustum; User-perspective
National Category
Media Engineering
Identifiers
URN: urn:nbn:se:liu:diva-123167
DOI: 10.1007/978-3-319-22888-4_12
ISI: 000364709300012
ISBN: 978-3-319-22888-4; 978-3-319-22887-7 (print)
OAI: oai:DiVA.org:liu-123167
DiVA: diva2:877371
Conference
2nd International Conference on Augmented and Virtual Reality (SALENTO AVR)
Available from: 2015-12-07. Created: 2015-12-04. Last updated: 2015-12-07.

Open Access in DiVA

No full text

Other links

Publisher's full text

By author/editor
Samini, Ali; Lundin Palmerius, Karljohan
By organisation
Media and Information Technology; Faculty of Science & Engineering
