Hands-Free Interaction with Virtual Information in a Real Environment
Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
2007 (English). In: COGAIN 2007: Gaze-based Creativity, Interacting with Games and On-line Communities, 2007, pp. 55-60. Conference paper, published paper (Other academic).
Abstract [en]

In augmented reality (AR) systems, real and virtual objects are merged, aligned relative to a real environment, and presented in the user's field of view. AR applications that give hierarchical instructions to users often require some feedback or acknowledgement from the user in order to move to the next step in the instructions. It should be possible to give this feedback quickly and without interrupting the ongoing task. Many different types of interaction techniques have been used in the domain of AR; there are numerous examples of systems that use manual input, gestures and/or speech interfaces (Nilsson & Johansson 2006, Billinghurst et al. 2001, Gandy et al. 2005, Henrysson et al. 2007). However, there are situations where speech and gesture may not be appropriate. For instance, during surgical procedures in an operating room the surgeon may have difficulty interacting manually with technical devices because of the need to keep her/his hands sterile. Voice interaction with a system may also be inappropriate because of surrounding noise or filtering problems. One modality can overcome the problems of noisy environments, keeping hands sterile, and the need to work with both hands while interacting with a computer or an AR system: the visual modality. The aim of this paper is to present an AR system with an integrated gaze tracker, allowing quick feedback from the user to the system, as well as analysis of the user's gaze behaviour.
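The abstract describes using gaze as a quick acknowledgement channel for advancing step-by-step instructions. A common way to realise this kind of hands-free confirmation is dwell-time selection: the system advances when the gaze point rests on a confirmation region long enough. The sketch below is a minimal, hypothetical illustration of that idea; it is not the paper's implementation, and the region coordinates, dwell threshold, and gaze-sample format are all assumptions.

```python
# Hypothetical sketch of dwell-based gaze acknowledgement (not from the paper).
# Gaze samples are assumed to arrive as (timestamp_seconds, x, y) tuples.

DWELL_SECONDS = 1.0  # assumed dwell time that counts as an acknowledgement


def inside(region, point):
    """Return True if a gaze point (x, y) lies inside a rectangular region."""
    x0, y0, x1, y1 = region
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1


def acknowledged(gaze_samples, confirm_region, dwell=DWELL_SECONDS):
    """Return True once the gaze has stayed inside confirm_region for
    `dwell` seconds without leaving; otherwise return False."""
    dwell_start = None
    for t, x, y in gaze_samples:
        if inside(confirm_region, (x, y)):
            if dwell_start is None:
                dwell_start = t  # gaze entered the region; start the timer
            elif t - dwell_start >= dwell:
                return True  # acknowledgement: advance to the next step
        else:
            dwell_start = None  # gaze left the region; reset the timer
    return False
```

In a real system the dwell threshold trades off speed against accidental triggering (the "Midas touch" problem), so it would be tuned per task rather than fixed as above.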

Place, publisher, year, edition, pages
2007, pp. 55-60.
Keyword [en]
gaze control, interaction, augmented reality
National Category
Computer Science
Identifiers
URN: urn:nbn:se:liu:diva-39643
Local ID: 50465
OAI: oai:DiVA.org:liu-39643
DiVA: diva2:260492
Available from: 2009-10-10 Created: 2009-10-10

Open Access in DiVA

No full text

Authority records BETA

Nilsson, Susanna

