Hands-Free Interaction with Virtual Information in a Real Environment
2007 (English). In: COGAIN 2007: Gaze-based Creativity, Interacting with Games and On-line Communities, 2007, pp. 55-60. Conference paper (Other academic).
In augmented reality (AR) systems, real and virtual objects are merged, aligned relative to a real environment, and presented in the user's field of view. AR applications that give hierarchical instructions often require some feedback or acknowledgement from the user before moving to the next step in the instructions. It should be possible to give this feedback quickly and without interrupting the ongoing task. Many different interaction techniques have been used in the AR domain; there are numerous examples of systems that use manual input, gestures and/or speech interfaces (Nilsson & Johansson 2006, Billinghurst et al. 2001, Gandy et al. 2005, Henrysson et al. 2007). However, there are situations where speech and gestures may not be appropriate. For instance, during surgical procedures in an operating room the surgeon may have difficulty interacting manually with technical devices because of the need to keep her/his hands sterile. Voice interaction with a system may also be inappropriate due to surrounding noise or filtering problems. One modality overcomes the problems of noisy environments, sterile hands and the need to work with both hands while interacting with a computer or an AR system: the visual modality. The aim of this paper is to present an AR system with an integrated gaze tracker, allowing quick feedback from the user to the system as well as analysis of the user's gaze behaviour.
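The abstract does not specify how the gaze-based acknowledgement is detected. A common hands-free choice for this kind of step confirmation is dwell-time selection, where the user fixates an on-screen region for a set duration. The sketch below is a minimal illustration of that idea, not the paper's method; the thresholds, function names, and the timestamped (x, y) sample format are all assumptions.

```python
DWELL_TIME = 1.0      # seconds of sustained fixation required (assumed threshold)
TARGET_RADIUS = 40.0  # pixel radius of the acknowledgement region (assumed)

def within_target(gaze, target):
    """True if a gaze sample (x, y) falls inside the acknowledgement region."""
    dx, dy = gaze[0] - target[0], gaze[1] - target[1]
    return dx * dx + dy * dy <= TARGET_RADIUS ** 2

def dwell_confirm(gaze_samples, target):
    """Return True once gaze rests on `target` for DWELL_TIME seconds.

    gaze_samples: iterable of (timestamp_seconds, (x, y)) tuples, e.g. as
    streamed from a gaze tracker.
    """
    dwell_start = None
    for t, gaze in gaze_samples:
        if within_target(gaze, target):
            if dwell_start is None:
                dwell_start = t                 # fixation on target begins
            elif t - dwell_start >= DWELL_TIME:
                return True                     # acknowledged: advance a step
        else:
            dwell_start = None                  # fixation broken: reset timer
    return False

# Hypothetical usage: gaze holds on the target from t=0.5 s to t=1.7 s.
samples = [(0.0, (300, 300)), (0.5, (101, 99)), (1.0, (98, 102)), (1.7, (100, 100))]
print(dwell_confirm(samples, target=(100, 100)))  # True
```

In practice the dwell threshold trades off speed against accidental activations (the "Midas touch" problem); one second is a conventional but illustrative value.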
Keywords: gaze control, interaction, augmented reality
Identifiers: URN: urn:nbn:se:liu:diva-39643; Local ID: 50465; OAI: oai:DiVA.org:liu-39643; DiVA: diva2:260492