Henrysson, Anders
Publications (10 of 18)
Henrysson, A. & Andel, M. (2007). Augmented Earth: Towards Ubiquitous AR Messaging. In: 17th International Conference on Artificial Reality and Telexistence. Paper presented at 17th International Conference on Artificial Reality and Telexistence, Esbjerg, Denmark, 28-30 Nov. 2007 (pp. 197-204). Los Alamitos, CA, USA: IEEE Computer Society
Augmented Earth: Towards Ubiquitous AR Messaging
2007 (English) In: 17th International Conference on Artificial Reality and Telexistence, Los Alamitos, CA, USA: IEEE Computer Society, 2007, p. 197-204. Conference paper, Published paper (Refereed)
Abstract [en]

Present augmented reality systems are isolated islands with little or no capability to receive 3D information from the outside world. In this paper we describe how Google Earth can be used to transform the physical world into an AR mailbox. We demonstrate a system where a mobile phone AR environment can be defined and advertised by a portal, using a simple visual/tangible approach. The focus has been on ease of use. The current limitations are tracking stability and the lack of high-quality Google Earth content.

Place, publisher, year, edition, pages
Los Alamitos, CA, USA: IEEE Computer Society, 2007
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-40268 (URN); 10.1109/ICAT.2007.48 (DOI); 000253933000025; 52834 (Local ID); 0-7695-3056-7 (ISBN); 52834 (Archive number); 52834 (OAI)
Conference
17th International Conference on Artificial Reality and Telexistence, Esbjerg, Denmark, 28-30 Nov. 2007
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2014-04-23. Bibliographically approved
Henrysson, A. (2007). Bringing Augmented Reality to Mobile Phones. (Doctoral dissertation). New York, USA: ACM
Bringing Augmented Reality to Mobile Phones
2007 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

With its mixing of real and virtual, Augmented Reality (AR) is a technology that has attracted considerable attention from the research community and is seen as an ideal way to visualize context-related information. Computer-generated graphics are presented to the user overlaid on and registered with the real world, thereby augmenting it. Promising intelligence amplification and higher productivity, AR has been intensively researched for several decades but has yet to reach a broad audience.

This thesis presents efforts in bringing Augmented Reality to mobile phones and thus to the general public. Implementing these technologies on limited devices, such as mobile phones, poses a number of challenges that differ from traditional research directions, including limited computational resources with little or no possibility to upgrade or add hardware, and limited input and output capabilities for interactive 3D graphics. The research presented in this thesis addresses these challenges and makes contributions in the following areas:

Mobile Phone Computer Vision-Based Tracking

The first contribution of this thesis has been to migrate computer vision algorithms for tracking the mobile phone camera in a real-world reference frame - a key enabling technology for AR. To tackle performance issues, low-level optimized code using fixed-point algorithms has been developed.
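As a minimal sketch of what such fixed-point arithmetic can look like (a C++ illustration of the general technique using an assumed Q16.16 format and hypothetical helper names, not the thesis code):

#include <cstdint>
#include <cstdio>

// Q16.16 fixed point: 16 integer bits, 16 fractional bits.
// Phones of that era often lacked an FPU, so integer-only math was faster.
using fixed_t = int32_t;
constexpr int FRAC_BITS = 16;

constexpr fixed_t to_fixed(float f)   { return static_cast<fixed_t>(f * (1 << FRAC_BITS)); }
constexpr float   to_float(fixed_t x) { return static_cast<float>(x) / (1 << FRAC_BITS); }

// Multiply in 64-bit to avoid overflow, then shift back to Q16.16.
inline fixed_t fx_mul(fixed_t a, fixed_t b) {
    return static_cast<fixed_t>((static_cast<int64_t>(a) * b) >> FRAC_BITS);
}

int main() {
    // Example: scale an image-space coordinate by a focal-length factor.
    fixed_t x = to_fixed(3.25f);
    fixed_t f = to_fixed(1.5f);
    std::printf("%.4f\n", to_float(fx_mul(x, f)));  // prints 4.8750
    return 0;
}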

Mobile Phone 3D Interaction Techniques

Another contribution of this thesis has been to research interaction techniques for manipulating virtual content. This is in part realized by exploiting camera tracking for position-controlled interaction where motion of the device is used as input. Gesture input, made possible by a separate front camera, is another approach that is investigated. The obtained results are not unique to AR and could also be applicable to general mobile 3D graphics.

Novel Single User AR Applications

With short-range communication technologies, mobile phones can exchange data not only with other phones but also with an intelligent environment. Data can be obtained for tracking or visualization, and external displays can be used to render graphics with the tracked mobile phone acting as an interaction device. Work is presented in which a mobile phone harvests data from a sensor network and uses AR to visualize the live data in context.

Novel Collaborative AR Applications

One of the most promising areas for mobile phone based AR is enhancing face-to-face computer-supported cooperative work, since the AR display permits non-verbal cues to be used to a larger extent. In this thesis, face-to-face collaboration has been studied to examine whether AR increases awareness of collaboration partners even on small devices such as mobile phones. User feedback indicates that this is the case, supporting the hypothesis that mobile phones are increasingly able to deliver an AR experience to a large audience.

Place, publisher, year, edition, pages
New York, USA: ACM, 2007
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1145
Keywords
Mobile Phone, Augmented Reality, HCI, Ubiquitous Computing, 3D Interaction, Computer Vision
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:liu:diva-10204 (URN); 978-91-85895-43-4 (ISBN)
Public defence
2007-12-14, K2, Kåkenhus, Campus Norrköping, Linköpings universitet, Norrköping, 13:15 (English)
Note
On the day of the defence the status of articles III and VIII was: Accepted.
Available from: 2007-11-20 Created: 2007-11-20 Last updated: 2018-01-13
Henrysson, A., Marshall, J. & Billinghurst, M. (2007). Experiments in 3D Interaction for Mobile Phone AR. In: Proceedings of the 5th International Conference on Computer Graphics and Interactive Techniques in Australia and Southeast Asia, Perth, Australia (pp. 187-194). New York: The Association for Computing Machinery, Inc.
Experiments in 3D Interaction for Mobile Phone AR
2007 (English) In: Proceedings of the 5th International Conference on Computer Graphics and Interactive Techniques in Australia and Southeast Asia, Perth, Australia, New York: The Association for Computing Machinery, Inc., 2007, p. 187-194. Chapter in book (Other academic)
Abstract [en]

In this paper we present an evaluation of several different techniques for virtual object positioning and rotation on a mobile phone. We compare gesture input captured by the phone's front camera to tangible input, keypad interaction, and phone tilting in increasingly complex positioning and rotation tasks in an AR context. Usability experiments found that tangible input techniques are best for translation tasks, while keypad input is best for rotation tasks. Implications for the design of mobile phone 3D interfaces are presented as well as directions for future research.

Place, publisher, year, edition, pages
New York: The Association for Computing Machinery, Inc., 2007
Keywords
3D interaction, augmented reality, mobile graphics
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:liu:diva-12745 (URN); 10.1145/1321261.1321295 (DOI); 978-1-59593-912-8 (ISBN)
Available from: 2007-11-20 Created: 2007-11-20 Last updated: 2018-01-13. Bibliographically approved
Olwal, A. & Henrysson, A. (2007). LUMAR: A Hybrid Spatial Display System for 2D and 3D Handheld Augmented Reality. In: 17th International Conference on Artificial Reality and Telexistence (ICAT 2007), Esbjerg, Denmark, 2007. Paper presented at 17th International Conference on Artificial Reality and Telexistence (ICAT 2007), Esbjerg, Denmark, 2007 (pp. 63-70). Los Alamitos, CA, USA: IEEE Computer Society Press
LUMAR: A Hybrid Spatial Display System for 2D and 3D Handheld Augmented Reality
2007 (English) In: 17th International Conference on Artificial Reality and Telexistence (ICAT 2007), Esbjerg, Denmark, 2007, Los Alamitos, CA, USA: IEEE Computer Society Press, 2007, p. 63-70. Conference paper, Published paper (Other academic)
Abstract [en]

LUMAR is a hybrid system for spatial displays, allowing cell phones to be tracked in 2D and 3D through combined egocentric and exocentric techniques based on the Light-Sense and UMAR frameworks. LUMAR differs from most other spatial display systems based on mobile phones in its three-layered information space. The hybrid spatial display system consists of printed matter that is augmented with context-sensitive, dynamic 2D media when the device is on the surface, and with overlaid 3D visualizations when it is held in mid-air.

Place, publisher, year, edition, pages
Los Alamitos, CA, USA: IEEE Computer Society Press, 2007
Keywords
spatially aware, portable, mobile, handheld, cell, phone, augmented reality, mixed reality, ubiquitous
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:liu:diva-12750 (URN); 10.1109/ICAT.2007.13 (DOI)
Conference
17th International Conference on Artificial Reality and Telexistence (ICAT 2007), Esbjerg, Denmark, 2007
Available from: 2007-11-20 Created: 2007-11-20 Last updated: 2018-03-05
Henrysson, A. & Billinghurst, M. (2007). Using a Mobile Phone for 6DOF Mesh Editing. In: Proceedings of the 7th ACM SIGCHI New Zealand Chapter's International Conference on Computer-Human Interaction: Design Centered HCI (pp. 9-16).
Using a Mobile Phone for 6DOF Mesh Editing
2007 (English) In: Proceedings of the 7th ACM SIGCHI New Zealand Chapter's International Conference on Computer-Human Interaction: Design Centered HCI, 2007, p. 9-16. Chapter in book (Other academic)
Abstract [en]

This paper describes how a mobile phone can be used as a six-degree-of-freedom interaction device for 3D mesh editing. Using a video see-through Augmented Reality approach, the mobile phone meets several design guidelines for a natural, easy-to-learn 3D human-computer interaction device. We have developed a system that allows a user to select one or more vertices in an arbitrarily sized polygon mesh and freely translate and rotate them by translating and rotating the device itself. The mesh is registered in 3D and viewed through the device, and hence the system provides a unified perception-action space. We present the implementation details and discuss the possible advantages and disadvantages of this approach.
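The core manipulation step can be sketched roughly as follows (a C++ illustration under assumed types and names, not the paper's implementation): the relative device pose between two tracked frames is applied as a rigid transform to the selected vertices.

#include <array>
#include <cstddef>
#include <vector>

// Homogeneous 4x4 transform, row-major. Assumed layout for this sketch.
using Mat4 = std::array<std::array<float, 4>, 4>;
struct Vec3 { float x, y, z; };

// Apply a rigid transform (rotation + translation) to a point.
Vec3 transform(const Mat4& m, const Vec3& p) {
    return {
        m[0][0]*p.x + m[0][1]*p.y + m[0][2]*p.z + m[0][3],
        m[1][0]*p.x + m[1][1]*p.y + m[1][2]*p.z + m[1][3],
        m[2][0]*p.x + m[2][1]*p.y + m[2][2]*p.z + m[2][3],
    };
}

// Move the selected vertices by the change in device pose between two frames.
// 'delta' would be currentPose * inverse(previousPose), as reported by the tracker.
void applyDevicePoseDelta(std::vector<Vec3>& vertices,
                          const std::vector<std::size_t>& selected,
                          const Mat4& delta) {
    for (std::size_t idx : selected) {
        vertices[idx] = transform(delta, vertices[idx]);
    }
}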

Keywords
3D interfaces, content creation, mobile computer graphics, mobile phone augmented reality
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:liu:diva-12747 (URN); 10.1145/1278960.1278962 (DOI); 1-59593-473-1 (ISBN)
Available from: 2007-11-20 Created: 2007-11-20 Last updated: 2018-01-13. Bibliographically approved
Rauhala, M., Gunnarsson, A.-S., Henrysson, A. & Ynnerman, A. (2006). A Novel Interface to Sensor Networks using Handheld Augmented Reality. In: Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services, Espoo, Finland (pp. 145-148).
A Novel Interface to Sensor Networks using Handheld Augmented Reality
2006 (English) In: Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services, Espoo, Finland, 2006, p. 145-148. Conference paper, Published paper (Other academic)
Abstract [en]

Augmented Reality technology enables a mobile phone to be used as an x-ray tool, visualizing structures and states not visible to the naked eye. In this paper we evaluate a set of techniques used to augment the world with a visualization of data from a sensor network. Combining virtual and real information introduces challenges, as information from the two domains might interfere. We have applied our system to humidity data and present a user study together with feedback from domain experts. The prototype system can be seen as a first step towards a novel tool for the inspection of building elements.

Keywords
Algorithms, Design, Human Factors, Measurement, intelligent environments, mobile phone augmented reality, sensor networks, visualization
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-12749 (URN); 10.1145/1152215.1152245 (DOI)
Available from: 2007-11-20 Created: 2007-11-20 Last updated: 2015-09-22
Henrysson, A., Billinghurst, M. & Ollila, M. (2006). AR Tennis. In: ACM SIGGRAPH 2006 Sketches, 2006 (pp. 13). New York, NY, USA: ACM Press
AR Tennis
2006 (English) In: ACM SIGGRAPH 2006 Sketches, New York, NY, USA: ACM Press, 2006, p. 13. Conference paper, Published paper (Refereed)

Place, publisher, year, edition, pages
New York, NY, USA: ACM Press, 2006
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-36174 (URN); 30385 (Local ID); 30385 (Archive number); 30385 (OAI)
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2011-01-04
Andel, M., Petrovski, A., Henrysson, A. & Ollila, M. (2006). Interactive Collaborative Scene Assembly Using AR on Mobile Phones. In: Artificial Reality and Telexistence, ICAT (pp. 1008-1017). Springer
Interactive Collaborative Scene Assembly Using AR on Mobile Phones
2006 (English) In: Artificial Reality and Telexistence, ICAT, Springer, 2006, p. 1008-1017. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present and evaluate a platform for interactive, collaborative, face-to-face Augmented Reality using a distributed scene graph on mobile phones. The results of individual actions are viewed in real-time on the screen of every connected phone. We show how multiple collaborators can use consumer camera phones to furnish a room together in an Augmented Reality environment. We also present a user case study investigating how untrained users adopt this novel technology and studying the collaboration between multiple users. The platform is entirely independent of a PC server, though a PC client can be connected for high-quality visualization on a big-screen device such as a projector or a plasma display.
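The update propagation behind such a distributed scene graph can be sketched as follows (hypothetical C++ types and names; the paper does not detail its message format or transport, so both are assumptions here): each phone sends the changed node's transform to its peers, which apply it to their local copy of the scene graph.

#include <array>
#include <cstdint>
#include <functional>
#include <unordered_map>

// Hypothetical update message: which scene-graph node changed, and its new pose.
struct NodeUpdate {
    uint32_t nodeId;
    std::array<float, 16> pose;  // 4x4 transform, row-major
};

// Local copy of the shared scene graph: node id -> pose.
using SceneGraph = std::unordered_map<uint32_t, std::array<float, 16>>;

// Apply an update received from a peer to the local scene graph.
void applyRemoteUpdate(SceneGraph& scene, const NodeUpdate& u) {
    scene[u.nodeId] = u.pose;
}

// After a local edit, hand the update to whatever transport links the phones
// (the transport is not specified here); 'send' is a stand-in for it.
void publishLocalEdit(const NodeUpdate& u,
                      const std::function<void(const NodeUpdate&)>& send) {
    send(u);  // each connected phone calls applyRemoteUpdate on receipt
}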

Place, publisher, year, edition, pages
Springer, 2006
Series
Lecture Notes in Computer Science, ISSN 1611-3349 ; 4282
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-12748 (URN); 10.1007/11941354_104 (DOI)
Available from: 2007-11-20 Created: 2007-11-20 Last updated: 2009-04-22
Henrysson, A., Ollila, M. & Billinghurst, M. (2006). Mobile Phone Based Augmented Reality (1st ed.). In: Michael Haller, Mark Billinghurst, and Bruce Thomas (Eds.), Emerging Technologies of Augmented Reality: Interfaces and Design (pp. 90-109). Hershey, PA, USA: Idea Group Publishing
Mobile Phone Based Augmented Reality
2006 (English) In: Emerging Technologies of Augmented Reality: Interfaces and Design / [ed] Michael Haller, Mark Billinghurst, and Bruce Thomas, Hershey, PA, USA: Idea Group Publishing, 2006, 1, p. 90-109. Chapter in book (Other academic)
Abstract [en]

Although the field of mixed reality has grown significantly over the last decade, there have been few published books about augmented reality, particularly its interface design aspects. Emerging Technologies of Augmented Reality: Interfaces and Design provides a foundation of the main concepts of augmented reality (AR), with a particular emphasis on user interfaces, design, and practical AR techniques, from tracking algorithms to design principles for AR interfaces. The book contains comprehensive information on the following topics: technologies that support AR, development environments, interface design and evaluation of applications, and case studies of AR applications.

Place, publisher, year, edition, pages
Hershey, PA, USA: Idea Group Publishing, 2006, Edition: 1
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-36167 (URN); 30305 (Local ID); 1-599-04-066-2 (ISBN); 978-1-599-04066-0 (ISBN); 30305 (Archive number); 30305 (OAI)
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2013-09-09. Bibliographically approved
Billinghurst, M. & Henrysson, A. (2006). Research Directions in Handheld AR. International journal of virtual reality, 5(2), 51-58
Research Directions in Handheld AR
2006 (English) In: International Journal of Virtual Reality, ISSN 1081-1451, Vol. 5, no. 2, p. 51-58. Article in journal (Refereed), Published
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-37643 (URN); 37046 (Local ID); 37046 (Archive number); 37046 (OAI)
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2011-01-04