Vision-based Pose Estimation for Autonomous Indoor Navigation of Micro-scale Unmanned Aircraft Systems
Linköping University, Department of Computer and Information Science, KPLAB - Knowledge Processing Lab. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, AUTTEK - Autonomous Unmanned Aerial Vehicle Research Group. Linköping University, The Institute of Technology.
Linköping University, Department of Computer and Information Science, KPLAB - Knowledge Processing Lab. Linköping University, The Institute of Technology.
2010 (English). In: Proceedings of the 2010 IEEE International Conference on Robotics and Automation (ICRA), IEEE conference proceedings, 2010, p. 1913-1920. Conference paper, Published paper (Refereed)
Abstract [en]

We present a navigation system for autonomous indoor flight of micro-scale Unmanned Aircraft Systems (UAS) which is based on a method for accurate monocular vision pose estimation. The method makes use of low cost artificial landmarks placed in the environment and allows for fully autonomous flight with all computation done on-board a UAS on COTS hardware. We provide a detailed description of all system components along with an accuracy evaluation and a time profiling result for the pose estimation method. Additionally, we show how the system is integrated with an existing micro-scale UAS and provide results of experimental autonomous flight tests. To our knowledge, this system is one of the first to allow for complete closed-loop control and goal-driven navigation of a micro-scale UAS in an indoor setting without requiring connection to any external entities.
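
The paper's own pose-estimation algorithm is not reproduced in this record. As a rough illustration of the underlying technique it builds on (recovering camera pose from an artificial landmark of known geometry), the following Python sketch uses OpenCV's solvePnP; the marker size, corner ordering, camera intrinsics and the corner-detection step are placeholder assumptions, not values from the paper.

```python
# Minimal sketch of monocular pose estimation from a square artificial
# landmark of known size. NOT the paper's implementation; marker size,
# intrinsics and the corner-detection step are illustrative placeholders.
import numpy as np
import cv2

MARKER_SIZE = 0.20  # assumed edge length in metres

# 3D corner positions of the landmark in its own frame, in the order
# required by SOLVEPNP_IPPE_SQUARE: top-left, top-right, bottom-right,
# bottom-left.
object_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float64)

# Camera intrinsics from a prior calibration (placeholder values).
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

def estimate_pose(image_points):
    """image_points: 4x2 array of detected marker corners (pixels),
    ordered consistently with object_points. Returns the rotation
    matrix and translation vector of the marker in the camera frame."""
    image_points = np.asarray(image_points, dtype=np.float64).reshape(-1, 1, 2)
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec
```

Given the marker's corners detected in an image, estimate_pose returns the landmark's rotation and translation in the camera frame, which a navigation filter could then fuse with other onboard measurements.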

Place, publisher, year, edition, pages
IEEE conference proceedings, 2010. p. 1913-1920
Series
Proceedings - IEEE International Conference on Robotics and Automation, ISSN 1050-4729 ; 2010
Keywords [en]
UAV, UAS, UAS indoor navigation
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:liu:diva-60006
DOI: 10.1109/ROBOT.2010.5509203
ISBN: 978-1-4244-5038-1 (print)
OAI: oai:DiVA.org:liu-60006
DiVA, id: diva2:354593
Conference
2010 IEEE International Conference on Robotics and Automation, May 3-8, Anchorage, Alaska, USA
Available from: 2010-10-04. Created: 2010-10-04. Last updated: 2018-01-12. Bibliographically approved.
In thesis
1. Increasing Autonomy of Unmanned Aircraft Systems Through the Use of Imaging Sensors
2011 (English). Licentiate thesis, monograph (Other academic)
Abstract [en]

The range of missions performed by Unmanned Aircraft Systems (UAS) has been growing steadily in the past decades thanks to continued development in several disciplines. The goal of increasing the autonomy of UASs is to widen the range of tasks that can be carried out without, or with only minimal, external help. This thesis presents methods for increasing specific aspects of the autonomy of UASs operating both in outdoor and indoor environments where cameras are used as the primary sensors.

First, a method for fusing color and thermal images for object detection, geolocation and tracking for UASs operating primarily outdoors is presented. Specifically, a method for building saliency maps in which human body locations are marked as points of interest is described. Such maps can be used in emergency situations to increase the situational awareness of first responders or of a robotic system itself. Additionally, the same method is applied to the problem of vehicle tracking. A generated stream of geographical locations of tracked vehicles increases situational awareness by allowing for qualitative reasoning about, for example, vehicles overtaking one another or entering and leaving crossings.
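
The thesis text above only summarizes the fusion method. A toy sketch of the general idea is given below: thermal hotspots gate a person detector run on the registered color image, and agreeing regions are marked on a saliency map. The threshold, blob-size gate and HOG pedestrian detector are illustrative assumptions, not the pipeline evaluated in the thesis.

```python
# Toy sketch of color/thermal fusion for human-body saliency: warm blobs in
# the thermal image are verified by a pedestrian detector in the color image.
# Thresholds and detector choice are placeholder assumptions.
import numpy as np
import cv2

def human_saliency_map(color_bgr, thermal_8bit):
    """Return a saliency map (same size as the inputs, values in [0, 1])
    with candidate human-body locations marked. Assumes the thermal image
    is already registered to the color image and scaled to 8-bit grayscale."""
    saliency = np.zeros(thermal_8bit.shape, dtype=np.float32)

    # 1. Threshold the thermal image to keep warm blobs (placeholder value).
    _, hot = cv2.threshold(thermal_8bit, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(hot, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    # 2. Verify each hotspot in the color image with a HOG person detector.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 100:                      # discard tiny blobs (placeholder gate)
            continue
        roi = color_bgr[max(0, y - h):y + 2 * h, max(0, x - w):x + 2 * w]
        if roi.shape[0] < 128 or roi.shape[1] < 64:   # smaller than HOG window
            continue
        rects, _ = hog.detectMultiScale(roi)
        if len(rects) > 0:                   # hotspot confirmed in the color image
            saliency[y:y + h, x:x + w] = 1.0
    return saliency
```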

Second, two approaches to the UAS indoor localization problem in the absence of GPS-based positioning are presented. Both use cameras as the main sensors and enable autonomous indoor flight and navigation. The first approach takes advantage of cooperation with a ground robot to provide a UAS with its localization information. The second approach uses marker-based visual pose estimation where all computations are done onboard a small-scale aircraft, which additionally increases its autonomy by not relying on external computational power.
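
Given a marker-relative pose such as the one returned by the earlier solvePnP sketch, the aircraft-mounted camera's global pose follows by composing homogeneous transforms with the landmark's known placement in the world frame. The minimal sketch below illustrates that composition; the map of landmark poses and the frame conventions are assumptions, not the thesis's implementation.

```python
# Sketch of turning a marker-relative pose into a world pose by composing
# homogeneous transforms. Illustration of the principle only; landmark
# placements and frame conventions are placeholder assumptions.
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).ravel()
    return T

def camera_pose_in_world(T_world_marker, R_cam_marker, t_cam_marker):
    """T_world_marker: known 4x4 pose of the landmark in the world frame
    (e.g. from a map of placed markers). R_cam_marker, t_cam_marker: the
    marker's pose in the camera frame, e.g. from solvePnP. Returns the
    camera's 4x4 pose in the world frame."""
    T_cam_marker = to_homogeneous(R_cam_marker, t_cam_marker)
    T_marker_cam = np.linalg.inv(T_cam_marker)   # invert: camera in marker frame
    return T_world_marker @ T_marker_cam          # compose: camera in world frame
```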

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. p. 96
Series
Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971 ; 1510
Keywords
UAV, UAS, UAV autonomy, human-body detection, color-thermal image fusion, vehicle tracking, geolocation, UAV indoor navigation
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:liu:diva-71295
Local ID: LiU-Tek-Lic-2011:49
ISBN: 9789173930345
Archive number: LiU-Tek-Lic-2011:49
OAI: LiU-Tek-Lic-2011:49
Presentation
2011-11-04, Alan Turing, Hus E, Campus Valla, Linköpings universitet, Linköping, 13:15 (English)
Opponent
Supervisors
Available from: 2011-11-28. Created: 2011-10-10. Last updated: 2021-11-01. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Authority records

Rudol, Piotr; Wzorek, Mariusz; Doherty, Patrick
