Collaborative 3D Reconstruction Using Heterogeneous UAVs: System and Experiments
ETH, Switzerland.
ETH, Switzerland.
Linköping University, Department of Computer and Information Science, Artificial Intelligence and Integrated Computer Systems. Linköping University, Faculty of Science & Engineering.
Linköping University, Department of Computer and Information Science, Artificial Intelligence and Integrated Computer Systems. Linköping University, Faculty of Science & Engineering.
2017 (English). In: 2016 International Symposium on Experimental Robotics, Springer International Publishing AG, 2017, Vol. 1, p. 43-56. Conference paper, Published paper (Refereed)
Abstract [en]

This paper demonstrates how a heterogeneous fleet of unmanned aerial vehicles (UAVs) can support human operators in search and rescue (SaR) scenarios. We describe a fully autonomous delegation framework that interprets the top-level commands of the rescue team and converts them into actions of the UAVs. In particular, the UAVs are requested to autonomously scan a search area and to provide the operator with a consistent georeferenced 3D reconstruction of the environment to increase environmental awareness and to support critical decision-making. The mission is executed based on the individual platform and sensor capabilities of rotary- and fixed-wing UAVs (RW-UAV and FW-UAV, respectively): with the aid of an optical camera, the FW-UAV can generate a sparse point-cloud of a large area in a short amount of time, while a LiDAR mounted on the autonomous helicopter is used to refine the visual point-cloud by generating denser point-clouds of specific areas of interest. In this context, we evaluate the performance of point-cloud registration methods to align two maps that were obtained by different sensors. In our validation, we compare classical point-cloud alignment methods to a novel probabilistic data association approach that specifically takes the individual point-cloud densities into consideration.
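Note on the registration step (illustrative, not part of the record): the "classical point-cloud alignment methods" mentioned in the abstract typically include iterative closest point (ICP) style techniques. The sketch below is a minimal, generic point-to-point ICP in Python (NumPy/SciPy) for aligning a sparse camera-derived cloud to a denser LiDAR cloud under the assumption that the two clouds are roughly pre-aligned (e.g., by georeferencing). It is not the authors' implementation; all variable names, thresholds, and iteration counts are hypothetical.

```python
# Illustrative sketch only: a minimal point-to-point ICP baseline for aligning
# a sparse (camera-derived) point cloud to a denser (LiDAR-derived) one.
# Generic textbook method, NOT the registration pipeline evaluated in the paper;
# all names and parameters are hypothetical.
import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # avoid a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def icp(source, target, max_iter=50, tol=1e-6):
    """Align `source` (N x 3) to `target` (M x 3); returns aligned points, R, t."""
    tree = cKDTree(target)
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        dist, idx = tree.query(src)                 # hard nearest-neighbour correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t                         # apply incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:               # stop when the mean residual settles
            break
        prev_err = err
    return src, R_total, t_total
```

A probabilistic data-association approach of the kind compared in the paper would, broadly speaking, replace the hard nearest-neighbour correspondences above with soft, density-aware associations, which is what makes it better suited to clouds of very different densities.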

Place, publisher, year, edition, pages
Springer International Publishing AG, 2017. Vol. 1, p. 43-56
Series
Springer Proceedings in Advanced Robotics, ISSN 2511-1256
Keywords [en]
Collaborative UAV mapping missions; Point-cloud generation; Vision-laser point-cloud alignment; Delegation of heterogeneous agents
National Category
Robotics
Identifiers
URN: urn:nbn:se:liu:diva-144574
DOI: 10.1007/978-3-319-50115-4_5
ISI: 000418796400005
ISBN: 978-3-319-50115-4 (print)
ISBN: 978-3-319-50114-7 (electronic)
OAI: oai:DiVA.org:liu-144574
DiVA, id: diva2:1178264
Conference
15th International Symposium on Experimental Robotics (ISER)
Note

Funding agencies: European Commission's Seventh Framework Programme (FP7) [285417, 600958]

Available from: 2018-01-29. Created: 2018-01-29. Last updated: 2018-01-29

Open Access in DiVA

No full text in DiVA

Search in DiVA

By author/editor
Conte, Gianpaolo; Doherty, Patrick; Rudol, Piotr; Wzorek, Mariusz
By organisation
Artificial Intelligence and Integrated Computer Systems; Faculty of Science & Engineering
On the subject
Robotics
