Navigation and Mapping for Aerial Vehicles Based on Inertial and Imaging Sensors
Sjanic, Zoran
Linköping University, The Institute of Technology. Linköping University, Department of Electrical Engineering, Automatic Control.
2013 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Small and medium-sized Unmanned Aerial Vehicles (UAVs) are today used in military missions, and will in the future find many new application areas, such as surveillance for exploration and security. To enable all these foreseen applications, the UAVs have to be cheap and of low weight, which restricts the sensors that can be used for navigation and surveillance. This thesis investigates several aspects of how fusion of navigation and imaging sensors can improve both tasks to a level that, with the traditional approach of separating the navigation system from the applications, would require much more expensive sensors. The core idea is that vision sensors can support the navigation system by providing odometric information about the motion, while the navigation system can support the vision algorithms, used to map the surrounding environment, making them more efficient. The unified framework for this kind of approach is called Simultaneous Localisation and Mapping (SLAM), and it is applied here to inertial sensors, radar and an optical camera.

Synthetic Aperture Radar (SAR) uses a radar and the motion of the UAV to produce an image of the microwave reflectivity of the ground. SAR images are a good complement to optical images, giving an all-weather surveillance capability, but they require an accurate navigation system to be focused, which is not the case with typical UAV sensors. However, by using the inertial sensors, measuring the UAV's motion, and information from the SAR images, measuring how image quality depends on the UAV's motion, both higher navigation accuracy and, consequently, more focused images can be obtained. The fusion of these sensors can be performed in both batch and sequential form. For the first approach, we propose an optimisation formulation of the navigation and focusing problem, while the second one results in a filtering approach. In the optimisation method, the focus of the processed SAR images is measured with the image entropy and with an image-matching approach, where SAR images are matched to a map of the area. In the proposed filtering method, the motion information is estimated from the raw radar data; it corresponds to the time derivative of the range between the UAV and the imaged scene, which can be related to the motion of the UAV.
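As a concrete illustration of the entropy focus measure mentioned above, the sketch below computes the entropy of a SAR magnitude image: the sharper the image, the more concentrated the intensity distribution and the lower the entropy. This is a minimal sketch of the general idea, not the thesis implementation; the function name and the synthetic point-target images are invented for illustration.

```python
import numpy as np

def image_entropy(img):
    """Entropy of a SAR image; lower entropy indicates better focus."""
    p = np.abs(img) ** 2          # pixel intensities
    p = p / p.sum()               # normalise to a probability distribution
    p = p[p > 0]                  # drop zero bins to avoid log(0)
    return -np.sum(p * np.log(p))

# A focused point target concentrates energy in one pixel; a motion-smeared
# one spreads it along track, which raises the entropy.
focused = np.zeros((32, 32)); focused[16, 16] = 1.0
smeared = np.zeros((32, 32)); smeared[16, 10:22] = 1.0 / 12
```

An autofocus scheme in this spirit would search over navigation-error parameters, re-form the image for each candidate, and keep the parameters that minimise the entropy.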

Another imaging sensor that has been exploited in this framework is an ordinary optical camera. Similar to the SAR case, camera images and inertial sensors can be used to support the navigation estimate and simultaneously build a three-dimensional map of the observed environment, so-called inertial/visual SLAM. Also here, the problem is posed in an optimisation framework, leading to a batch Maximum Likelihood (ML) estimate of the navigation parameters and the map. The ML problem is solved both in the straightforward way, resulting in a nonlinear least-squares problem where both the map and the navigation parameters are considered as parameters, and with the Expectation-Maximisation (EM) approach. In the EM approach, all unknown variables are split into two sets, hidden variables and actual parameters; here, the map is considered as parameters and the navigation states are seen as hidden variables. This split makes the total problem computationally cheaper to solve than the original ML formulation. Both optimisation problems mentioned above are nonlinear and non-convex, requiring a good initial solution in order to obtain a good parameter estimate. For this purpose, a method for initialisation of inertial/visual SLAM is devised, where the conditionally linear structure of the problem is used to obtain the initial estimate of the parameters. The benefits and performance improvements of the methods are illustrated on both simulated and real data.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2013. 64 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1533
National Category
Control Engineering
Identifiers
URN: urn:nbn:se:liu:diva-97317
DOI: 10.3384/diss.diva-97317
ISBN: 978-91-7519-553-7 (print)
OAI: oai:DiVA.org:liu-97317
DiVA: diva2:646587
Public defence
2013-10-18, Visionen, B-huset, Campus Valla, Linköpings universitet, Linköping, 10:15 (English)
Available from: 2013-09-30 Created: 2013-09-09 Last updated: 2017-01-19. Bibliographically approved
List of papers
1. Simultaneous Navigation and Synthetic Aperture Radar Focusing
2013 (English) Report (Other academic)
Abstract [en]

Synthetic Aperture Radar (SAR) equipment is a radar imaging system that can be used to create high-resolution images of a scene by utilising the movement of a flying platform. Knowledge of the platform's trajectory is essential for good and focused images. An emerging application field is real-time SAR imaging using small and cheap platforms with poorer navigation systems, implying unfocused images. This contribution investigates joint estimation of the trajectory and the SAR image.

Publisher
14 p.
Series
LiTH-ISY-R, ISSN 1400-3902 ; 3063
Keyword
Optimisation, navigation, Synthetic Aperture Radar, auto-focusing
National Category
Signal Processing
Identifiers
urn:nbn:se:liu:diva-93925 (URN)
LiTH-ISY-R-3063 (ISRN)
Available from: 2013-06-12 Created: 2013-06-12 Last updated: 2014-09-16. Bibliographically approved
2. Navigation and SAR focusing with Map Aiding
2015 (English) In: IEEE Transactions on Aerospace and Electronic Systems, ISSN 0018-9251, E-ISSN 1557-9603, Vol. 51, no. 3, pp. 1652-1663. Article in journal (Refereed), Published
Abstract [en]

A method for fusing Synthetic Aperture Radar (SAR) images with optical aerial images is presented. This is done in a navigation framework, where the absolute position and orientation of the flying platform, as computed from the inertial navigation system, are corrected based on the aerial image coordinates taken as ground truth. The method is suitable for new low-price SAR systems for small unmanned vehicles. The primary application is remote sensing, where the SAR image provides one further "colour" channel revealing reflectivity to radio waves. The method is based on first applying an edge-detection algorithm to the images and then optimising the most important navigation states by matching the two binary images. To get a measure of the estimation uncertainty, we embed the optimisation in a least-squares framework, where an explicit method to estimate the (relative) size of the errors is presented. The performance is demonstrated on real SAR and aerial images, leading to an error of only a few pixels.

Place, publisher, year, edition, pages
IEEE Press, 2015
Keyword
Optimisation, navigation, Synthetic Aperture Radar, image matching, auto-focusing
National Category
Signal Processing
Identifiers
urn:nbn:se:liu:diva-97280 (URN)
10.1109/TAES.2015.130397 (DOI)
000362015800006 ()
Note

Funding text: Industry Excellence Center, Linkoping Center for Sensor Informatics and Control (LINK-SIC)

At the time of the public defence, this publication existed only as a manuscript.

Available from: 2013-09-05 Created: 2013-09-05 Last updated: 2017-12-06
3. Navigation and SAR Auto-focusing Based on the Phase Gradient Approach
2011 (English) In: Proceedings of the 14th International Conference on Information Fusion (FUSION), 2011, IEEE conference proceedings, 2011, pp. 1-8. Conference paper, Published paper (Refereed)
Abstract [en]

Synthetic Aperture Radar (SAR) equipment is an all-weather radar imaging system that can be used to create high-resolution images of a scene by utilising the movement of the flying platform. It is therefore essential to accurately estimate the platform's trajectory in order to get good and focused images. Recently, both real-time applications and smaller and cheaper platforms have been considered. This, in turn, leads to unfocused images, since cheaper platforms in general have navigation systems with poorer performance. At the same time, the radar data contain information about the platform's motion that can be used to estimate the trajectory and get more focused images. Here, a method utilising the phase gradient of the SAR data in a sensor fusion framework is presented. The method is illustrated on a simulated example with promising results, and the paper concludes with a discussion of the obtained results and future work.

Place, publisher, year, edition, pages
IEEE conference proceedings, 2011
Keyword
Extended Kalman filtering, navigation, Synthetic Aperture Radar, auto-focusing, phase gradient
National Category
Signal Processing; Control Engineering
Identifiers
urn:nbn:se:liu:diva-68151 (URN)
978-1-4577-0267-9 (ISBN)
Conference
14th International Conference on Information Fusion, Chicago, IL, USA, 5-8 July, 2011
Projects
LINK-SIC
Available from: 2011-05-12 Created: 2011-05-12 Last updated: 2013-09-30
4. A Nonlinear Least-Squares Approach to the SLAM Problem
2011 (English) In: Proceedings of the 18th IFAC World Congress, 2011: World Congress, Volume 18, Part 1 / [ed] Sergio Bittanti, Angelo Cenedese and Sandro Zampieri, IFAC Papers Online, 2011, pp. 4759-4764. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper we present a solution to the simultaneous localisation and mapping (SLAM) problem using a camera and inertial sensors. Our approach is based on the maximum a posteriori (MAP) estimate of the complete SLAM problem. The resulting problem is posed in a nonlinear least-squares framework which we solve with the Gauss-Newton method. The proposed algorithm is evaluated on experimental data using a sensor platform mounted on an industrial robot. In this way, accurate ground truth is available, and the results are encouraging.
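The flavour of the nonlinear least-squares approach can be shown on a toy one-dimensional SLAM problem with two poses and one landmark, solved with Gauss-Newton and a numeric Jacobian. All names and measurement values below are invented for illustration; the paper's actual problem involves six-degree-of-freedom poses, camera measurements and a much larger parameter vector.

```python
import numpy as np

def gauss_newton(residual, theta0, iters=20):
    """Minimise ||residual(theta)||^2 with Gauss-Newton, forward-difference Jacobian."""
    theta = np.asarray(theta0, dtype=float)
    eps = 1e-6
    for _ in range(iters):
        r = residual(theta)
        J = np.column_stack([
            (residual(theta + eps * e) - r) / eps   # numeric Jacobian column
            for e in np.eye(theta.size)
        ])
        theta = theta - np.linalg.lstsq(J, r, rcond=None)[0]  # GN step
    return theta

# Toy 1D SLAM: poses p1, p2 and landmark m; odometry u and ranges z1, z2.
u, z1, z2 = 1.0, 3.0, 2.0    # invented, mutually consistent measurements

def residual(theta):
    p1, p2, m = theta
    return np.array([p1 - 0.0,       # prior: first pose at the origin
                     (p2 - p1) - u,  # odometry residual
                     (m - p1) - z1,  # landmark observed from p1
                     (m - p2) - z2]) # landmark observed from p2

est = gauss_newton(residual, np.zeros(3))  # converges to p1=0, p2=1, m=3
```

Because the toy residuals are linear, Gauss-Newton converges in a single step here; the real problem is nonlinear, which is why the paper's evaluation against industrial-robot ground truth matters.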

Place, publisher, year, edition, pages
IFAC Papers Online, 2011
Keyword
Inertial measurement units, Cameras, Smoothing, Dynamic systems, State estimation
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-68857 (URN)
10.3182/20110828-6-IT-1002.02042 (DOI)
978-3-902661-93-7 (ISBN)
Conference
The 18th IFAC World Congress, August 28 - September 2, 2011, Milano, Italy
Available from: 2011-06-08 Created: 2011-06-08 Last updated: 2016-05-03. Bibliographically approved
5. Initialisation and Estimation Methods for Batch Optimisation of Inertial/Visual SLAM
2013 (English) Report (Other academic)
Abstract [en]

Simultaneous Localisation and Mapping (SLAM) denotes the problem of jointly localising a moving platform and mapping the environment. This work studies the SLAM problem using a combination of inertial sensors, measuring the platform's accelerations and angular velocities, and a monocular camera observing the environment. We formulate the SLAM problem in a nonlinear least squares (NLS) batch form, whose solution provides a smoothed estimate of the motion and map. The NLS problem is highly nonconvex in practice, so a good initial estimate is required. We propose a multi-stage iterative procedure that utilises the fact that the SLAM problem is linear if the platform's rotations are known. The map is initialised with camera feature detections only, by utilising feature tracking and clustering of feature tracks. In this way, loop closures are automatically detected. The initialisation method and subsequent NLS refinement are demonstrated on both simulated and real data.
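The conditionally linear structure the procedure exploits can be illustrated with landmark triangulation: once the platform's rotations are known, camera bearings can be expressed in the world frame, and the landmark position becomes the solution of a linear least-squares problem. The sketch below is only a caricature of this observation (function and variable names invented, bearings noise-free), not the report's algorithm.

```python
import numpy as np

def triangulate(positions, directions):
    """Least-squares landmark position from known camera positions and
    world-frame bearing directions (linear once rotations are known)."""
    A, b = [], []
    for p, d in zip(positions, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A.append(P)                      # residual P @ (m - p) = 0 is linear in m
        b.append(P @ p)
    A, b = np.vstack(A), np.concatenate(b)
    return np.linalg.lstsq(A, b, rcond=None)[0]

m_true = np.array([1.0, 2.0, 5.0])                       # invented landmark
ps = [np.zeros(3), np.array([2.0, 0, 0]), np.array([0.0, 3, 0])]
ds = [m_true - p for p in ps]                            # noise-free bearings
est = triangulate(ps, ds)
```

With rotations treated as unknowns the same constraints become products of rotation and position terms, which is exactly what makes the full problem nonlinear.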

Publisher
15 p.
Series
LiTH-ISY-R, ISSN 1400-3902 ; 3065
Keyword
Simultaneous localisation and mapping, optimisation, inertial measurement unit, monocular camera
National Category
Signal Processing
Identifiers
urn:nbn:se:liu:diva-97278 (URN)
LiTH-ISY-R-3065 (ISRN)
Available from: 2013-09-09 Created: 2013-09-05 Last updated: 2017-01-19. Bibliographically approved
6. EM-SLAM with Inertial/Visual Applications
2017 (English) In: IEEE Transactions on Aerospace and Electronic Systems, ISSN 0018-9251, E-ISSN 1557-9603, Vol. 53, no. 1, pp. 273-285. Article in journal (Refereed), Published
Abstract [en]

The general Simultaneous Localisation and Mapping (SLAM) problem aims at estimating the state of a moving platform simultaneously with building a map of the local environment. There are essentially three classes of algorithms: EKF-SLAM and FastSLAM solve the problem on-line, while Nonlinear Least Squares (NLS) is a batch method. All of them scale badly with either the state dimension, the map dimension or the batch length. We investigate the EM algorithm for solving a generalised version of the NLS problem. This EM-SLAM algorithm solves two simpler problems iteratively, and hence scales much better with the dimensions. The iterations switch between state estimation, where we propose an extended Rauch-Tung-Striebel smoother, and map estimation, where a quasi-Newton method is suggested. The proposed method is evaluated in real experiments and in simulations on a platform with a monocular camera attached to an inertial measurement unit. It is demonstrated to produce lower RMSE than a standard Levenberg-Marquardt solver of the NLS problem, at a computational cost that increases considerably slower.
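The alternating structure of EM-SLAM can be caricatured with a scalar example: a platform drifting along a line observes one landmark, and the algorithm alternates between estimating the trajectory given the map and the map given the trajectory. This is only an analogue of the E- and M-steps (the actual algorithm uses an extended RTS smoother and quasi-Newton updates); all names, noise levels and the 50/50 blending weight are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, m_true = 50, 5.0
x_true = np.cumsum(rng.normal(0.1, 0.01, T))    # slowly drifting platform
u = np.diff(x_true, prepend=0.0)                # odometry increments (noise-free here)
z = m_true - x_true + rng.normal(0.0, 0.05, T)  # noisy ranges to the landmark

def em_slam(iters=10):
    m = 0.0                                     # initial map guess
    for _ in range(iters):
        # "E-step" analogue: trajectory given the map. A fixed blend of dead
        # reckoning and the range-implied positions stands in for the smoother.
        x = 0.5 * np.cumsum(u) + 0.5 * (m - z)
        # "M-step" analogue: map given the trajectory (closed form here,
        # quasi-Newton in the paper).
        m = np.mean(x + z)
    return x, m

x_est, m_est = em_slam()
```

Each sub-problem is cheap and low-dimensional, which is the point: the alternation avoids ever solving the joint trajectory-plus-map problem at once.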

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2017
Keyword
SLAM, Expectation-Maximisation, Sensor Fusion, Computer Vision, Inertial Sensors
National Category
Robotics
Identifiers
urn:nbn:se:liu:diva-110371 (URN)
10.1109/TAES.2017.2650118 (DOI)
000399934000022 ()
Note

Funding agencies: Vinnova Industry Excellence Center LINK-SIC

Available from: 2014-09-09 Created: 2014-09-09 Last updated: 2017-05-18. Bibliographically approved
7. Cellular Network Non-Line-of-Sight Reflector Localisation Based on Synthetic Aperture Radar Methods
2014 (English) In: IEEE Transactions on Antennas and Propagation, ISSN 0018-926X, E-ISSN 1558-2221, Vol. 62, no. 4, pp. 2284-2287. Article in journal (Refereed), Published
Abstract [en]

The dependence of radio signal propagation on the environment is well known, and both statistical and deterministic methods have been presented in the literature. Such methods are based on either randomised or actual reflectors of radio signals. In this work, we instead aim at estimating the location of the reflectors based on geo-localised radio channel impulse response measurements, using methods from synthetic aperture radar (SAR). Radio channel data measurements from 3GPP E-UTRAN have been used to verify the usefulness of the proposed approach. The obtained images show that the estimated reflectors are well correlated with the aerial map of the environment. Also, which part of the trajectory contributed to the different reflectors has been estimated, with promising results.

Place, publisher, year, edition, pages
IEEE Press, 2014
National Category
Signal Processing
Identifiers
urn:nbn:se:liu:diva-97279 (URN)10.1109/TAP.2014.2300531 (DOI)000334744700057 ()
Available from: 2013-09-09 Created: 2013-09-05 Last updated: 2017-12-06

Open Access in DiVA

Fulltext: FULLTEXT01.pdf (2427 kB)
Cover: COVER01.pdf (891 kB)

Author/editor
Sjanic, Zoran
Organisation
The Institute of Technology; Automatic Control
