Sensor Fusion and Calibration of Inertial Sensors, Vision, Ultra-Wideband and GPS
2011 (English). Doctoral thesis, monograph (Other academic)
The use of inertial sensors has traditionally been confined primarily to the aviation and marine industries due to their cost and bulkiness. During the last decade, however, inertial sensors have undergone a dramatic reduction in both size and cost with the introduction of MEMS technology. As a result of this trend, inertial sensors have become commonplace in many applications and can even be found in many consumer products, for instance smartphones, cameras and game consoles. Due to the drift inherent in inertial technology, inertial sensors are typically used in combination with aiding sensors to stabilize and improve the estimates. The need for aiding sensors becomes even more apparent given the reduced accuracy of MEMS inertial sensors.
This thesis discusses two problems related to using inertial sensors in combination with aiding sensors. The first is the problem of sensor fusion: how to combine the information obtained from the different sensors to obtain a good estimate of position and orientation. The second problem, a prerequisite for sensor fusion, is that of calibration: the sensors themselves have to be calibrated and provide measurements in known physical units. Furthermore, whenever multiple sensors are combined, additional calibration issues arise, since the measurements are seldom acquired at the same physical location or expressed in a common coordinate frame. Sensor fusion and calibration are discussed for the combination of inertial sensors with cameras, UWB or GPS.
Two setups for estimating position and orientation in real-time are presented in this thesis. The first uses inertial sensors in combination with a camera; the second combines inertial sensors with UWB. Tightly coupled sensor fusion algorithms and experiments with performance evaluation are provided. Furthermore, this thesis contains ideas on using an optimization based sensor fusion method for a multi-segment inertial tracking system used for human motion capture as well as a sensor fusion method for combining inertial sensors with a dual GPS receiver.
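The fusion principle described above — inertial dead reckoning corrected by lower-rate absolute measurements — can be illustrated with a minimal sketch. The following is a simplified one-dimensional Kalman filter, not the tightly coupled algorithms developed in the thesis; the sample rates, noise levels and bias value are hypothetical.

```python
import numpy as np

# 1-D sketch: integrate a biased accelerometer (IMU) and correct the
# resulting drift with occasional absolute position fixes (e.g. UWB).
dt = 0.01                                  # IMU sample period [s], assumed
F = np.array([[1.0, dt], [0.0, 1.0]])      # state: [position, velocity]
B = np.array([0.5 * dt**2, dt])            # accelerometer input model
H = np.array([[1.0, 0.0]])                 # aiding sensor measures position
Q = 1e-4 * np.eye(2)                       # process (IMU) noise covariance
R = np.array([[1e-2]])                     # aiding measurement noise

def predict(x, P, acc):
    """Time update: integrate the accelerometer reading."""
    x = F @ x + B * acc
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, pos_meas):
    """Measurement update: correct drift with an absolute position fix."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (np.array([pos_meas]) - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
x_est, P = np.zeros(2), np.eye(2)
true_pos, true_vel, acc_bias = 0.0, 0.0, 0.05   # constant bias, assumed

for k in range(1000):                       # 10 s of data at 100 Hz
    true_acc = 1.0 if k < 500 else -1.0     # accelerate, then decelerate
    true_pos += true_vel * dt + 0.5 * true_acc * dt**2
    true_vel += true_acc * dt
    acc_meas = true_acc + acc_bias + 0.01 * rng.normal()
    x_est, P = predict(x_est, P, acc_meas)
    if (k + 1) % 100 == 0:                  # 1 Hz aiding, like UWB/vision
        x_est, P = update(x_est, P, true_pos + 0.05 * rng.normal())
```

Without the 1 Hz position updates, the accelerometer bias alone would produce a position error of roughly 0.5 · 0.05 · 10² = 2.5 m after ten seconds; with aiding, the error stays bounded at the aiding-sensor noise level.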
The above sensor fusion applications give rise to a number of calibration problems. Novel and easy-to-use calibration algorithms have been developed and tested to determine the following parameters: the magnetic field distortion when an IMU containing magnetometers is mounted close to a ferromagnetic object, the relative position and orientation of a rigidly connected camera and IMU, as well as the clock parameters and receiver positions of an indoor UWB positioning system.
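To give a flavor of the magnetometer calibration problem: in a homogeneous magnetic field, undistorted magnetometer readings lie on a sphere centered at the origin, and a nearby ferromagnetic object shifts (and, in general, deforms) that surface. The sketch below recovers a pure offset ("hard-iron" distortion) with a linear least-squares sphere fit on synthetic data; it is a simplified ingredient of magnetometer calibration in general, not the thesis's method, and all numbers are made up for illustration.

```python
import numpy as np

def fit_sphere(m):
    """Fit a sphere to samples m (N, 3); return (centre, radius).

    Uses the linear relation |m - c|^2 = r^2, rewritten as
    2 c . m + (r^2 - |c|^2) = |m|^2, solved by least squares.
    """
    A = np.hstack([2.0 * m, np.ones((m.shape[0], 1))])
    b = np.sum(m**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = sol[:3]
    radius = np.sqrt(sol[3] + centre @ centre)
    return centre, radius

# Synthetic magnetometer data: unit-less field strength 0.5, with a
# hard-iron offset from a nearby ferromagnetic object (assumed values).
rng = np.random.default_rng(1)
true_offset = np.array([0.2, -0.1, 0.05])
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # random orientations
samples = true_offset + 0.5 * dirs + 0.005 * rng.normal(size=(500, 3))

centre, radius = fit_sphere(samples)
```

Subtracting the estimated `centre` from raw readings restores the offset-free sphere; handling the full distortion (soft iron, scale factors) requires an ellipsoid model instead of a sphere.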
Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. 143 p.
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1368
Keywords
Sensor fusion, calibration, inertial sensors, vision, UWB, GPS
National Category
Control Engineering
Identifiers
URN: urn:nbn:se:liu:diva-66184
ISBN: 978-91-7393-197-7
OAI: oai:DiVA.org:liu-66184
DiVA: diva2:417835
Public defence
2011-06-17, Visionen, Hus B, Campus Valla, Linköpings universitet, Linköping, 10:15 (English)
Opponent
Bernhardsson, Bo, Professor
Supervisors
Schön, Thomas, Dr.
Gustafsson, Fredrik, Professor
Projects
MATRIS (Markerless real-time Tracking for Augmented Reality Image), a Sixth Framework Programme project funded by the European Union
CADICS (Control, Autonomy, and Decision-making in Complex Systems), a Linnaeus Center funded by the Swedish Research Council (VR)
Strategic Research Center MOVIII, funded by the Swedish Foundation for Strategic Research (SSF)