Hol, Jeroen Diederik
Publications (10 of 27)
Hol, J. D. (2011). Sensor Fusion and Calibration of Inertial Sensors, Vision, Ultra-Wideband and GPS. (Doctoral dissertation). Linköping: Linköping University Electronic Press
2011 (English). Doctoral thesis, monograph (Other academic).
Abstract [en]

The usage of inertial sensors has traditionally been confined primarily to the aviation and marine industry due to their associated cost and bulkiness. During the last decade, however, inertial sensors have undergone a rather dramatic reduction in both size and cost with the introduction of MEMS technology. As a result of this trend, inertial sensors have become commonplace for many applications and can even be found in many consumer products, for instance smartphones, cameras and game consoles. Due to the drift inherent in inertial technology, inertial sensors are typically used in combination with aiding sensors to stabilize and improve the estimates. The need for aiding sensors becomes even more apparent due to the reduced accuracy of MEMS inertial sensors.

This thesis discusses two problems related to using inertial sensors in combination with aiding sensors. The first is the problem of sensor fusion: how to combine the information obtained from the different sensors and obtain a good estimate of position and orientation. The second problem, a prerequisite for sensor fusion, is that of calibration: the sensors themselves have to be calibrated and provide measurements in known units. Furthermore, whenever multiple sensors are combined additional calibration issues arise, since the measurements are seldom acquired in the same physical location and expressed in a common coordinate frame. Sensor fusion and calibration are discussed for the combination of inertial sensors with cameras, UWB or GPS.

Two setups for estimating position and orientation in real-time are presented in this thesis. The first uses inertial sensors in combination with a camera; the second combines inertial sensors with UWB. Tightly coupled sensor fusion algorithms and experiments with performance evaluation are provided. Furthermore, this thesis contains ideas on using an optimization based sensor fusion method for a multi-segment inertial tracking system used for human motion capture as well as a sensor fusion method for combining inertial sensors with a dual GPS receiver.

The above sensor fusion applications give rise to a number of calibration problems. Novel and easy-to-use calibration algorithms have been developed and tested to determine the following parameters: the magnetic field distortion when an IMU containing magnetometers is mounted close to a ferromagnetic object, the relative position and orientation of a rigidly connected camera and IMU, as well as the clock parameters and receiver positions of an indoor UWB positioning system.
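
One of these calibration problems can be used to illustrate the general flavour of the algorithms. When an IMU containing magnetometers is mounted close to a ferromagnetic object, the raw magnetometer readings lie on a translated and distorted ellipsoid instead of a sphere, and calibration amounts to fitting that ellipsoid and inverting the distortion. The Python sketch below is a generic least-squares formulation with made-up names and placeholder data; it illustrates the idea but does not reproduce the maximum-likelihood algorithm developed in the thesis.

import numpy as np
from scipy.optimize import least_squares

def correct(m, params):
    # Upper-triangular distortion matrix A (6 parameters) followed by the bias b (3).
    A = np.array([[params[0], params[1], params[2]],
                  [0.0,       params[3], params[4]],
                  [0.0,       0.0,       params[5]]])
    b = params[6:9]
    return (A @ (m - b).T).T

def residuals(params, m_raw):
    # After correction, the measured field strength should be constant (normalised to 1).
    return np.linalg.norm(correct(m_raw, params), axis=1) - 1.0

# Placeholder data: unit-norm field directions shifted by a hard-iron bias.
rng = np.random.default_rng(0)
m_raw = rng.normal(size=(500, 3))
m_raw = m_raw / np.linalg.norm(m_raw, axis=1, keepdims=True) + np.array([0.2, -0.1, 0.05])

x0 = np.array([1, 0, 0, 1, 0, 1, 0, 0, 0], dtype=float)
sol = least_squares(residuals, x0, args=(m_raw,))
A_params, b_hat = sol.x[:6], sol.x[6:9]   # estimated distortion parameters and bias

In practice the data should cover many orientations of the IMU so that the ellipsoid is well constrained.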

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. p. 143
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1368
Keywords
Sensor fusion, calibration, inertial sensors, vision, UWB, GPS
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-66184 (URN); 978-91-7393-197-7 (ISBN)
Public defence
2011-06-17, Visionen, Hus B, Campus Valla, Linköpings universitet, Linköping, 10:15 (English)
Projects
MATRIS (Markerless real-time Tracking for Augmented Reality Image Synthesis), a Sixth Framework Programme project funded by the European Union; CADICS (Control, Autonomy, and Decision-making in Complex Systems), a Linnaeus Center funded by the Swedish Research Council (VR); Strategic Research Center MOVIII, funded by the Swedish Foundation for Strategic Research (SSF)
Available from: 2011-05-27 Created: 2011-03-07 Last updated: 2013-12-10. Bibliographically approved
Hol, J., Schön, T. & Gustafsson, F. (2010). Modeling and Calibration of Inertial and Vision Sensors. The international journal of robotics research, 29(2), 231-244
2010 (English). In: The international journal of robotics research, ISSN 0278-3649, E-ISSN 1741-3176, Vol. 29, no. 2, p. 231-244. Article in journal (Refereed). Published.
Abstract [en]

This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The method is based on a physical model which can also be used in solving, for example, sensor fusion problems. The experimental results show that the method works well in practice, both for perspective and spherical cameras.
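
Read as pseudocode, the gray-box idea amounts to the following: the relative camera-IMU pose is a parameter vector, the IMU-driven model supplies a pose prediction for each image, and the parameters are chosen to minimise the prediction errors of the detected checkerboard corners. The Python sketch below is a simplified batch least-squares surrogate with a plain pinhole model and illustrative names (project, prediction_errors, and the input structures); it is not the paper's exact prediction-error algorithm, which also covers spherical cameras.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_cam, focal=500.0, centre=(320.0, 240.0)):
    # Plain pinhole projection, used here only for illustration.
    x = focal * points_cam[:, 0] / points_cam[:, 2] + centre[0]
    y = focal * points_cam[:, 1] / points_cam[:, 2] + centre[1]
    return np.column_stack([x, y])

def prediction_errors(theta, imu_poses, corners_world, corners_meas):
    # theta: rotation vector (3) and translation (3) of the camera relative to the IMU.
    R_ci = Rotation.from_rotvec(theta[:3]).as_matrix()
    t_ci = theta[3:6]
    res = []
    for (R_iw, t_iw), meas in zip(imu_poses, corners_meas):
        # World -> IMU (pose predicted from inertial data) -> camera, then project.
        p_imu = corners_world @ R_iw.T + t_iw
        p_cam = p_imu @ R_ci.T + t_ci
        res.append((project(p_cam) - meas).ravel())
    return np.concatenate(res)

# imu_poses: list of (R, t) world-to-IMU poses, corners_world: M x 3 checkerboard
# corners, corners_meas: list of M x 2 detected pixel coordinates per image.
# theta_hat = least_squares(prediction_errors, np.zeros(6),
#                           args=(imu_poses, corners_world, corners_meas)).x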

Place, publisher, year, edition, pages
Sage Publications, 2010
Keywords
Vision sensors, Inertial sensors, Sensor fusion, Calibration, Gray-box system identification
National Category
Computer and Information Sciences; Control Engineering
Identifiers
urn:nbn:se:liu:diva-54230 (URN); 10.1177/0278364909356812 (DOI); 000275038200007
Available from: 2010-03-03 Created: 2010-03-03 Last updated: 2018-01-12
Hol, J. D., Schön, T. & Gustafsson, F. (2010). Ultra-Wideband Calibration for Indoor Positioning. In: Proceedings of the 2010 IEEE International Conference on Ultra-Wideband. Paper presented at the 2010 IEEE International Conference on Ultra-Wideband, Nanjing, China, 20-23 September 2010. Vol. 2.
2010 (English). In: Proceedings of the 2010 IEEE International Conference on Ultra-Wideband, 2010, Vol. 2. Conference paper, Published paper (Refereed).
Abstract [en]

The main contribution of this work is a novel calibration method to determine the clock parameters of the UWB receivers as well as their 3D positions. It exclusively uses time-of-arrival measurements, thereby removing the need for the typically labor-intensive and time-consuming process of surveying the receiver positions. Experiments show that the method is capable of accurately calibrating a UWB setup within minutes.
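
As a rough illustration of what time-of-arrival measurements alone can determine, the unknown receiver positions and clock offsets can be stacked together with the (equally unknown) pulse origins and transmit times into one parameter vector, and the arrival-time prediction errors minimised. The Python sketch below is a nonlinear least-squares surrogate with made-up names; the paper formulates a maximum-likelihood problem and also handles the remaining coordinate-frame and time-origin ambiguities, which this sketch ignores.

import numpy as np
from scipy.optimize import least_squares

C = 299792458.0  # speed of light [m/s]

def toa_residuals(x, toa, n_rx):
    # toa: K x n_rx matrix of measured arrival times (NaN where a pulse was missed).
    K = toa.shape[0]
    rx_pos = x[:3 * n_rx].reshape(n_rx, 3)                # receiver positions
    rx_clk = x[3 * n_rx:4 * n_rx]                         # receiver clock offsets
    tx_pos = x[4 * n_rx:4 * n_rx + 3 * K].reshape(K, 3)   # pulse origins
    tx_time = x[4 * n_rx + 3 * K:]                        # pulse transmit times
    pred = (tx_time[:, None]
            + np.linalg.norm(tx_pos[:, None, :] - rx_pos[None, :, :], axis=2) / C
            + rx_clk[None, :])
    return (pred - toa)[~np.isnan(toa)]

# With x0 stacking initial guesses for all unknowns, the estimate is
# least_squares(toa_residuals, x0, args=(toa, n_rx)).x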

Keywords
Calibration, Maximum likelihood estimation, Ultra-wideband, Indoor positioning
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-62901 (URN); 10.1109/ICUWB.2010.5616867 (DOI); 978-1-4244-5306-1 (ISBN); 978-1-4244-5305-4 (ISBN)
Conference
2010 IEEE International Conference on Ultra-Wideband, Nanjing, China, 20-23 September, 2010
Projects
CADICS
Funder
Swedish Research Council
Available from: 2010-12-07 Created: 2010-12-07 Last updated: 2013-07-09
Hol, J., Schön, T. & Gustafsson, F. (2009). Modeling and Calibration of Inertial and Vision Sensors. Linköping: Linköping University Electronic Press
2009 (English). Report (Other academic).
Abstract [en]

This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The method is based on a physical model which can also be used in solving, for example, sensor fusion problems. The experimental results show that the method works well in practice, both for perspective and spherical cameras.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2009. p. 27
Series
LiTH-ISY-R, ISSN 1400-3902 ; 2916
Keywords
Pose estimation, Sensor fusion, Computer vision, Inertial navigation, Calibration
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-56213 (URN); LiTH-ISY-R-2916 (ISRN)
Available from: 2010-04-30 Created: 2010-04-30 Last updated: 2014-08-11. Bibliographically approved
Gustafsson, F., Schön, T. & Hol, J. (2009). Sensor Fusion for Augmented Reality. Linköping: Linköping University Electronic Press
2009 (English). Report (Other academic).
Abstract [en]

The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a nonlinear filter is described, using a dynamic model for the 22 states, where 100 Hz inertial measurements and 12.5 Hz vision measurements are processed. An example where an industrial robot is used to move the sensor unit, possessing almost perfect precision and repeatability, is presented. The results show that position and orientation accuracy is sufficient for a number of augmented reality applications.
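
The multi-rate structure described above can be sketched as a simple processing loop: a time update for every 100 Hz inertial sample and a measurement update whenever a 12.5 Hz vision measurement (every eighth inertial sample) is available. In the Python sketch below the ekf object and its methods are a hypothetical interface used only to show this structure; the report's actual 22-state nonlinear filter is not reproduced here.

def run_filter(ekf, imu_stream, vision_by_time):
    # imu_stream: iterable of (t, acc, gyro) tuples at 100 Hz;
    # vision_by_time: dict mapping timestamps to vision measurements (12.5 Hz).
    for t, acc, gyro in imu_stream:
        ekf.time_update(acc, gyro, dt=0.01)       # inertial prediction, 100 Hz
        if t in vision_by_time:                   # vision correction, 12.5 Hz
            ekf.measurement_update(vision_by_time[t])
        yield ekf.pose()                          # current position and orientation estimate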

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2009. p. 3
Series
LiTH-ISY-R, ISSN 1400-3902 ; 2875
Keywords
Sensor fusion, Nonlinear filtering, Tracking, Kalman filter, Augmented reality
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-56191 (URN); LiTH-ISY-R-2875 (ISRN)
Available from: 2010-04-30 Created: 2010-04-30 Last updated: 2014-08-11. Bibliographically approved
Hol, J. D., Dijkstra, F., Luinge, H. & Schön, T. (2009). Tightly Coupled UWB/IMU Pose Estimation. Linköping: Linköping University Electronic Press
2009 (English). Report (Other academic).
Abstract [en]

In this paper we propose a 6DOF tracking system combining Ultra-Wideband measurements with low-cost MEMS inertial measurements. A tightly coupled system is developed which estimates position as well as orientation of the sensor unit while remaining reliable in the presence of multipath effects and NLOS conditions. The experimental results show robust and continuous tracking in a realistic indoor positioning scenario.
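
One way to read "tightly coupled" here is that each UWB time-of-arrival measurement enters the filter directly through a nonlinear measurement model, rather than first being triangulated into a position fix; this is what allows individual multipath- or NLOS-affected receivers to be handled gracefully. The Python sketch below shows such a measurement model together with a crude one-sided outlier gate; the names and the gating rule are illustrative and not taken from the report.

import numpy as np

C = 299792458.0  # speed of light [m/s]

def predicted_toa(p_tag, tau_tag, rx_pos, rx_clk):
    # p_tag: predicted 3D transmitter position from the IMU-driven state,
    # tau_tag: transmit time, rx_pos: M x 3 receiver positions, rx_clk: M clock offsets.
    return tau_tag + np.linalg.norm(rx_pos - p_tag, axis=1) / C + rx_clk

def gate_innovations(toa_meas, toa_pred, late_threshold=3e-9):
    # Multipath and NLOS conditions can only delay a pulse, so only arrivals that
    # are much later than predicted are treated as outliers and discarded.
    innov = toa_meas - toa_pred
    return innov[innov < late_threshold]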

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2009. p. 14
Series
LiTH-ISY-R, ISSN 1400-3902 ; 2913
Keywords
Indoor positioning, Kalman Filter, IMU, Ultra-Wideband
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-55840 (URN); LiTH-ISY-R-2913 (ISRN)
Available from: 2010-04-30 Created: 2010-04-30 Last updated: 2014-08-13. Bibliographically approved
Hol, J., Schön, T. & Gustafsson, F. (2008). A New Algorithm for Calibrating a Combined Camera and IMU Sensor Unit. Linköping: Linköping University Electronic Press
2008 (English). Report (Other academic).
Abstract [en]

This paper is concerned with the problem of estimating the relative translation and orientation between an inertial measurement unit and a camera which are rigidly connected. The key is to realise that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. Furthermore, covariance expressions are provided for all involved estimates. The experimental results show that the method works well in practice.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2008. p. 17
Series
LiTH-ISY-R, ISSN 1400-3902 ; 2858
Keywords
Gray-box system identification, Kalman filter, Calibration, IMU, Camera
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-56174 (URN); LiTH-ISY-R-2858 (ISRN)
Available from: 2010-04-30 Created: 2010-04-30 Last updated: 2014-08-11. Bibliographically approved
Hol, J., Schön, T. & Gustafsson, F. (2008). A New Algorithm for Calibrating a Combined Camera and IMU Sensor Unit. In: Proceedings of the 10th International Conference on Control, Automation, Robotics and Vision. Paper presented at the 10th International Conference on Control, Automation, Robotics and Vision, Hanoi, Vietnam, December 2008 (pp. 1857-1862).
2008 (English). In: Proceedings of the 10th International Conference on Control, Automation, Robotics and Vision, 2008, p. 1857-1862. Conference paper, Published paper (Refereed).
Abstract [en]

This paper is concerned with the problem of estimating the relative translation and orientation between an inertial measurement unit and a camera which are rigidly connected. The key is to realise that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. Furthermore, covariance expressions are provided for all involved estimates. The experimental results show that the method works well in practice.

Keywords
Gray-box system identification, IMU, Kalman filter, Calibration, Camera
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-44266 (URN); 10.1109/ICARCV.2008.4795810 (DOI); 76142 (Local ID); 978-1-4244-2287-6 (ISBN); 978-1-4244-2286-9 (ISBN); 76142 (Archive number); 76142 (OAI)
Conference
10th International Conference on Control, Automation, Robotics and Vision, Hanoi, Vietnam, December, 2008
Available from: 2009-10-10 Created: 2009-10-10 Last updated: 2013-09-15
Hol, J. D. (2008). Pose Estimation and Calibration Algorithms for Vision and Inertial Sensors. (Licentiate dissertation). Institutionen för systemteknik
2008 (English). Licentiate thesis, monograph (Other academic).
Abstract [en]

This thesis deals with estimating position and orientation in real-time, using measurements from vision and inertial sensors. A system has been developed to solve this problem in unprepared environments, assuming that a map or scene model is available. Compared to ‘camera-only’ systems, the combination of the complementary sensors yields an accurate and robust system which can handle periods with uninformative or no vision data and reduces the need for high frequency vision updates.

The system achieves real-time pose estimation by fusing vision and inertial sensors using the framework of nonlinear state estimation for which state space models have been developed. The performance of the system has been evaluated using an augmented reality application where the output from the system is used to superimpose virtual graphics on the live video stream. Furthermore, experiments have been performed where an industrial robot providing ground truth data is used to move the sensor unit. In both cases the system performed well.

Calibration of the relative position and orientation of the camera and the inertial sensor turns out to be essential for proper operation of the system. A new and easy-to-use algorithm for estimating these has been developed using a gray-box system identification approach. Experimental results show that the algorithm works well in practice.

Place, publisher, year, edition, pages
Institutionen för systemteknik, 2008. p. 94
Series
Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971 ; 1370
Keywords
Pose estimation, Sensor fusion, Computer vision, Inertial navigation, Calibration
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-11842 (URN); 978-91-7393-862-4 (ISBN)
Presentation
2008-05-30, Visionen, House B, SE-581 83, Linköping, 10:15 (English)
Available from: 2008-06-10 Created: 2008-06-10 Last updated: 2009-03-10
Hol, J., Schön, T. & Gustafsson, F. (2008). Relative Pose Calibration of a Spherical Camera and an IMU. Linköping: Linköping University Electronic Press
2008 (English). Report (Other academic).
Abstract [en]

This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a spherical camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The experimental results show that the method works well in practice.
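
Compared with the perspective case, the main ingredient that changes is the camera model: for a spherical camera the image measurements can be treated as bearing directions on the unit sphere, and the calibration residual compares unit vectors instead of pixel coordinates. A minimal Python sketch of this idea, with an illustrative model that is not necessarily the exact one used in the report:

import numpy as np

def to_unit_sphere(points_cam):
    # Map 3D points expressed in the camera frame to bearings on the unit sphere.
    return points_cam / np.linalg.norm(points_cam, axis=1, keepdims=True)

def spherical_residuals(bearings_meas, points_cam):
    # Difference between measured and predicted bearing vectors.
    return (to_unit_sphere(points_cam) - bearings_meas).ravel()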

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2008. p. 10
Series
LiTH-ISY-R, ISSN 1400-3902 ; 2855
Keywords
Pose estimation, Sensor fusion, Computer vision, Inertial navigation, Calibration
National Category
Control Engineering
Identifiers
urn:nbn:se:liu:diva-56170 (URN); LiTH-ISY-R-2855 (ISRN)
Available from: 2010-04-30 Created: 2010-04-30 Last updated: 2014-08-12. Bibliographically approved