Search for publications in DiVA (liu.se)
1 - 27 of 27
  • 1.
    Chandaria, Jigna
    et al.
    BBC Research, United Kingdom.
    Thomas, Graham
    BBC Research, United Kingdom.
    Bartczak, Bogumil
    University of Kiel, Germany.
    Koch, Reinhard
    University of Kiel, Germany.
    Becker, Mario
    Fraunhofer IGD, Germany.
    Bleser, Gabriele
    Fraunhofer IGD, Germany.
    Stricker, Didier
    Fraunhofer IGD, Germany.
    Wohlleber, Cedric
    Fraunhofer IGD, Germany.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Felsberg, Michael
    Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Skoglund, Johan
    Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, The Institute of Technology.
    Slycke, Per
    Xsens, The Netherlands.
    Smeitz, Sebastiaan
    Xsens, The Netherlands.
    Real-Time Camera Tracking in the MATRIS Project, 2007. In: SMPTE Journal, ISSN 0036-1682, Vol. 116, no 7-8, p. 266-271. Article in journal (Refereed)
    Abstract [en]

    In order to insert a virtual object into a TV image, the graphics system needs to know precisely how the camera is moving, so that the virtual object can be rendered in the correct place in every frame. Nowadays this can be achieved relatively easily in post-production, or in a studio equipped with a special tracking system. However, for live shooting on location, or in a studio that is not specially equipped, installing such a system can be difficult or uneconomic. To overcome these limitations, the MATRIS project is developing a real-time system for measuring the movement of a camera. The system uses image analysis to track naturally occurring features in the scene, and data from an inertial sensor. No additional sensors, special markers, or camera mounts are required. This paper gives an overview of the system and presents some results.

  • 2.
    Chandaria, Jigna
    et al.
    BBC Research, UK.
    Thomas, Graham
    BBC Research, UK.
    Bartczak, Bogumil
    University of Kiel, Germany.
    Koeser, Kevin
    University of Kiel, Germany.
    Koch, Reinhard
    University of Kiel, Germany.
    Becker, Mario
    Fraunhofer IGD, Germany.
    Bleser, Gabriele
    Fraunhofer IGD, Germany.
    Stricker, Didier
    Fraunhofer IGD, Germany.
    Wohlleber, Cedric
    Fraunhofer IGD, Germany.
    Felsberg, Michael
    Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Skoglund, Johan
    Linköping University, The Institute of Technology.
    Slycke, Per
    Xsens, Netherlands.
    Smeitz, Sebastiaan
    Xsens, Netherlands.
    Real-Time Camera Tracking in the MATRIS Project, 2006. In: Proceedings of the 2006 International Broadcasting Convention, 2006. Conference paper (Refereed)
    Abstract [en]

    In order to insert a virtual object into a TV image, the graphics system needs to know precisely how the camera is moving, so that the virtual object can be rendered in the correct place in every frame. Nowadays this can be achieved relatively easily in postproduction, or in a studio equipped with a special tracking system. However, for live shooting on location, or in a studio that is not specially equipped, installing such a system can be difficult or uneconomic. To overcome these limitations, the MATRIS project is developing a real-time system for measuring the movement of a camera. The system uses image analysis to track naturally occurring features in the scene, and data from an inertial sensor. No additional sensors, special markers, or camera mounts are required. This paper gives an overview of the system and presents some results.  

  • 3.
    Gustafsson, Fredrik
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sensor Fusion for Augmented Reality, 2009. Report (Other academic)
    Abstract [en]

    The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a nonlinear filter is described, using a dynamic model for the 22 states, where 100 Hz inertial measurements and 12.5 Hz vision measurements are processed. An example where an industrial robot is used to move the sensor unit, possessing almost perfect precision and repeatability, is presented. The results show that position and orientation accuracy is sufficient for a number of augmented reality applications.
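A minimal sketch of the multirate fusion described above, assuming a deliberately simplified one-dimensional model: time updates run at the inertial rate (100 Hz) and measurement updates fire only when a vision sample arrives (every 8th step, i.e. 12.5 Hz). The two-state model and all names are invented for illustration; the paper's filter has 22 states.

```python
import numpy as np

def multirate_kf(acc, pos_meas, dt=0.01, vision_every=8, q=1e-3, r=1e-2):
    """Toy 1D multirate Kalman filter: predict at every inertial sample,
    correct only when a vision measurement arrives."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # position/velocity kinematics
    B = np.array([0.5 * dt ** 2, dt])      # accelerometer enters as input
    H = np.array([[1.0, 0.0]])             # vision measures position only
    x, P = np.zeros(2), np.eye(2)
    est = []
    for k, a in enumerate(acc):
        x = F @ x + B * a                  # time update (inertial rate)
        P = F @ P @ F.T + q * np.eye(2)
        if k % vision_every == 0:          # measurement update (vision rate)
            S = H @ P @ H.T + r
            K = P @ H.T / S
            x = x + (K * (pos_meas[k // vision_every] - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
        est.append(x[0])
    return np.array(est)
```

With zero acceleration and a constant vision position, the estimate settles at the measured position within a few vision updates, while the inertial input keeps the state fresh between them.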

  • 4.
    Gustafsson, Fredrik
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sensor Fusion for Augmented Reality, 2008. In: Proceedings of the 17th IFAC World Congress, 2008, p. 14100-14100. Conference paper (Refereed)
    Abstract [en]

    The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a nonlinear filter is described, using a dynamic model for the 22 states, where 100 Hz inertial measurements and 12.5 Hz vision measurements are processed. An example where an industrial robot is used to move the sensor unit, possessing almost perfect precision and repeatability, is presented. The results show that position and orientation accuracy is sufficient for a number of augmented reality applications.

  • 5.
    Hendeby, Gustaf
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Karlsson, Rickard
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    A Graphics Processing Unit Implementation of the Particle Filter, 2007. In: Proceedings of the 15th European Statistical Signal Processing Conference, European Association for Signal, Speech, and Image Processing, 2007, p. 1639-1643. Conference paper (Refereed)
    Abstract [en]

    Modern graphics cards for computers, and especially their graphics processing units (GPUs), are designed for fast rendering of graphics. In order to achieve this GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper GPGPU techniques are used to make a parallel GPU implementation of state-of-the-art recursive Bayesian estimation using particle filters (PF). The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to one achieved with a traditional CPU implementation. The resulting GPU filter is faster with the same accuracy as the CPU filter for many particles, and it shows how the particle filter can be parallelized.
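To see why the resampling step is the crux of parallelisation, a bootstrap particle-filter step can be written with only data-parallel array primitives (per-particle maps, one prefix sum, one gather), here in NumPy as a stand-in for GPU code. The scalar random-walk model and all parameter names are invented for the example, not taken from the paper.

```python
import numpy as np

def pf_step(particles, weights, y, motion_std=0.1, meas_std=0.2, rng=None):
    """One bootstrap particle-filter step for a scalar random-walk model,
    using only data-parallel primitives (map, prefix sum, gather)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = particles.size
    particles = particles + rng.normal(0.0, motion_std, n)   # propagate: per-particle map
    weights = weights * np.exp(-0.5 * ((y - particles) / meas_std) ** 2)  # weight: map
    weights = weights / weights.sum()                        # normalise: reduction
    u = (rng.random() + np.arange(n)) / n                    # systematic resampling:
    idx = np.searchsorted(np.cumsum(weights), u)             #   prefix sum + binary search
    idx = np.minimum(idx, n - 1)                             #   guard against rounding
    return particles[idx], np.full(n, 1.0 / n)               # gather
```

The propagate and weight steps are embarrassingly parallel; only the cumulative sum (a parallel scan) and the gather need inter-particle communication, which is the part the paper reworks for the GPU.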

  • 6.
    Hendeby, Gustaf
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Karlsson, Rickard
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    A Graphics Processing Unit Implementation of the Particle Filter, 2007. Report (Other academic)
    Abstract [en]

    Modern graphics cards for computers, and especially their graphics processing units (GPUs), are designed for fast rendering of graphics. In order to achieve this GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper GPGPU techniques are used to make a parallel GPU implementation of state-of-the-art recursive Bayesian estimation using particle filters (PF). The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to one achieved with a traditional CPU implementation. The resulting GPU filter is faster with the same accuracy as the CPU filter for many particles, and it shows how the particle filter can be parallelized.

  • 7.
    Hendeby, Gustaf
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Karlsson, Rickard
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Graphics Processing Unit Implementation of the Particle Filter, 2006. Report (Other academic)
    Abstract [en]

    Modern graphics cards for computers, and especially their graphics processing units (GPUs), are designed for fast rendering of graphics. In order to achieve this GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper GPGPU techniques are used to make a parallel GPU implementation of state-of-the-art recursive Bayesian estimation using particle filters (PF). The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to one achieved with a traditional CPU implementation. The resulting GPU filter is faster with the same accuracy as the CPU filter for many particles, and it shows how the particle filter can be parallelized.

  • 8.
    Hol, Jeroen D.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sensor Fusion and Calibration of Inertial Sensors, Vision, Ultra-Wideband and GPS, 2011. Doctoral thesis, monograph (Other academic)
    Abstract [en]

    The usage of inertial sensors has traditionally been confined primarily to the aviation and marine industry due to their associated cost and bulkiness. During the last decade, however, inertial sensors have undergone a rather dramatic reduction in both size and cost with the introduction of MEMS technology. As a result of this trend, inertial sensors have become commonplace for many applications and can even be found in many consumer products, for instance smart phones, cameras and game consoles. Due to the drift inherent in inertial technology, inertial sensors are typically used in combination with aiding sensors to stabilize and improve the estimates. The need for aiding sensors becomes even more apparent due to the reduced accuracy of MEMS inertial sensors.

    This thesis discusses two problems related to using inertial sensors in combination with aiding sensors. The first is the problem of sensor fusion: how to combine the information obtained from the different sensors and obtain a good estimate of position and orientation. The second problem, a prerequisite for sensor fusion, is that of calibration: the sensors themselves have to be calibrated and provide measurements in known units. Furthermore, whenever multiple sensors are combined additional calibration issues arise, since the measurements are seldom acquired in the same physical location and expressed in a common coordinate frame. Sensor fusion and calibration are discussed for the combination of inertial sensors with cameras, UWB or GPS.

    Two setups for estimating position and orientation in real-time are presented in this thesis. The first uses inertial sensors in combination with a camera; the second combines inertial sensors with UWB. Tightly coupled sensor fusion algorithms and experiments with performance evaluation are provided. Furthermore, this thesis contains ideas on using an optimization based sensor fusion method for a multi-segment inertial tracking system used for human motion capture as well as a sensor fusion method for combining inertial sensors with a dual GPS receiver.

    The above sensor fusion applications give rise to a number of calibration problems. Novel and easy-to-use calibration algorithms have been developed and tested to determine the following parameters: the magnetic field distortion when an IMU containing magnetometers is mounted close to a ferro-magnetic object, the relative position and orientation of a rigidly connected camera and IMU, as well as the clock parameters and receiver positions of an indoor UWB positioning system.

  • 9.
    Hol, Jeroen D
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology. Xsens Technologies, The Netherlands.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Ultra-Wideband Calibration for Indoor Positioning, 2010. In: Proceedings of the 2010 IEEE International Conference on Ultra-Wideband, 2010, Vol. 2. Conference paper (Refereed)
    Abstract [en]

    The main contribution of this work is a novel calibration method to determine the clock parameters of the UWB receivers as well as their 3D positions. It exclusively uses time-of-arrival measurements, thereby removing the need for the typically labor-intensive and time-consuming process of surveying the receiver positions. Experiments show that the method is capable of accurately calibrating a UWB setup within minutes.
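The flavour of the calibration problem can be sketched in a much-reduced setting, assuming transmitter positions are known: a single receiver's 3D position and clock offset estimated from noise-free time-of-arrival measurements by Gauss-Newton. This toy is not the paper's method, which calibrates all receivers and clock parameters jointly without any surveyed positions.

```python
import numpy as np

C = 299792458.0  # speed of light [m/s]

def calibrate_receiver(tx_pos, toa, iters=25):
    """Gauss-Newton estimate of one receiver's 3D position and clock
    offset from TOA measurements of transmitters at known positions.
    Toy sketch only; the paper needs no surveyed positions."""
    theta = np.concatenate([tx_pos.mean(axis=0), [0.0]])  # [x, y, z, offset]
    for _ in range(iters):
        d = tx_pos - theta[:3]                  # vectors receiver -> transmitters
        rng_ = np.linalg.norm(d, axis=1)        # geometric ranges
        pred = rng_ / C + theta[3]              # modelled time of arrival
        J = np.hstack([-d / (C * rng_[:, None]),  # d(pred)/d(position)
                       np.ones((len(toa), 1))])   # d(pred)/d(offset)
        delta, *_ = np.linalg.lstsq(J, toa - pred, rcond=None)
        theta = theta + delta
    return theta
```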

  • 10.
    Hol, Jeroen Diederik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Pose Estimation and Calibration Algorithms for Vision and Inertial Sensors, 2008. Licentiate thesis, monograph (Other academic)
    Abstract [en]

    This thesis deals with estimating position and orientation in real-time, using measurements from vision and inertial sensors. A system has been developed to solve this problem in unprepared environments, assuming that a map or scene model is available. Compared to ‘camera-only’ systems, the combination of the complementary sensors yields an accurate and robust system which can handle periods with uninformative or no vision data and reduces the need for high frequency vision updates.

    The system achieves real-time pose estimation by fusing vision and inertial sensors using the framework of nonlinear state estimation for which state space models have been developed. The performance of the system has been evaluated using an augmented reality application where the output from the system is used to superimpose virtual graphics on the live video stream. Furthermore, experiments have been performed where an industrial robot providing ground truth data is used to move the sensor unit. In both cases the system performed well.

    Calibration of the relative position and orientation of the camera and the inertial sensor turns out to be essential for proper operation of the system. A new and easy-to-use algorithm for estimating these has been developed using a gray-box system identification approach. Experimental results show that the algorithm works well in practice.

  • 11.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    A New Algorithm for Calibrating a Combined Camera and IMU Sensor Unit, 2008. Report (Other academic)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation between an inertial measurement unit and a camera which are rigidly connected. The key is to realise that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. Furthermore, covariance expressions are provided for all involved estimates. The experimental results show that the method works well in practice.

  • 12.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    A New Algorithm for Calibrating a Combined Camera and IMU Sensor Unit, 2008. In: Proceedings of the 10th International Conference on Control, Automation, Robotics and Vision, 2008, p. 1857-1862. Conference paper (Refereed)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation between an inertial measurement unit and a camera which are rigidly connected. The key is to realise that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. Furthermore, covariance expressions are provided for all involved estimates. The experimental results show that the method works well in practice.

  • 13.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Modeling and Calibration of Inertial and Vision Sensors, 2009. Report (Other academic)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The method is based on a physical model which can also be used in solving, for example, sensor fusion problems. The experimental results show that the method works well in practice, both for perspective and spherical cameras.
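The gray-box idea, predicting one sensor's output from the other through a physical model with unknown parameters and minimising the prediction error, reduces in one dimension to a toy like the following. The names and the scalar model are hypothetical; the paper estimates the full relative translation and orientation.

```python
import numpy as np

def estimate_mounting_angle(gyro, cam_angles, dt=0.01):
    """Gray-box flavour of the calibration in one dimension: integrate
    gyroscope rate to a predicted IMU orientation, then estimate the
    fixed mounting offset between IMU and camera by minimising the
    prediction error against camera orientation measurements."""
    imu_angle = np.cumsum(gyro) * dt   # predicted orientation from the IMU
    # model: cam_angles[k] = imu_angle[k] + offset + noise, so the
    # least-squares estimate of the offset is the mean residual
    return np.mean(cam_angles - imu_angle)
```

With noise-free data the estimate recovers the offset exactly; with noisy data it is the least-squares fit, mirroring how the paper's algorithm fits the relative pose to checkerboard observations.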

  • 14.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Modeling and Calibration of Inertial and Vision Sensors, 2010. In: The International Journal of Robotics Research, ISSN 0278-3649, E-ISSN 1741-3176, Vol. 29, no 2, p. 231-244. Article in journal (Refereed)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The method is based on a physical model which can also be used in solving, for example, sensor fusion problems. The experimental results show that the method works well in practice, both for perspective and spherical cameras.

  • 15.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    On Resampling Algorithms for Particle Filters, 2006. In: Proceedings of the 2006 IEEE Nonlinear Statistical Signal Processing Workshop, 2006, p. 79-82. Conference paper (Refereed)
    Abstract [en]

    In this paper a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms with respect to their resampling quality and computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in terms of resampling quality and computational complexity.
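The difference in resampling quality noted above can be reproduced in a few lines: systematic resampling uses one uniform draw followed by a deterministic stratified sweep, and the variance of each particle's offspring count is lower than under multinomial resampling. A small Monte Carlo stand-in for the paper's study follows; the weights and trial count are invented for the example.

```python
import numpy as np

def multinomial_resample(w, rng):
    """Draw N ancestor indices i.i.d. from the weights."""
    return rng.choice(w.size, size=w.size, p=w)

def systematic_resample(w, rng):
    """One uniform draw, then a deterministic stratified sweep: O(N)
    and lower resampling variance than multinomial."""
    n = w.size
    u = (rng.random() + np.arange(n)) / n
    return np.minimum(np.searchsorted(np.cumsum(w), u), n - 1)

# Monte Carlo comparison: variance of the offspring count of the
# heaviest particle under each scheme.
rng = np.random.default_rng(0)
w = np.array([0.5, 0.3, 0.2])
m_counts, s_counts = [], []
for _ in range(2000):
    m_counts.append(np.sum(multinomial_resample(w, rng) == 0))
    s_counts.append(np.sum(systematic_resample(w, rng) == 0))
var_m, var_s = np.var(m_counts), np.var(s_counts)
```

For this example the expected offspring count is 1.5 under both schemes, but the theoretical variances are 0.75 (multinomial) versus 0.25 (systematic), which the simulation reproduces.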

  • 16.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    On Resampling Algorithms for Particle Filters, 2007. Report (Other academic)
    Abstract [en]

    In this paper a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms with respect to their resampling quality and computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in terms of resampling quality and computational complexity.

  • 17.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Relative Pose Calibration of a Spherical Camera and an IMU, 2008. Report (Other academic)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a spherical camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The experimental results show that the method works well in practice.

  • 18.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Relative Pose Calibration of a Spherical Camera and an IMU, 2008. In: Proceedings of the 7th IEEE and ACM International Symposium on Mixed and Augmented Reality, 2008, p. 21-24. Conference paper (Refereed)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a spherical camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The experimental results show that the method works well in practice.

  • 19.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering.
    Schön, Thomas
    Linköping University, The Institute of Technology. Linköping University, Department of Electrical Engineering, Automatic Control.
    Gustafsson, Fredrik
    Linköping University, The Institute of Technology. Linköping University, Department of Electrical Engineering, Automatic Control.
    Resampling in Particle Filters, 2006. In: Nonlinear Statistical Signal Processing Workshop, 2006, Cambridge, United Kingdom: IEEE, 2006. Conference paper (Refereed)
  • 20.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sensor Fusion for Augmented Reality, 2006. In: Proceedings of the 9th International Conference on Information Fusion, 2006. Conference paper (Refereed)
    Abstract [en]

    In Augmented Reality (AR), the position and orientation of the camera have to be estimated with high accuracy and low latency. This nonlinear estimation problem is studied in the present paper. The proposed solution makes use of measurements from inertial sensors and computer vision. These measurements are fused using a Kalman filtering framework, incorporating a rather detailed model for the dynamics of the camera. Experiments show that the resulting filter provides good estimates of the camera motion, even during fast movements.
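One fragment of such a camera dynamics model is the orientation state, propagated between vision updates with gyroscope measurements via quaternions. The sketch below shows only that part, with illustrative names; the paper's state also contains position, velocity and further quantities.

```python
import numpy as np

def quat_propagate(q, omega, dt):
    """Propagate a unit orientation quaternion one step using a measured
    angular velocity (rad/s, body frame).  Sketch of the orientation
    part of a camera dynamics model."""
    w = np.linalg.norm(omega)
    if w < 1e-12:
        return q
    axis = omega / w
    half = 0.5 * w * dt
    dq = np.concatenate([[np.cos(half)], np.sin(half) * axis])
    # Hamilton product q * dq
    a1, b1, c1, d1 = q
    a2, b2, c2, d2 = dq
    return np.array([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,
    ])
```

Integrating a constant rate of pi/2 rad/s about one axis for one second yields the quaternion for a 90-degree rotation, and the unit norm is preserved, which is why quaternions are a common orientation parameterisation in such filters.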

  • 21.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sensor Fusion for Augmented Reality, 2006. In: Proceedings of Reglermöte 2006, 2006. Conference paper (Other academic)
    Abstract [en]

    In Augmented Reality (AR), the position and orientation of the camera have to be estimated with high accuracy and low latency. This nonlinear estimation problem is studied in the present paper. The proposed solution makes use of measurements from inertial sensors and computer vision. These measurements are fused using a Kalman filtering framework, incorporating a rather detailed model for the dynamics of the camera. Experiments show that the resulting filter provides good estimates of the camera motion, even during fast movements.

  • 22.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Slycke, Per
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sensor Fusion for Augmented Reality, 2007. Report (Other academic)
    Abstract [en]

    In Augmented Reality (AR), the position and orientation of the camera have to be estimated with high accuracy and low latency. This nonlinear estimation problem is studied in the present paper. The proposed solution makes use of measurements from inertial sensors and computer vision. These measurements are fused using a Kalman filtering framework, incorporating a rather detailed model for the dynamics of the camera. Experiments show that the resulting filter provides good estimates of the camera motion, even during fast movements.

  • 23.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Luinge, Henk
    Xsens Technologies B.V., The Netherlands.
    Slycke, Per
    Xsens Technologies B.V., The Netherlands.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Robust Real-Time Tracking by Fusing Measurements from Inertial and Vision Sensors, 2007. Report (Other academic)
    Abstract [en]

    The problem of estimating and predicting position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and vision. The sensor fusion approach described in this contribution is based on non-linear filtering of these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a multi-rate extended Kalman filter is described, using a dynamic model with 22 states, where 12.5 Hz correspondences from vision and 100 Hz inertial measurements are processed. An example where an industrial robot is used to move the sensor unit is presented. The advantage of this configuration is that it provides ground truth for the pose, allowing for objective performance evaluation. The results show that we obtain an absolute accuracy of 2 cm in position and 1° in orientation.
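The multi-rate structure described above can be sketched as a scheduling skeleton (the 100 Hz and 12.5 Hz rates are from the abstract; the filter internals are elided and everything else is illustrative):

```python
IMU_RATE_HZ = 100                            # inertial measurement rate
VISION_RATE_HZ = 12.5                        # vision correspondence rate
RATIO = int(IMU_RATE_HZ / VISION_RATE_HZ)    # 8 time updates per vision frame

n_predicts = n_updates = 0
for k in range(IMU_RATE_HZ):                 # one second of data
    n_predicts += 1                          # EKF time update with IMU sample k
    if k % RATIO == RATIO - 1:               # a vision frame has arrived
        n_updates += 1                       # EKF measurement update
```

Running the high-rate time update between the sparse vision corrections is what keeps the latency low: a pose prediction is available at every inertial sample, not only at vision frames.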

  • 24.
    Hol, Jeroen
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Luinge, Henk
    Xsens Technologies B.V., The Netherlands.
    Slycke, Per
    Xsens Technologies B.V., The Netherlands.
    Gustafsson, Fredrik
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Robust Real-Time Tracking by Fusing Measurements from Inertial and Vision Sensors, 2007. In: Journal of Real-Time Image Processing, ISSN 1861-8200, E-ISSN 1861-8219, Vol. 2, no. 2-3, p. 149-160. Article in journal (Refereed)
    Abstract [en]

    The problem of estimating and predicting position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and vision. The sensor fusion approach described in this contribution is based on non-linear filtering of these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a multi-rate extended Kalman filter is described, using a dynamic model with 22 states, where 12.5 Hz correspondences from vision and 100 Hz inertial measurements are processed. An example where an industrial robot is used to move the sensor unit is presented. The advantage of this configuration is that it provides ground truth for the pose, allowing for objective performance evaluation. The results show that we obtain an absolute accuracy of 2 cm in position and 1° in orientation.

  • 25.
    Hol, Jeroen D.
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Dijkstra, Fred
    Xsens Technologies B.V., Netherlands.
    Luinge, Henk
    Xsens Technologies B.V., Netherlands.
    Schön, Thomas
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Tightly Coupled UWB/IMU Pose Estimation, 2009. Report (Other academic)
    Abstract [en]

    In this paper we propose a 6-DOF tracking system combining ultra-wideband (UWB) measurements with low-cost MEMS inertial measurements. A tightly coupled system is developed which estimates the position as well as the orientation of the sensor unit while remaining reliable under multipath effects and non-line-of-sight (NLOS) conditions. The experimental results show robust and continuous tracking in a realistic indoor positioning scenario.
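As a hedged illustration of the tightly coupled idea (made-up anchor layout, noise-free ranges, position only; the actual system also estimates orientation and fuses the inertial data): raw UWB ranges to known anchors enter the estimator directly, here via a few Gauss-Newton iterations, rather than being converted first into a standalone UWB position fix.

```python
import math

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]      # known UWB anchors (m)
true_pos = (3.0, 4.0)                                 # for generating ranges
ranges = [math.hypot(true_pos[0] - ax, true_pos[1] - ay) for ax, ay in anchors]

x, y = 5.0, 5.0                                       # initial position guess
for _ in range(10):
    # Accumulate the normal equations J'J d = J'r over the range residuals.
    a00 = a01 = a11 = b0 = b1 = 0.0
    for (ax, ay), z in zip(anchors, ranges):
        d = math.hypot(x - ax, y - ay)
        jx, jy = (x - ax) / d, (y - ay) / d           # Jacobian of the range
        r = z - d                                     # measurement residual
        a00 += jx * jx; a01 += jx * jy; a11 += jy * jy
        b0 += jx * r;   b1 += jy * r
    det = a00 * a11 - a01 * a01
    x += (a11 * b0 - a01 * b1) / det                  # Gauss-Newton step
    y += (a00 * b1 - a01 * b0) / det
```

Because each range is an individual measurement in the estimator, a single multipath-corrupted or NLOS range can be down-weighted or rejected without losing the whole position fix, which is the robustness argument behind tight coupling.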

  • 26.
    Karlsson, Rickard
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Törnqvist, David
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sjöberg, Johan
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hansson, Anders
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Positioning and Control of an Unmanned Aerial Vehicle, 2006. In: Proceedings of the 2nd International CDIO Conference and Collaborators' Meeting, 2006. Conference paper (Refereed)
    Abstract [en]

    In the CDIO project course in Automatic Control, an autonomous unmanned aerial vehicle (UAV) is constructed from an existing radio-controlled model aircraft. By adding an inertial sensor measuring acceleration and rotation, together with a Global Positioning System (GPS) sensor, the aim is to construct an accurate positioning system. This is used by an on-board computer to calculate rudder control signals for a set of DC servos in order to follow a predefined way-point trajectory. The project involves 17 students, roughly three times as many as in previous projects, and it comprises positioning, control, and hardware design. Since the project is still ongoing, some preliminary results and conclusions are presented.
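The final step mentioned above, turning the position estimate into rudder commands that follow the way-point trajectory, can be sketched as a proportional heading controller (the function, its gain, and the 2D setup are illustrative assumptions, not taken from the report):

```python
import math

def rudder_command(pos, heading, waypoint, gain=0.8):
    """Rudder command proportional to the heading error toward the next
    way-point; the error is wrapped to [-pi, pi) so the aircraft always
    turns the shorter way."""
    desired = math.atan2(waypoint[1] - pos[1], waypoint[0] - pos[0])
    error = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    return gain * error
```

For example, an aircraft heading east with the next way-point due north receives a positive (left-turn) command of `0.8 * pi/2`, while a way-point straight ahead yields zero rudder.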

  • 27.
    Karlsson, Rickard
    et al.
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Törnqvist, David
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Sjöberg, Johan
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hol, Jeroen
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Hansson, Anders
    Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
    Positioning and Control of an Unmanned Aerial Vehicle, 2006. Report (Other academic)
    Abstract [en]

    In the CDIO project course in Automatic Control, an autonomous unmanned aerial vehicle (UAV) is constructed from an existing radio-controlled model aircraft. By adding an inertial sensor measuring acceleration and rotation, together with a Global Positioning System (GPS) sensor, the aim is to construct an accurate positioning system. This is used by an on-board computer to calculate rudder control signals for a set of DC servos in order to follow a predefined way-point trajectory. The project involves 17 students, roughly three times as many as in previous projects, and it comprises positioning, control, and hardware design. Since the project is still ongoing, some preliminary results and conclusions are presented.
