Tracking Stationary Extended Objects for Road Mapping using Radar Measurements
Linköping University, Department of Electrical Engineering, Automatic Control; The Institute of Technology (Sensor Fusion)
2009 (English). In: Proceedings of the 2009 IEEE Intelligent Vehicles Symposium, IEEE, 2009, pp. 405-410. Conference paper (Refereed)
Abstract [en]

It is becoming more common that premium cars are equipped with a forward looking radar and a forward looking camera. The data is often used to estimate the road geometry, track leading vehicles, etc. However, there is valuable information present in the radar concerning stationary objects that is typically not used. The present work shows how stationary objects, such as guard rails, can be modeled and tracked as extended objects using radar measurements. The problem is cast within a standard sensor fusion framework utilizing the Kalman filter. The approach has been evaluated on real data from highways and rural roads in Sweden.
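
As a rough, hedged illustration of the idea in the abstract (not the authors' actual model), the sketch below represents a guard rail as a straight line y = a + b*x in the ego vehicle's coordinate frame and tracks the line parameters with a standard Kalman filter, updated with radar detections assumed to already be associated with the rail. The class name, the line parameterization and all noise values are illustrative assumptions.

```python
import numpy as np

class ExtendedLineTracker:
    """Track a guard rail modeled as the line y = a + b*x (illustrative sketch)."""

    def __init__(self, a0=0.0, b0=0.0):
        self.x = np.array([a0, b0])       # state: lateral offset a and slope b
        self.P = np.diag([4.0, 0.1])      # initial state covariance (assumed)
        self.Q = np.diag([0.05, 0.001])   # process noise: the rail drifts slowly
        self.r = 0.25                     # radar measurement noise variance [m^2]

    def predict(self):
        # Nearly constant line parameters: F = I, so only the covariance grows.
        self.P = self.P + self.Q

    def update(self, detections):
        # detections: (x, y) radar returns already associated with the rail.
        for px, py in detections:
            H = np.array([[1.0, px]])             # y = a + b*px is linear in (a, b)
            innovation = py - (H @ self.x)[0]
            S = (H @ self.P @ H.T)[0, 0] + self.r
            K = (self.P @ H.T) / S                # Kalman gain, shape (2, 1)
            self.x = self.x + K.ravel() * innovation
            self.P = (np.eye(2) - K @ H) @ self.P

# Usage with synthetic detections along a rail at roughly y = 3 + 0.02*x:
rng = np.random.default_rng(0)
tracker = ExtendedLineTracker()
for _ in range(10):
    tracker.predict()
    xs = rng.uniform(5.0, 80.0, size=8)
    ys = 3.0 + 0.02 * xs + rng.normal(0.0, 0.5, size=8)
    tracker.update(zip(xs, ys))
print(tracker.x)  # approaches [3.0, 0.02]
```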

Place, publisher, year, edition, pages
IEEE, 2009, pp. 405-410.
Series: IEEE Intelligent Vehicles Symposium. Proceedings, ISSN 1931-0587
Keyword [en]
Extended objects, Object detection, Radar imaging, Road vehicle radar, Object tracking, Road mapping, Stationary objects
National Category
Engineering and Technology; Control Engineering
URN: urn:nbn:se:liu:diva-18179
DOI: 10.1109/IVS.2009.5164312
ISBN: 978-1-4244-3504-3
ISBN: 978-1-4244-3503-6
OAI: diva2:216514
Conference: 2009 IEEE Intelligent Vehicles Symposium, Xi'an, China, June 2009
Available from: 2011-08-12. Created: 2009-05-09. Last updated: 2013-07-22. Bibliographically approved.
In thesis
1. Automotive Sensor Fusion for Situation Awareness
2009 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

The use of radar and camera for situation awareness is gaining popularity in automotive safety applications. In this thesis, situation awareness consists of accurate estimates of the ego vehicle's motion, the position of the other vehicles and the road geometry. By fusing information from different types of sensors, such as radar, camera and inertial sensors, the accuracy and robustness of those estimates can be increased.

Sensor fusion is the process of using information from several different sensors to compute an estimate of the state of a dynamic system that, in some sense, is better than it would be if the sensors were used individually. Furthermore, the resulting estimate is in some cases only obtainable through the use of data from different types of sensors. A systematic approach to handling sensor fusion problems is provided by model-based state estimation theory. The systems discussed in this thesis are primarily dynamic and are modeled using state space models. A measurement model is used to describe the relation between the state variables and the measurements from the different sensors. Within the state estimation framework, a process model is used to describe how the state variables propagate in time. These two models are of major importance for the resulting state estimate and are therefore given much attention in this thesis. One example of a process model is the single track vehicle model, which is used to model the ego vehicle's motion. In this thesis it is shown how the estimate of the road geometry, obtained directly from the camera information, can be improved by fusing it with the estimates of the other vehicles' positions on the road and the estimate of the radius of the ego vehicle's currently driven path.
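
As a hedged illustration of the kind of process model mentioned above, the sketch below propagates the ego vehicle's pose with a simple kinematic single track ("bicycle") model. The thesis uses a more detailed single track model; the state choice, the wheelbase and the input values here are illustrative assumptions only.

```python
import numpy as np

def single_track_step(state, v, delta, dt, wheelbase=2.7):
    """Propagate state = [x, y, heading] one time step.

    v        : longitudinal speed [m/s]
    delta    : front wheel steering angle [rad]
    dt       : sample time [s]
    wheelbase: distance between front and rear axle [m] (assumed value)
    """
    x, y, psi = state
    x   += v * np.cos(psi) * dt
    y   += v * np.sin(psi) * dt
    psi += v / wheelbase * np.tan(delta) * dt
    return np.array([x, y, psi])

# Drive 5 s at 20 m/s with a small constant steering angle.
state = np.zeros(3)
for _ in range(50):
    state = single_track_step(state, v=20.0, delta=0.02, dt=0.1)
print(state)  # final pose [x, y, heading]
```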

The positions of stationary objects, such as guardrails, lampposts and delineators, are measured by the radar. These measurements can be used to estimate the border of the road. Three conceptually different methods to represent and derive the road borders are presented in this thesis. Occupancy grid mapping discretizes the map surrounding the ego vehicle, and the probability of occupancy is estimated for each grid cell. The second method applies a constrained quadratic program in order to estimate the road borders, which are represented by two polynomials. The third method associates the radar measurements to extended stationary objects and tracks them as extended targets.
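
The first of the three methods, occupancy grid mapping, can be sketched roughly as follows: each radar detection raises the occupancy log-odds of the grid cell it falls in, and the log-odds are converted back to probabilities when the road border is extracted. Grid extent, resolution and the log-odds increment are illustrative assumptions, not values from the thesis.

```python
import numpy as np

RES = 0.5                       # cell size [m] (assumed)
GRID = np.zeros((200, 200))     # log-odds map, 100 m x 100 m around the ego vehicle
L_OCC = 0.85                    # log-odds increment for a cell hit by a detection

def update_grid(grid, detections):
    """detections: (x, y) radar returns in ego coordinates, origin at grid center."""
    for px, py in detections:
        i = int(px / RES) + grid.shape[0] // 2
        j = int(py / RES) + grid.shape[1] // 2
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] += L_OCC
    return grid

def occupancy_probability(grid):
    # Convert log-odds back to probabilities for thresholding / border extraction.
    return 1.0 - 1.0 / (1.0 + np.exp(grid))

GRID = update_grid(GRID, [(12.0, 3.5), (20.0, 3.6), (28.0, 3.8)])
print(occupancy_probability(GRID).max())
```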

The approaches presented in this thesis have all been evaluated on real data from both freeways and rural roads in Sweden.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2009. 76 p.
Series: Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971; 1422
National Category
Information Science
URN: urn:nbn:se:liu:diva-51226
Local ID: LiU-TEK-LIC-2009:30
ISBN: 978-91-7393-492-3
Archive number: LiU-TEK-LIC-2009:30
OAI: LiU-TEK-LIC-2009:30
Presentation: 2009-11-20, 10:15, Visionen, B-building, Campus Valla, Linköpings universitet, Linköping (English)
Available from: 2009-10-23. Created: 2009-10-22. Last updated: 2009-10-23. Bibliographically approved.

Open Access in DiVA
fulltext (485 kB, application/pdf)

Other links
Publisher's full text | Link to Licentiate Thesis

By author/editor
Lundquist, Christian; Schön, Thomas
By organisation
Automatic Control; The Institute of Technology
