Geometric Computer Vision for Rolling-shutter and Push-broom Sensors
Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, The Institute of Technology.
2012 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Almost all cell-phones and camcorders sold today are equipped with a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and there is a general trend to incorporate CMOS sensors in other types of cameras as well. The sensor has many advantages over the more conventional CCD (Charge-Coupled Device) sensor, such as lower power consumption, cheaper manufacturing and the potential for on-chip processing. Almost all CMOS sensors make use of what is called a rolling shutter. Compared to a global shutter, which images all pixels at the same time, a rolling-shutter camera exposes the image row by row. This leads to geometric distortions in the image when either the camera or the objects in the scene are moving. The recorded videos and images will look wobbly (the jello effect), skewed or otherwise strange, which is often not desirable. In addition, many computer vision algorithms assume that the camera has a global shutter, and will break down if the distortions are too severe.

In airborne remote sensing it is common to use push-broom sensors. These sensors exhibit a similar kind of distortion as a rolling-shutter camera, due to the motion of the aircraft. If the acquired images are to be matched with maps or other images, then the distortions need to be suppressed.

The main contribution of this thesis is the development of three-dimensional models for rolling-shutter distortion correction. Previous attempts modelled the distortions as taking place in the image plane, and we have shown that our techniques give better results for hand-held camera motions.

The basic idea is to estimate the camera motion, not only between frames, but also during frame capture. The motion can be estimated from inter-frame image correspondences, from which a non-linear optimisation problem can be formulated and solved. Since the rows of a rolling-shutter image are captured at different times, each row can, once the motion is known, be transformed to its rectified position.
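The per-row transform described above can be sketched in Python. This is an illustrative simplification of the thesis's full 3D model: the camera rotation is reduced to an in-plane rotation, and the function and parameter names (`motion`, `readout_time`, etc.) are ours, not the thesis's.

```python
import math

def row_time(row, frame_start, readout_time, num_rows):
    """Capture time of a given image row under a rolling shutter:
    rows are exposed sequentially from top to bottom."""
    return frame_start + readout_time * row / num_rows

def rectify_point(x, y, angle):
    """Undo an in-plane camera rotation of `angle` radians for one pixel
    (a simplified stand-in for the full 3D rotation model)."""
    c, s = math.cos(-angle), math.sin(-angle)
    return (c * x - s * y, s * x + c * y)

def rectify_image_points(points, motion, frame_start, readout_time, num_rows):
    """Transform each pixel to where it would have been imaged by a
    stationary (global-shutter) camera.  `motion(t)` returns the camera's
    rotation angle at time t, as estimated from inter-frame
    correspondences."""
    out = []
    for (x, y) in points:
        t = row_time(y, frame_start, readout_time, num_rows)
        out.append(rectify_point(x, y, motion(t)))
    return out
```

The key point the sketch captures is that the correction applied to a pixel depends on its row, because each row has its own capture time and hence its own camera pose.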

In addition to rolling-shutter distortions, hand-held footage often suffers from shaky camera motion. We have shown how to perform efficient video stabilisation, in combination with the rectification, using rotation smoothing.
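A 1D sketch of the rotation-smoothing idea: the actual method smooths a sequence of 3D rotations, but a moving average over a sequence of scalar angles shows the structure. The filter choice and all names here are illustrative assumptions, not the thesis's implementation.

```python
def smooth_rotations(angles, radius=2):
    """Moving-average low-pass filter over per-frame rotation angles
    (a 1D stand-in for smoothing the full 3D rotation sequence)."""
    n = len(angles)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(angles[lo:hi]) / (hi - lo))
    return out

def stabilising_corrections(angles, radius=2):
    """The rotation each frame must be warped by so that the camera
    appears to follow the smoothed path instead of the shaky one."""
    smoothed = smooth_rotations(angles, radius)
    return [s - a for s, a in zip(smoothed, angles)]
```

Stabilisation then amounts to warping each frame by its correction, which is cheap once the rectifying motion estimate already exists.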

The thesis also explores how similar techniques can be used to correct push-broom images, and how to rectify 3D point clouds from e.g. the Kinect depth sensor.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2012. 85 p.
Series
Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971 ; 1535
Keyword [en]
rolling shutter, CMOS, video rectification, stabilisation, push-broom, Kinect
National Category
Engineering and Technology; Computer Vision and Robotics (Autonomous Systems); Signal Processing
Identifiers
URN: urn:nbn:se:liu:diva-77391
ISBN: 978-91-7519-872-9 (print)
OAI: oai:DiVA.org:liu-77391
DiVA: diva2:526675
Presentation
2012-06-08, Visionen, Hus B, Campus Valla, Linköpings universitet, Linköping, 13:00 (English)
Opponent
Supervisors
Projects
VGS
Available from: 2012-05-28 Created: 2012-05-14 Last updated: 2015-12-10 Bibliographically approved
List of papers
1. Rectifying rolling shutter video from hand-held devices
2010 (English) In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010, Los Alamitos, CA, USA: IEEE Computer Society, 2010, 507-514 p. Conference paper, Published paper (Other academic)
Abstract [en]

This paper presents a method for rectifying video sequences from rolling shutter (RS) cameras. In contrast to previous RS rectification attempts, we model the distortions as being caused by the 3D motion of the camera. The camera motion is parametrised as a continuous curve, with knots at the last row of each frame. The curve parameters are solved for using non-linear least squares over inter-frame correspondences obtained from a KLT tracker. We have generated synthetic RS sequences with associated ground truth to allow controlled evaluation. Using these sequences, we demonstrate that our algorithm improves over two previously published methods. The RS dataset is available on the web to allow comparison with other methods.
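The continuous motion curve can be evaluated at any row's capture time by interpolating between the knots. A sketch under simplifying assumptions: a single scalar angle stands in for the 3D rotation (which the paper interpolates properly), and plain linear interpolation replaces rotation interpolation; names are ours.

```python
def interpolate_rotation(knot_times, knot_angles, t):
    """Piecewise-linear interpolation of the camera motion curve at
    time t.  In the paper the knots sit at the last row of each frame;
    here a scalar angle per knot stands in for the full 3D rotation."""
    if t <= knot_times[0]:
        return knot_angles[0]
    for (t0, a0), (t1, a1) in zip(
            zip(knot_times, knot_angles),
            zip(knot_times[1:], knot_angles[1:])):
        if t <= t1:
            w = (t - t0) / (t1 - t0)  # fraction of the way between knots
            return (1 - w) * a0 + w * a1
    return knot_angles[-1]  # clamp beyond the last knot
```

A non-linear least-squares solver then adjusts the knot values so that correspondences tracked between frames are consistent with the interpolated motion.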

Place, publisher, year, edition, pages
Los Alamitos, CA, USA: IEEE Computer Society, 2010
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-70572 (URN) 10.1109/CVPR.2010.5540173 (DOI) 978-1-4244-6984-0 (ISBN)
Conference
CVPR10, San Francisco, USA, June 13-18, 2010
Available from: 2011-09-13 Created: 2011-09-13 Last updated: 2015-12-10
2. Efficient Video Rectification and Stabilisation for Cell-Phones
2012 (English) In: International Journal of Computer Vision, ISSN 0920-5691, E-ISSN 1573-1405, Vol. 96, no 3, 335-352 p. Article in journal (Refereed) Published
Abstract [en]

This article presents a method for rectifying and stabilising video from cell-phones with rolling shutter (RS) cameras. Due to size constraints, cell-phone cameras have a constant, or near constant, focal length, making them an ideal application for calibrated projective geometry. In contrast to previous RS rectification attempts that model distortions in the image plane, we model the 3D rotation of the camera. We parameterise the camera rotation as a continuous curve, with knots distributed across a short frame interval. Curve parameters are found using non-linear least squares over inter-frame correspondences from a KLT tracker. By smoothing a sequence of reference rotations from the estimated curve, we can, at a small extra cost, obtain high-quality image stabilisation. Using synthetic RS sequences with associated ground truth, we demonstrate that our rectification improves over two other methods. We also compare our video stabilisation with the methods in iMovie and Deshaker.
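The calibrated projective geometry referred to here rests on a standard fact: for a purely rotating camera with calibration matrix K, pixels map between views by the homography H = K R K^-1. A minimal sketch with nested-list matrices (function names and the example intrinsics are ours):

```python
def matmul3(A, B):
    """3x3 matrix product for nested-list matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def intrinsics(f, cx, cy):
    """Pinhole calibration matrix K and its inverse, for focal length f
    and principal point (cx, cy)."""
    K = [[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]]
    K_inv = [[1.0 / f, 0.0, -cx / f],
             [0.0, 1.0 / f, -cy / f],
             [0.0, 0.0, 1.0]]
    return K, K_inv

def rotation_homography(K, K_inv, R):
    """Pixel mapping between two views of a purely rotating calibrated
    camera: H = K R K^-1."""
    return matmul3(matmul3(K, R), K_inv)
```

With a known (near constant) focal length, such homographies can be built per row from the estimated rotation curve, which is what makes the rotation-only model practical on cell-phones.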

Place, publisher, year, edition, pages
Springer Verlag (Germany), 2012
Keyword
Cell-phone, Rolling shutter, CMOS, Video stabilisation
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-75277 (URN) 10.1007/s11263-011-0465-8 (DOI) 000299769400005 ()
Note
Funding agencies: CENIIT organisation at Linköping Institute of Technology; Swedish Research Council.
Available from: 2012-02-27 Created: 2012-02-24 Last updated: 2017-12-07
3. Scan Rectification for Structured Light Range Sensors with Rolling Shutters
2011 (English) In: IEEE International Conference on Computer Vision, Barcelona, Spain, 2011, 1575-1582 p. Conference paper, Published paper (Other academic)
Abstract [en]

Structured light range sensors, such as the Microsoft Kinect, have recently become popular as perception devices for computer vision and robotic systems. These sensors use CMOS imaging chips with electronic rolling shutters (ERS). When using such a sensor on a moving platform, both the image and the depth map will exhibit geometric distortions. We introduce an algorithm that can suppress such distortions by rectifying the 3D point clouds from the range sensor. This is done by first estimating the time-continuous 3D camera trajectory, and then transforming the 3D points to where they would have been had the camera been stationary. To ensure that image and range data are synchronous, the camera trajectory is computed from KLT tracks on the structured-light frames, after suppressing the structured-light pattern. We evaluate our rectification by measuring angles between the visible sides of a cube, before and after rectification. We also measure how much better the 3D point clouds can be aligned after rectification. The obtained improvement is also related to the actual rotational velocity, measured using a MEMS gyroscope.
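A minimal sketch of the point-cloud rectification idea, under illustrative assumptions: the camera rotates about a single axis, and each 3D point carries the image row it was measured from (so its capture time, and hence the rotation to undo, is known). All names are ours.

```python
import math

def rot_y(angle):
    """3x3 rotation about the camera's vertical (y) axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def apply(R, p):
    """Apply a 3x3 rotation matrix to a 3D point."""
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

def rectify_cloud(points_with_rows, angle_at_row):
    """Move each 3D point to where it would have been measured by a
    stationary sensor: undo the camera rotation that had accumulated by
    the time the point's image row was read out.  `angle_at_row(r)` is
    the estimated rotation (about y only, for illustration)."""
    return [apply(rot_y(-angle_at_row(r)), p) for p, r in points_with_rows]
```

Because the depth map shares the rolling shutter of the imaging chip, the same per-row timing model applies to the 3D points as to the pixels.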

Place, publisher, year, edition, pages
Barcelona, Spain, 2011
Series
International Conference on Computer Vision (ICCV), ISSN 1550-5499
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-77059 (URN) 10.1109/ICCV.2011.6126417 (DOI) 978-1-4577-1101-5 (ISBN)
Conference
IEEE International Conference on Computer Vision (ICCV11), 8-11 November 2011, Barcelona, Spain
Available from: 2012-05-07 Created: 2012-05-03 Last updated: 2015-12-10 Bibliographically approved
4. Co-alignment of Aerial Push-broom Strips using Trajectory Smoothness Constraints
2010 (English)Conference paper, Published paper (Other academic)
Abstract [en]

We study the problem of registering a sequence of scan lines (a strip) from an airborne push-broom imager to another sequence partly covering the same area. Such a registration has to compensate for deformations caused by attitude and speed changes in the aircraft. The registration is challenging, as both strips contain such deformations. Our algorithm estimates the 3D rotation of the camera for each scan line, by parametrising it as a linear spline with a number of knots evenly distributed in one of the strips. The rotations are estimated from correspondences between strips of the same area. Once the rotations are known, they can be compensated for, and each line of pixels can be transformed such that the ground traces of the two strips are registered with respect to each other.
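The estimation can be viewed as minimising a least-squares objective over the spline knots. This 1D sketch illustrates the structure of that objective under heavy simplification: one scalar angle per knot, a small-angle ground displacement of altitude × angle in the cross-track direction, and invented names throughout.

```python
def spline_angle(knots, line, lines_per_knot):
    """Linear-spline rotation angle for a scan line, with knot values
    evenly spaced every `lines_per_knot` lines along the strip."""
    i = min(int(line // lines_per_knot), len(knots) - 2)
    w = line / lines_per_knot - i
    return (1.0 - w) * knots[i] + w * knots[i + 1]

def registration_cost(correspondences, knots, lines_per_knot, altitude):
    """Least-squares objective for co-alignment: for each correspondence
    ((line_a, x_a), x_b), compensate strip A's cross-track coordinate by
    the small-angle ground displacement altitude * angle, and accumulate
    the squared residual against strip B's coordinate."""
    cost = 0.0
    for (line_a, x_a), x_b in correspondences:
        x_corr = x_a - altitude * spline_angle(knots, line_a, lines_per_knot)
        cost += (x_corr - x_b) ** 2
    return cost
```

A non-linear solver would adjust `knots` to drive this cost down; the spline itself supplies the trajectory smoothness constraint of the title, since neighbouring scan lines share knots.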

Place, publisher, year, edition, pages
Swedish Society for Automated Image Analysis, 2010
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:liu:diva-70706 (URN)
Conference
SSBA10, Symposium on Image Analysis, 11-12 March 2010, Uppsala
Available from: 2011-09-15 Created: 2011-09-15 Last updated: 2015-12-10 Bibliographically approved
5. Co-aligning Aerial Hyperspectral Push-broom Strips for Change Detection
2010 (English) In: Proc. SPIE 7835, Electro-Optical Remote Sensing, Photonic Technologies, and Applications IV / [ed] Gary W. Kamerman; Ove Steinvall; Keith L. Lewis; Richard C. Hollins; Thomas J. Merlet; Gary J. Bishop; John D. Gonglewski, SPIE - International Society for Optical Engineering, 2010, Art. nr. 7835B-36. Conference paper, Published paper (Refereed)
Abstract [en]

We have performed a field trial with an airborne push-broom hyperspectral sensor, making several flights over the same area with known changes (e.g., moved vehicles) between the flights. Each flight results in a sequence of scan lines forming an image strip, and in order to detect changes between two flights, the two resulting image strips must be geometrically aligned and radiometrically corrected. The focus of this paper is the geometric alignment, and we propose an image- and gyro-based method for geometric co-alignment (registration) of two image strips. The method is particularly useful when the sensor is not stabilised, thus reducing the need for expensive mechanical stabilisation. The method works in several steps, including gyro-based rectification, global alignment using SIFT matching, and local alignment using KLT tracking. Experimental results are shown but not quantified, as ground truth is, by the nature of the trial, lacking.
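The gyro-based rectification step amounts to integrating angular-rate samples into an orientation per scan line. A 1D sketch (a real MEMS gyro gives three axes and would be integrated into a 3D rotation; the function name and simple Euler integration are illustrative assumptions):

```python
def integrate_gyro(rates, dt):
    """Integrate angular-rate samples (rad/s), one per scan line, into a
    cumulative orientation angle per line, using simple Euler
    integration with sample spacing dt seconds."""
    angle, angles = 0.0, []
    for r in rates:
        angle += r * dt
        angles.append(angle)
    return angles
```

Each scan line can then be pre-warped by its integrated orientation before the image-based SIFT and KLT alignment stages refine the result.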

Place, publisher, year, edition, pages
SPIE - International Society for Optical Engineering, 2010
Series
Proceedings of SPIE, ISSN 0277-786X; 7835
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:liu:diva-70464 (URN) 10.1117/12.865034 (DOI) 978-0-8194-8353-9 (ISBN)
Conference
Electro-Optical Remote Sensing, Photonic Technologies, and Applications IV, 20-23 September 2010, Toulouse, France
Available from: 2011-09-13 Created: 2011-09-09 Last updated: 2015-12-10 Bibliographically approved

Open Access in DiVA

fulltext: FULLTEXT01.pdf (11762 kB)
cover: COVER01.pdf (1021 kB)

Authority records

Ringaby, Erik
