Geometric Models for Rolling-shutter and Push-broom Sensors
Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, The Institute of Technology.
2014 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Almost all cell-phones and camcorders sold today are equipped with a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and there is also a general trend to incorporate CMOS sensors in other types of cameras. The CMOS sensor has many advantages over the more conventional CCD (Charge-Coupled Device) sensor, such as lower power consumption, cheaper manufacturing and the potential for on-chip processing. Nearly all CMOS sensors make use of what is called a rolling-shutter readout. Unlike a global-shutter readout, which images all the pixels at the same time, a rolling shutter exposes the image row by row. If a mechanical shutter is not used, this leads to geometric distortions in the image when either the camera or the objects in the scene are moving. Smaller cameras, like those in cell-phones, do not have mechanical shutters, and systems that do have them will not use them when recording video. The result will look wobbly (the jello effect), skewed or otherwise strange, and this is often not desirable. In addition, many computer vision algorithms assume that the camera used has a global shutter and will break down if the distortions are too severe.
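
To make the row-wise timing concrete, the following minimal sketch (in Python, with illustrative numbers that are not taken from the thesis) assigns each image row its own capture time; under a global shutter every row would share a single timestamp.

```python
import numpy as np

def row_capture_times(frame_start, num_rows, readout_time):
    """Capture time of each image row under a rolling-shutter readout.

    frame_start  -- time at which the first row starts exposing (seconds)
    num_rows     -- number of image rows
    readout_time -- time needed to read out the whole frame (seconds)
    """
    rows = np.arange(num_rows)
    return frame_start + readout_time * rows / num_rows

# Illustrative numbers only: a 720-row frame read out in 30 ms.
times = row_capture_times(frame_start=0.0, num_rows=720, readout_time=0.030)
print(times[0], times[-1])  # first row at 0 s, last row almost 30 ms later
```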

In airborne remote sensing it is common to use push-broom sensors. These sensors exhibit a similar kind of distortion as that of a rolling-shutter camera, due to the motion of the aircraft. If the acquired images are to be registered to maps or other images, the distortions need to be suppressed.

The main contributions of this thesis are the development of three-dimensional models for rolling-shutter distortion correction. Previous attempts modelled the distortions as taking place in the image plane, and we have shown that our techniques give better results for hand-held camera motions. The basic idea is to estimate the camera motion, not only between frames, but also during frame capture. The motion is estimated using image correspondences, from which a non-linear optimisation problem is formulated and solved. All rows in a rolling-shutter image are imaged at different times, and when the motion is known, each row can be transformed to its rectified position. The same is true when using depth sensors such as the Microsoft Kinect, and the thesis describes how to estimate its 3D motion and how to rectify 3D point clouds.
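
As a minimal sketch of what "transforming each row to its rectified position" amounts to for a purely rotating camera (the calibration matrix and rotations below are placeholders, not values from the thesis), each pixel can be re-projected through the rotation estimated for its own row:

```python
import numpy as np

def rectify_pixel(x, y, K, R_row, R_ref):
    """Map a pixel from its rolling-shutter position to a rectified one.

    Assumes a purely rotating camera: the pixel is back-projected using the
    rotation R_row estimated for the capture time of its row, and then
    re-projected using a single reference rotation R_ref (e.g. the rotation
    at the middle row of the frame).
    """
    p = np.array([x, y, 1.0])
    H = K @ R_ref @ R_row.T @ np.linalg.inv(K)  # pure-rotation homography
    q = H @ p
    return q[0] / q[2], q[1] / q[2]

# Placeholder calibration and rotations (identity rotations = no motion).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
print(rectify_pixel(100.0, 50.0, K, np.eye(3), np.eye(3)))  # unchanged here
```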

The thesis also explores how techniques similar to those used in the rolling-shutter case can be applied to correct push-broom images. When a transformation has been found, the images need to be resampled to a regular grid in order to be visualised. This can be done in many ways, and different methods have been tested and adapted to the push-broom setup.
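
As a point of reference only (the thesis evaluates and adapts several interpolation methods for the push-broom geometry), the sketch below resamples synthetic scattered samples onto a regular grid with SciPy's generic scattered-data interpolation:

```python
import numpy as np
from scipy.interpolate import griddata

# Synthetic scattered samples: a ground position (x, y) and an intensity
# value per push-broom pixel. Real data would come from the transformation
# estimated for the strip.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 100.0, size=(5000, 2))
values = np.sin(xy[:, 0] / 10.0) + 0.1 * xy[:, 1]

# Regular grid that the image is resampled onto for visualisation.
gx, gy = np.meshgrid(np.linspace(0, 100, 256), np.linspace(0, 100, 256))

# Any scattered-data method can be plugged in here; griddata offers
# 'nearest', 'linear' and 'cubic' variants.
resampled = griddata(xy, values, (gx, gy), method='linear')
print(resampled.shape)  # (256, 256)
```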

In addition to rolling-shutter distortions, hand-held footage often suffers from shaky camera motion. It is possible to perform efficient video stabilisation in combination with the rectification by using rotation smoothing. Apart from these distortions, motion blur is a big problem for hand-held photography. The images will be blurry due to the camera motion, and also noisy if taken in low-light conditions. One of the contributions of the thesis is a method which uses gyroscope measurements and feature tracking to combine several images, taken with a smartphone, into one resulting image with less blur and noise. This enables the user to take photos which would otherwise have required a tripod.
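
A minimal sketch of the idea behind rotation smoothing for stabilisation (a simplification, not the exact scheme used in the thesis; all values are synthetic): low-pass filter the estimated camera rotations, and warp each frame by the difference between its original and smoothed rotation.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.spatial.transform import Rotation

def smooth_rotations(rotations, window=31):
    """Low-pass filter a sequence of camera rotations.

    The rotations are mapped to rotation vectors, filtered with a moving
    average, and mapped back. This is only a stand-in for a proper
    rotation-smoothing scheme, but it conveys the idea.
    """
    rotvecs = np.stack([r.as_rotvec() for r in rotations])
    filtered = uniform_filter1d(rotvecs, size=window, axis=0, mode='nearest')
    return [Rotation.from_rotvec(v) for v in filtered]

# Synthetic shaky trajectory: small random jitter on top of a slow pan.
rng = np.random.default_rng(1)
raw = [Rotation.from_rotvec([0.0, 0.002 * i, 0.0] + 0.01 * rng.standard_normal(3))
       for i in range(200)]
smooth = smooth_rotations(raw)
# Stabilising rotation for frame i: smooth[i] * raw[i]^-1, applied as a homography.
print((smooth[0] * raw[0].inv()).magnitude())
```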

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2014. 41 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1615
National Category
Computer Vision and Robotics (Autonomous Systems); Computer Engineering
Identifiers
URN: urn:nbn:se:liu:diva-110085
DOI: 10.3384/diss.diva-110085
ISBN: 978-91-7519-255-0 (print)
OAI: oai:DiVA.org:liu-110085
DiVA: diva2:742702
Public defence
2014-09-19, Visionen, hus B, Campus Valla, Linköpings universitet, Linköping, 10:15 (English)
Note

The research leading to this thesis has received funding from CENIIT through the Virtual Global Shutters for CMOS Cameras project.

Available from: 2014-09-02. Created: 2014-09-02. Last updated: 2015-12-10. Bibliographically approved
List of papers
1. Rectifying rolling shutter video from hand-held devices
2010 (English). In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010, Los Alamitos, CA, USA: IEEE Computer Society, 2010, 507-514 p. Conference paper, Published paper (Other academic)
Abstract [en]

This paper presents a method for rectifying video sequences from rolling shutter (RS) cameras. In contrast to previous RS rectification attempts, we model distortions as being caused by the 3D motion of the camera. The camera motion is parametrised as a continuous curve, with knots at the last row of each frame. Curve parameters are solved for using non-linear least squares over inter-frame correspondences obtained from a KLT tracker. We have generated synthetic RS sequences with associated ground truth to allow controlled evaluation. Using these sequences, we demonstrate that our algorithm improves over two previously published methods. The RS dataset is available on the web to allow comparison with other methods.
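
The structure of the optimisation can be sketched as follows (a self-contained toy with a rotation-only camera, two knots and synthetic correspondences; it is not the paper's implementation): the knot rotations are interpolated to each measurement time, correspondences give reprojection residuals, and the knot parameters are found by non-linear least squares.

```python
import numpy as np
from scipy.optimize import least_squares

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
Kinv = np.linalg.inv(K)

def rodrigues(v):
    """Rotation matrix from a rotation vector (Rodrigues' formula)."""
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return np.eye(3)
    k = v / theta
    Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * Kx + (1.0 - np.cos(theta)) * Kx @ Kx

def residuals(knots, corrs):
    """Reprojection residuals for correspondences (x1, y1, t1, x2, y2, t2).

    knots holds two stacked rotation vectors; the rotation at time t in
    [0, 1] is the linear interpolation of the two knot rotation vectors.
    """
    v0, v1 = knots[:3], knots[3:]
    res = []
    for x1, y1, t1, x2, y2, t2 in corrs:
        R1 = rodrigues((1.0 - t1) * v0 + t1 * v1)
        R2 = rodrigues((1.0 - t2) * v0 + t2 * v1)
        # Map the point from image 1 to image 2 through the two rotations.
        q = K @ R2.T @ R1 @ Kinv @ np.array([x1, y1, 1.0])
        res.extend([q[0] / q[2] - x2, q[1] / q[2] - y2])
    return np.array(res)

# Synthetic correspondences (in practice they come from a KLT tracker).
corrs = [(100.0, 50.0, 0.1, 100.0, 50.0, 0.9),
         (320.0, 240.0, 0.3, 320.0, 240.0, 0.7)]
fit = least_squares(residuals, x0=np.zeros(6), args=(corrs,))
print(fit.x)  # estimated knot rotation vectors
```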

Place, publisher, year, edition, pages
Los Alamitos, CA, USA: IEEE Computer Society, 2010
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-70572 (URN)
10.1109/CVPR.2010.5540173 (DOI)
978-1-4244-6984-0 (ISBN)
Conference
CVPR10, San Francisco, USA, June 13-18, 2010
Available from: 2011-09-13 Created: 2011-09-13 Last updated: 2015-12-10
2. Efficient Video Rectification and Stabilisation for Cell-Phones
2012 (English). In: International Journal of Computer Vision, ISSN 0920-5691, E-ISSN 1573-1405, Vol. 96, no 3, 335-352 p. Article in journal (Refereed) Published
Abstract [en]

This article presents a method for rectifying and stabilising video from cell-phones with rolling shutter (RS) cameras. Due to size constraints, cell-phone cameras have constant, or near constant, focal length, making them an ideal application for calibrated projective geometry. In contrast to previous RS rectification attempts that model distortions in the image plane, we model the 3D rotation of the camera. We parameterise the camera rotation as a continuous curve, with knots distributed across a short frame interval. Curve parameters are found using non-linear least squares over inter-frame correspondences from a KLT tracker. By smoothing a sequence of reference rotations from the estimated curve, we can, at a small extra cost, obtain a high-quality image stabilisation. Using synthetic RS sequences with associated ground truth, we demonstrate that our rectification improves over two other methods. We also compare our video stabilisation with the methods in iMovie and Deshaker.

Place, publisher, year, edition, pages
Springer Verlag (Germany), 2012
Keyword
Cell-phone, Rolling shutter, CMOS, Video stabilisation
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-75277 (URN)
10.1007/s11263-011-0465-8 (DOI)
000299769400005 ()
Note
Funding agencies: CENIIT organisation at Linköping Institute of Technology; Swedish Research Council. Available from: 2012-02-27. Created: 2012-02-24. Last updated: 2017-12-07
3. Scan Rectification for Structured Light Range Sensors with Rolling Shutters
2011 (English). In: IEEE International Conference on Computer Vision, Barcelona, Spain, 2011, 1575-1582 p. Conference paper, Published paper (Other academic)
Abstract [en]

Structured light range sensors, such as the Microsoft Kinect, have recently become popular as perception devices for computer vision and robotic systems. These sensors use CMOS imaging chips with electronic rolling shutters (ERS). When using such a sensor on a moving platform, both the image and the depth map will exhibit geometric distortions. We introduce an algorithm that can suppress such distortions by rectifying the 3D point clouds from the range sensor. This is done by first estimating the time-continuous 3D camera trajectory, and then transforming the 3D points to where they would have been if the camera had been stationary. To ensure that image and range data are synchronous, the camera trajectory is computed from KLT tracks on the structured-light frames, after suppressing the structured-light pattern. We evaluate our rectification by measuring angles between the visible sides of a cube, before and after rectification. We also measure how much better the 3D point clouds can be aligned after rectification. The obtained improvement is also related to the actual rotational velocity, measured using a MEMS gyroscope.
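
A minimal sketch (with placeholder poses) of what rectifying the 3D point cloud amounts to: each point is lifted into the world frame using the pose of the row it was measured on, and then re-expressed in a single reference pose.

```python
import numpy as np

def rectify_point_cloud(points, row_indices, rotations, translations, R_ref, t_ref):
    """Move each 3D point to where it would have been with a static sensor.

    points       -- (N, 3) points from the range sensor, in camera coordinates
    row_indices  -- image row each point was measured on (defines its time)
    rotations    -- per-row camera rotations (camera-to-world), indexed by row
    translations -- per-row camera positions in world coordinates, indexed by row
    R_ref, t_ref -- reference pose that the cloud is rectified to
    """
    rectified = np.empty_like(points)
    for i, (p, r) in enumerate(zip(points, row_indices)):
        world = rotations[r] @ p + translations[r]   # lift with the row's own pose
        rectified[i] = R_ref.T @ (world - t_ref)     # express in the reference pose
    return rectified

# Placeholder data: two points on different rows, identity poses.
pts = np.array([[0.1, 0.2, 1.5], [-0.3, 0.0, 2.0]])
rows = np.array([10, 400])
Rs = {10: np.eye(3), 400: np.eye(3)}
ts = {10: np.zeros(3), 400: np.zeros(3)}
print(rectify_point_cloud(pts, rows, Rs, ts, np.eye(3), np.zeros(3)))
```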

Place, publisher, year, edition, pages
Barcelona, Spain, 2011
Series
International Conference on Computer Vision (ICCV), ISSN 1550-5499
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-77059 (URN)
10.1109/ICCV.2011.6126417 (DOI)
978-1-4577-1101-5 (ISBN)
Conference
IEEE International Conference on Computer Vision (ICCV11), 8-11 November 2011, Barcelona, Spain
Available from: 2012-05-07. Created: 2012-05-03. Last updated: 2015-12-10. Bibliographically approved
4. Co-alignment of Aerial Push-broom Strips using Trajectory Smoothness Constraints
2010 (English). Conference paper, Published paper (Other academic)
Abstract [en]

We study the problem of registering a sequence of scan lines (a strip) from an airborne push-broom imager to another sequence partly covering the same area. Such a registration has to compensate for deformations caused by attitude and speed changes in the aircraft. The registration is challenging, as both strips contain such deformations. Our algorithm estimates the 3D rotation of the camera for each scan line, by parametrising it as a linear spline with a number of knots evenly distributed in one of the strips. The rotations are estimated from correspondences between strips of the same area. Once the rotations are known, they can be compensated for, and each line of pixels can be transformed such that the ground traces of the two strips are registered with respect to each other.

Place, publisher, year, edition, pages
Swedish Society for Automated Image Analysis, 2010
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:liu:diva-70706 (URN)
Conference
SSBA10, Symposium on Image Analysis, 11-12 March 2010, Uppsala
Available from: 2011-09-15. Created: 2011-09-15. Last updated: 2015-12-10. Bibliographically approved
5. Anisotropic Scattered Data Interpolation for Pushbroom Image Rectification
2014 (English). In: IEEE Transactions on Image Processing, ISSN 1057-7149, E-ISSN 1941-0042, Vol. 23, no 5, 2302-2314 p. Article in journal (Refereed) Published
Abstract [en]

This article deals with fast and accurate visualization of pushbroom image data from airborne and spaceborne platforms. A pushbroom sensor acquires images in a line-scanning fashion, and this results in scattered input data that needs to be resampled onto a uniform grid for geometrically correct visualization. To this end, we model the anisotropic spatial dependence structure caused by the acquisition process. Several methods for scattered data interpolation are then adapted to handle the induced anisotropic metric and compared for the pushbroom image rectification problem. A trick that exploits the semi-ordered line structure of pushbroom data to improve the computational complexity by several orders of magnitude is also presented.
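
To illustrate only the idea of an induced anisotropic metric (the article adapts several interpolation methods; this sketch uses plain nearest-neighbour interpolation with made-up scale factors): the along-track and across-track coordinates are scaled differently before distances are measured.

```python
import numpy as np
from scipy.spatial import cKDTree

def anisotropic_nearest(sample_xy, sample_values, query_xy,
                        scale_across=1.0, scale_along=0.25):
    """Nearest-neighbour interpolation under a simple anisotropic metric.

    Coordinates are scaled differently along and across the scan direction
    before the kd-tree is built, so that 'closest' reflects an anisotropic
    dependence structure. The scale factors are illustrative only.
    """
    S = np.diag([scale_across, scale_along])
    tree = cKDTree(sample_xy @ S)
    _, idx = tree.query(query_xy @ S)
    return sample_values[idx]

# Synthetic scattered pushbroom samples and a small query grid.
rng = np.random.default_rng(2)
xy = rng.uniform(0.0, 100.0, size=(2000, 2))
vals = np.cos(xy[:, 0] / 5.0)
gx, gy = np.meshgrid(np.linspace(0, 100, 64), np.linspace(0, 100, 64))
grid = np.column_stack([gx.ravel(), gy.ravel()])
print(anisotropic_nearest(xy, vals, grid).reshape(64, 64).shape)  # (64, 64)
```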

Place, publisher, year, edition, pages
IEEE, 2014
Keyword
pushbroom, rectification, hyperspectral, interpolation, anisotropic, scattered data
National Category
Engineering and Technology; Electrical Engineering, Electronic Engineering, Information Engineering; Signal Processing
Identifiers
urn:nbn:se:liu:diva-108105 (URN)
10.1109/TIP.2014.2316377 (DOI)
000350284400001 ()
Available from: 2014-06-25. Created: 2014-06-25. Last updated: 2017-12-05. Bibliographically approved
6. A Virtual Tripod for Hand-held Video Stacking on Smartphones
2014 (English). In: 2014 IEEE International Conference on Computational Photography (ICCP), IEEE, 2014. Conference paper, Published paper (Refereed)
Abstract [en]

We propose an algorithm that can capture sharp, low-noise images in low-light conditions on a hand-held smartphone. We make use of the recent ability to acquire bursts of high-resolution images on high-end models such as the iPhone 5s. Frames are aligned, or stacked, using rolling shutter correction, based on motion estimated from the built-in gyro sensors and image feature tracking. After stacking, the images may be combined, using e.g. averaging, to produce a sharp, low-noise photo. We have tested the algorithm on a variety of different scenes, using several different smartphones. We compare our method to denoising, direct stacking, as well as global-shutter based stacking, with favourable results.
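
A schematic of the two building blocks named in the abstract, using synthetic data only (the image-warping step between them is omitted): integrating gyroscope rates into per-sample rotations, and averaging a burst of already-aligned frames into one low-noise image.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def integrate_gyro(omega, timestamps):
    """Integrate gyroscope rates (rad/s) into one rotation per sample,
    relative to the first sample."""
    rots = [Rotation.identity()]
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        rots.append(rots[-1] * Rotation.from_rotvec(omega[i] * dt))
    return rots

def stack_aligned(frames):
    """Average a burst of frames that have already been warped to a common
    reference pose (the 'virtual tripod')."""
    return np.mean(np.stack(frames), axis=0)

# Synthetic gyro samples and a burst of constant grey frames.
ts = np.linspace(0.0, 0.5, 100)
omega = 0.05 * np.ones((100, 3))
poses = integrate_gyro(omega, ts)
frames = [np.full((480, 640), 0.5) for _ in range(8)]
print(stack_aligned(frames).shape, poses[-1].magnitude())
```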

Place, publisher, year, edition, pages
IEEE, 2014
Series
IEEE International Conference on Computational Photography, ISSN 2164-9774
National Category
Engineering and Technology; Electrical Engineering, Electronic Engineering, Information Engineering; Signal Processing
Identifiers
urn:nbn:se:liu:diva-108109 (URN)
10.1109/ICCPHOT.2014.6831799 (DOI)
000356494100001 ()
978-1-4799-5188-8 (ISBN)
Conference
IEEE International Conference on Computational Photography (ICCP 2014), May 2-4, 2014, Intel, Santa Clara, USA
Projects
VPS
Available from: 2014-06-25. Created: 2014-06-25. Last updated: 2015-12-10. Bibliographically approved

Open Access in DiVA

Fulltext: FULLTEXT01.pdf (4384 kB), application/pdf
Cover: COVER01.pdf (8921 kB), application/pdf

Other links
Publisher's full text

Authority records
Ringaby, Erik

