Co-aligning Aerial Hyperspectral Push-broom Strips for Change Detection
Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, The Institute of Technology.
FOI, Swedish Defence Research Agency, Linköping, Sweden. ORCID iD: 0000-0002-6763-5487
FOI, Swedish Defence Research Agency, Linköping, Sweden.
Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, The Institute of Technology. ORCID iD: 0000-0002-5698-5983
2010 (English). In: Proc. SPIE 7835, Electro-Optical Remote Sensing, Photonic Technologies, and Applications IV / [ed] Gary W. Kamerman; Ove Steinvall; Keith L. Lewis; Richard C. Hollins; Thomas J. Merlet; Gary J. Bishop; John D. Gonglewski. SPIE - International Society for Optical Engineering, 2010, Art. no. 7835B-36. Conference paper, Published paper (Refereed)
Abstract [en]

We have performed a field trial with an airborne push-broom hyperspectral sensor, making several flights over the same area and with known changes (e.g., moved vehicles) between the flights. Each flight results in a sequence of scan lines forming an image strip, and in order to detect changes between two flights, the two resulting image strips must be geometrically aligned and radiometrically corrected. The focus of this paper is the geometrical alignment, and we propose an image- and gyro-based method for geometric co-alignment (registration) of two image strips. The method is particularly useful when the sensor is not stabilized, thus reducing the need for expensive mechanical stabilization. The method works in several steps, including gyro-based rectification, global alignment using SIFT matching, and local alignment using KLT tracking. Experimental results are shown but not quantified, since ground truth is, by the nature of the trial, lacking.
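The global-alignment step can be illustrated with a small sketch: given point correspondences between two strips (which in the paper come from SIFT matching), a global transform can be fitted by least squares. This is a toy illustration, not the paper's implementation; the correspondences below are synthetic, and a simple affine model stands in for whatever model the actual method fits.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares affine transform mapping src points onto dst points.

    src, dst: (N, 2) arrays of matched coordinates (in the paper these
    would come from SIFT matches between the two strips; here they are
    synthetic). Returns a 2x3 matrix A = [R | t] such that
    dst ≈ [x, y, 1] @ A.T for each source point (x, y).
    """
    n = src.shape[0]
    src_h = np.hstack([src, np.ones((n, 1))])        # homogeneous (N, 3)
    M, *_ = np.linalg.lstsq(src_h, dst, rcond=None)  # (3, 2) solution
    return M.T                                        # (2, 3)

# Synthetic matches: a small rotation plus a shift of (5, -3) pixels
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, size=(50, 2))
theta = 0.02
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
dst = src @ R.T + np.array([5.0, -3.0])

A = estimate_affine(src, dst)
print(np.round(A[:, 2], 2))  # recovered translation, ≈ [ 5. -3.]
```

In practice the matches are noisy and contain outliers, so a robust estimator (e.g. RANSAC around this least-squares fit) would be used rather than a plain fit.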

Place, publisher, year, edition, pages
SPIE - International Society for Optical Engineering, 2010. Art. no. 7835B-36.
Series
Proceedings Spie, ISSN 0277-786X ; 7835
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:liu:diva-70464
DOI: 10.1117/12.865034
ISBN: 978-0-8194-8353-9 (print)
OAI: oai:DiVA.org:liu-70464
DiVA: diva2:440482
Conference
Electro-Optical Remote Sensing, Photonic Technologies, and Applications IV, 20-23 September 2010, Toulouse, France
Available from: 2011-09-13. Created: 2011-09-09. Last updated: 2015-12-10. Bibliographically approved.
In thesis
1. Geometric Computer Vision for Rolling-shutter and Push-broom Sensors
2012 (English). Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Almost all cell phones and camcorders sold today are equipped with a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and there is also a general trend towards incorporating CMOS sensors in other types of cameras. The sensor has many advantages over the more conventional CCD (Charge-Coupled Device) sensor, such as lower power consumption, cheaper manufacturing, and the potential for on-chip processing. Almost all CMOS sensors make use of what is called a rolling shutter. Compared to a global shutter, which images all pixels at the same time, a rolling-shutter camera exposes the image row by row. This leads to geometric distortions in the image when either the camera or the objects in the scene are moving. The recorded videos and images will look wobbly (the jello effect), skewed, or otherwise strange, and this is often not desirable. In addition, many computer vision algorithms assume that the camera used has a global shutter, and will break down if the distortions are too severe.

In airborne remote sensing it is common to use push-broom sensors. These sensors exhibit a similar kind of distortion as a rolling-shutter camera, due to the motion of the aircraft. If the acquired images are to be matched with maps or other images, then the distortions need to be suppressed.

The main contribution of this thesis is the development of three-dimensional models for rolling-shutter distortion correction. Previous attempts modelled the distortions as taking place in the image plane, and we have shown that our techniques give better results for hand-held camera motions.

The basic idea is to estimate the camera motion, not only between frames, but also the motion during frame capture. The motion can be estimated using inter-frame image correspondences and with these a non-linear optimisation problem can be formulated and solved. All rows in the rolling-shutter image are imaged at different times, and when the motion is known, each row can be transformed to the rectified position.
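The row-wise rectification described above can be sketched in a toy form. Assuming the camera motion during frame capture is a known, purely horizontal translation per row (the real methods estimate a full 3D camera rotation per row; this integer pixel shift is only an illustration of the idea), each row is shifted back to its rectified position:

```python
import numpy as np

def rectify_rows(image, row_shift):
    """Toy per-row rectification for a rolling-shutter image.

    Each row r was imaged at a different time; row_shift[r] is the
    (assumed known) horizontal camera displacement in pixels at that
    time. Shifting each row back by -row_shift[r] undoes the skew.
    """
    out = np.empty_like(image)
    for r in range(image.shape[0]):
        out[r] = np.roll(image[r], -int(round(row_shift[r])))
    return out

# A vertical bar, skewed by constant horizontal motion during capture
h, w = 6, 12
img = np.zeros((h, w), dtype=int)
shifts = np.arange(h)           # row r was displaced by r pixels
for r in range(h):
    img[r, 4 + r] = 1           # the skewed bar
rect = rectify_rows(img, shifts)
print(rect[:, 4])               # prints [1 1 1 1 1 1]: the bar is vertical again
```

With a full 3D model, the per-row shift is replaced by warping each row with the camera pose estimated for its capture time, but the structure (one transform per row, applied independently) is the same.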

In addition to rolling-shutter distortions, hand-held footage often suffers from shaky camera motion. We have shown how to perform efficient video stabilisation, in combination with the rectification, using rotation smoothing.
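The rotation-smoothing idea can be sketched in one dimension. Assuming the estimated camera trajectory is a per-frame rotation angle (the thesis smooths full 3D rotations; a 1D moving average is only a stand-in), the difference between the raw and the smoothed trajectory gives the compensating warp to apply to each frame:

```python
import numpy as np

def smooth_angles(angles, radius=2):
    """Moving-average smoothing of a per-frame rotation-angle sequence.

    A 1D stand-in for rotation smoothing in video stabilisation; real
    methods smooth 3D rotations. Edge padding keeps the output the
    same length as the input.
    """
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    padded = np.pad(angles, radius, mode='edge')
    return np.convolve(padded, kernel, mode='valid')

raw = np.array([0.0, 0.5, -0.4, 0.6, -0.5, 0.4, 0.0])  # shaky trajectory
smooth = smooth_angles(raw)
correction = smooth - raw   # rotation to apply to each frame
```

The smoothed trajectory has much lower variance than the raw one, which is exactly the stabilising effect: frames are warped from the shaky trajectory onto the smooth one.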

The thesis also explores how to use techniques similar to those for the rolling-shutter case to correct push-broom images, and how to rectify 3D point clouds from, e.g., the Kinect depth sensor.

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2012. 85 p.
Series
Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971 ; 1535
Keyword
rolling shutter, CMOS, video, rectification, stabilisation, push-broom, Kinect
National Category
Engineering and Technology; Computer Vision and Robotics (Autonomous Systems); Signal Processing
Identifiers
URN: urn:nbn:se:liu:diva-77391
ISBN: 978-91-7519-872-9
Presentation
2012-06-08, Visionen, Hus B, Campus Valla, Linköpings universitet, Linköping, 13:00 (English)
Projects
VGS
Available from: 2012-05-28. Created: 2012-05-14. Last updated: 2015-12-10. Bibliographically approved.

Open Access in DiVA

No full text


Authority records

Ringaby, Erik; Ahlberg, Jörgen; Wadströmer, Niclas; Forssén, Per-Erik
