Incident Light Fields
Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology. ORCID iD: 0000-0002-7765-1747
2009 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Image based lighting (IBL) is a computer graphics technique for creating photorealistic renderings of synthetic objects such that they can be placed into real world scenes. IBL has been widely recognized and is today used in commercial production pipelines. However, the current techniques only use illumination captured at a single point in space. This means that traditional IBL cannot capture or recreate effects such as cast shadows, shafts of light or other important spatial variations in the illumination. Such lighting effects are, in many cases, artistically created or are there to emphasize certain features, and are therefore a very important part of the visual appearance of a scene.

This thesis and the included papers present methods that extend IBL to allow for capture and rendering with spatially varying illumination. This is accomplished by measuring the light field incident onto a region in space, called an Incident Light Field (ILF), and using it as illumination in renderings. This requires the illumination to be captured at a large number of points in space instead of just one, which significantly increases the complexity of the capture methods and rendering algorithms.

The technique for measuring spatially varying illumination in real scenes is based on capture of High Dynamic Range (HDR) image sequences. For efficient measurement, the image capture is performed at video frame rates. The captured illumination information in the image sequences is processed such that it can be used in computer graphics rendering. By extracting high intensity regions from the captured data and representing them separately, this thesis also describes a technique for increasing rendering efficiency and methods for editing the captured illumination, for example artificially moving or turning on and off individual light sources.
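Since the capture pipeline builds on HDR image sequences, the sketch below illustrates the general idea of assembling one HDR radiance image from a bracketed exposure stack. It assumes a linear sensor response; the exposure times, weighting function and array sizes are illustrative assumptions, not values taken from the thesis.

```python
import numpy as np

def assemble_hdr(exposures, times):
    """exposures: list of LDR images in [0, 1] (linear sensor assumed); times: exposure times in seconds."""
    num = np.zeros_like(exposures[0])
    den = np.zeros_like(exposures[0])
    for img, t in zip(exposures, times):
        w = 1.0 - np.abs(2.0 * img - 1.0)    # hat weighting: trust mid-range pixels most
        num += w * img / t                   # per-exposure radiance estimate
        den += w
    return num / np.maximum(den, 1e-6)       # weighted average radiance

# Example: three synthetic exposures of the same scene, one f-stop apart.
rng = np.random.default_rng(0)
radiance = rng.uniform(0.0, 50.0, size=(4, 4))        # synthetic ground-truth radiance
times = [1 / 1000, 1 / 500, 1 / 250]
stack = [np.clip(radiance * t, 0.0, 1.0) for t in times]
print(assemble_hdr(stack, times))
```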

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2009. 97 p.
Series
Linköping Studies in Science and Technology. Dissertations, ISSN 0345-7524 ; 1233
Keyword [en]
Computer Graphics, Image Based Lighting, Photorealistic Rendering, Light Fields, High Dynamic Range Imaging
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-16287
ISBN: 978-91-7393-717-7 (print)
OAI: oai:DiVA.org:liu-16287
DiVA: diva2:133606
Public defence
2009-01-30, K3, Kåkenhus, Campus Norrköping, Linköpings universitet, Norrköping, 10:15 (English)
Available from: 2009-01-13 Created: 2009-01-13 Last updated: 2016-08-31. Bibliographically approved.
List of papers
1. Capturing and Rendering with Incident Light Fields
2003 (English). In: EGSR’03, The 14th Eurographics Symposium on Rendering 2003, Leuven, Belgium, 2003. Conference paper, Published paper (Refereed).
Abstract [en]

This paper presents a process for capturing spatially and directionally varying illumination from a real-world scene and using this lighting to illuminate computer-generated objects. We use two devices for capturing such illumination. In the first we photograph an array of mirrored spheres in high dynamic range to capture the spatially varying illumination. In the second, we obtain higher resolution data by capturing images with a high dynamic range omnidirectional camera as it traverses across a plane. For both methods we apply the light field technique to extrapolate the incident illumination to a volume. We render computer-generated objects as illuminated by this captured illumination using a custom shader within an existing global illumination rendering system. To demonstrate our technique we capture several spatially-varying lighting environments with spotlights, shadows, and dappled lighting and use them to illuminate synthetic scenes. We also show comparisons to real objects under the same illumination.
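To make the light-field extrapolation step concrete, here is a minimal sketch of querying the incident illumination at an arbitrary point: the query ray is traced back to the capture plane and the nearest recorded probe is sampled along that direction. The nearest-neighbour interpolation, the z = 0 capture plane and all names are illustrative assumptions; the paper's own renderer implements this as a custom shader inside a global illumination system.

```python
import numpy as np

def query_illumination(point, direction, probe_positions, probe_lookup):
    """point, direction: 3-vectors; the probes are assumed to lie on the plane z = 0."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    t = -point[2] / d[2]                          # ray-plane intersection with z = 0
    hit = np.asarray(point, dtype=float) + t * d  # where the ray crosses the capture plane
    idx = int(np.argmin(np.linalg.norm(probe_positions - hit[:2], axis=1)))
    return probe_lookup(idx, d)                   # radiance the nearest probe saw along d

# Toy example: the left probe sees a red source, the right probe a blue one.
probes = np.array([[-1.0, 0.0], [1.0, 0.0]])
lookup = lambda i, d: np.array([1.0, 0.0, 0.0]) if i == 0 else np.array([0.0, 0.0, 1.0])
print(query_illumination([0.5, 0.0, 2.0], [0.0, 0.0, -1.0], probes, lookup))
```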

National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-16281 (URN)
Available from: 2009-01-13 Created: 2009-01-13 Last updated: 2015-09-22. Bibliographically approved.
2. A Real Time Light Probe
2004 (English). In: The 25th Eurographics Annual Conference 2004, Short Papers and Interactive Applications, Grenoble, France, 2004. Conference paper, Published paper (Refereed).
Abstract [en]

We present a novel system capable of capturing high dynamic range (HDR) Light Probes at video speed. Each Light Probe frame is built from an individual full set of exposures, all of which are captured within the frame time. The exposures are processed and assembled into a mantissa-exponent representation image within the camera unit before output, and then streamed to a standard PC. As an example, the system is capable of capturing Light Probe Images with a resolution of 512x512 pixels using a set of 10 exposures covering 15 f-stops at a frame rate of up to 25 final HDR frames per second. The system is built around commercial special-purpose camera hardware with on-chip programmable image processing logic and tightly integrated frame buffer memory, and the algorithm is implemented as custom downloadable microcode software.
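As a rough illustration of the mantissa-exponent representation mentioned above, the sketch below reconstructs a linear radiance value from a (mantissa, exponent) pair. The 8-bit mantissa, zero bias and function name are assumptions for illustration; the abstract does not specify the actual on-camera bit layout.

```python
def decode_me_pixel(mantissa: int, exponent: int, bias: int = 0) -> float:
    """Reconstruct a linear radiance value from a mantissa-exponent pixel (8-bit mantissa assumed)."""
    return (mantissa / 255.0) * 2.0 ** (exponent - bias)

# 15 f-stops of range: the same mantissa at exponents 0 and 15 differs by a factor of 2**15.
print(decode_me_pixel(200, 0), decode_me_pixel(200, 15))
```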

National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-16282 (URN)
Conference
Eurographics Annual Conference 2004
Available from: 2009-01-13 Created: 2009-01-13 Last updated: 2015-09-22. Bibliographically approved.
3. Performance Relighting and Reflectance Transformation with Time-Multiplexed Illumination
2005 (English). In: ACM Transactions on Graphics, ISSN 0730-0301, E-ISSN 1557-7368, Vol. 24, no. 3. Article in journal (Refereed). Published.
Abstract [en]

We present a technique for capturing an actor’s live-action performance in such a way that the lighting and reflectance of the actor can be designed and modified in postproduction. Our approach is to illuminate the subject with a sequence of time-multiplexed basis lighting conditions, and to record these conditions with a high-speed video camera so that many conditions are recorded in the span of the desired output frame interval. We investigate several lighting bases for representing the sphere of incident illumination using a set of discrete LED light sources, and we estimate and compensate for subject motion using optical flow and image warping based on a set of tracking frames inserted into the lighting basis. To composite the illuminated performance into a new background, we include a time-multiplexed matte within the basis. We also show that the acquired data enables time-varying surface normals, albedo, and ambient occlusion to be estimated, which can be used to transform the actor’s reflectance to produce both subtle and stylistic effects.
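The core relighting step described above amounts to forming a weighted sum of the recorded basis images. A minimal sketch, with illustrative weights and array shapes; the motion compensation and matting from the paper are omitted.

```python
import numpy as np

def relight(basis_frames, weights):
    """basis_frames: (N, H, W, 3) images, one per basis lighting condition; weights: (N,)."""
    return np.tensordot(weights, basis_frames, axes=1)   # weighted sum over the basis axis

# Toy example: three basis lights; the new lighting is mostly light 0 plus a little of light 2.
basis = np.random.default_rng(1).uniform(size=(3, 2, 2, 3))
relit = relight(basis, np.array([0.8, 0.0, 0.3]))
print(relit.shape)   # (2, 2, 3): one relit output frame
```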

Keyword
Relighting, compositing, environmental illumination, image-based rendering, reflectance models
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-16283 (URN)
10.1145/1073204.1073258 (DOI)
Available from: 2009-01-13 Created: 2009-01-13 Last updated: 2017-12-14. Bibliographically approved.
4. Densely Sampled Light Probe Sequences for Spatially Variant Image Based Lighting
2006 (English). In: The 4th International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, Kuala Lumpur, Malaysia, 2006, pp. 341-347. Conference paper, Published paper (Refereed).
Abstract [en]

We present a novel technique for capturing spatially and temporally resolved light probe sequences, and using them for rendering. For this purpose we have designed and built a Real Time Light Probe: a catadioptric imaging system that can capture the full dynamic range of the lighting incident at each point in space at video frame rates, while being moved through a scene. The Real Time Light Probe uses a digital imaging system which we have programmed to capture high quality, photometrically accurate color images with a dynamic range of 10,000,000:1 at 25 frames per second.

By tracking the position and orientation of the light probe, it is possible to transform each light probe into a common frame of reference in world coordinates, and map each point in space along the path of motion to a particular frame in the light probe sequence. We demonstrate our technique by rendering synthetic objects illuminated by complex real world lighting, using both traditional image based lighting methods with temporally varying light probe illumination and an extension to handle spatially varying lighting conditions across large objects.
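The mapping from a point in space to a frame in the light probe sequence can be illustrated with a nearest-neighbour lookup along the tracked capture path. The one-position-per-frame tracking layout and the example path are assumptions made purely for illustration.

```python
import numpy as np

def nearest_probe_frame(point, probe_path):
    """probe_path: (N, 3) tracked probe positions, one per HDR video frame."""
    return int(np.argmin(np.linalg.norm(probe_path - np.asarray(point, dtype=float), axis=1)))

# Toy example: a straight 1 m capture path sampled at 25 probe positions.
path = np.stack([np.linspace(0.0, 1.0, 25), np.zeros(25), np.zeros(25)], axis=1)
print(nearest_probe_frame([0.42, 0.0, 0.1], path))   # -> 10, the probe position nearest x = 0.42
```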

Keyword
HDR, Video, Image Based Lighting
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-16284 (URN)
10.1145/1174429.1174487 (DOI)
1-59593-564-9 (ISBN)
Available from: 2009-01-13 Created: 2009-01-13 Last updated: 2015-09-22. Bibliographically approved.
5. Spatially Varying Image Based Lighting by Light Probe Sequences, Capture, Processing and Rendering
2007 (English). In: The Visual Computer, ISSN 0178-2789, E-ISSN 1432-2315, Vol. 23, no. 7, pp. 453-465. Article in journal (Refereed). Published.
Abstract [en]

We present a novel technique for capturing spatially or temporally resolved light probe sequences, and using them for image based lighting. For this purpose we have designed and built a real-time light probe, a catadioptric imaging system that can capture the full dynamic range of the lighting incident at each point in space at video frame rates, while being moved through a scene. The real-time light probe uses a digital imaging system which we have programmed to capture high quality, photometrically accurate color images of 512×512 pixels with a dynamic range of 10,000,000:1 at 25 frames per second.

By tracking the position and orientation of the light probe, it is possible to transform each light probe into a common frame of reference in world coordinates, and map each point and direction in space along the path of motion to a particular frame and pixel in the light probe sequence. We demonstrate our technique by rendering synthetic objects illuminated by complex real world lighting, first by using traditional image based lighting methods and temporally varying light probe illumination, and second by using an extension that handles spatially varying lighting conditions across large objects and object motion along an extended path.
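To illustrate the "point and direction to frame and pixel" mapping, the sketch below converts a world-space direction into pixel coordinates of a 512×512 light probe image. An angular-map parameterization is assumed purely for illustration; the actual catadioptric (mirror-based) probe would use its own calibrated mapping.

```python
import numpy as np

def direction_to_pixel(d, size=512):
    """Map a world-space direction to (row, col) in a size x size angular-map probe image."""
    dx, dy, dz = np.asarray(d, dtype=float) / np.linalg.norm(d)
    r = np.arccos(np.clip(dz, -1.0, 1.0)) / (np.pi * max(np.hypot(dx, dy), 1e-9))
    u, v = dx * r, dy * r                       # in [-1, 1]; the forward axis maps to the centre
    col = int((u * 0.5 + 0.5) * (size - 1))
    row = int((v * 0.5 + 0.5) * (size - 1))
    return row, col

print(direction_to_pixel([0.0, 0.0, 1.0]))      # forward direction -> image centre
print(direction_to_pixel([1.0, 0.0, 0.0]))      # sideways -> halfway towards the right edge
```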

Place, publisher, year, edition, pages
Springer Link, 2007
Keyword
High dynamic range imaging, Image based lighting
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-16285 (URN)
10.1007/s00371-007-0127-6 (DOI)
Available from: 2009-01-13 Created: 2009-01-13 Last updated: 2017-12-14. Bibliographically approved.
6. Free Form Incident Light Fields
2008 (English). In: Computer Graphics Forum (Print), ISSN 0167-7055, E-ISSN 1467-8659, Vol. 27, no. 4, pp. 1293-1301. Article in journal (Refereed). Published.
Abstract [en]

This paper presents methods for photo-realistic rendering using strongly spatially variant illumination captured from real scenes. The illumination is captured along arbitrary paths in space using a high dynamic range, HDR, video camera system with position tracking. Light samples are rearranged into 4-D incident light fields (ILF) suitable for direct use as illumination in renderings. Analysis of the captured data allows for estimation of the shape, position and spatial and angular properties of light sources in the scene. The estimated light sources can be extracted from the large 4D data set and handled separately to render scenes more efficiently and with higher quality. The ILF lighting can also be edited for detailed artistic control.
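As a rough illustration of separating high-intensity regions (direct light sources) from the rest of the captured data so they can be rendered and edited separately, here is a minimal sketch; the threshold and the single-cluster centroid estimate are illustrative simplifications of the analysis described in the paper.

```python
import numpy as np

def split_light_source(sample_positions, sample_radiance, threshold):
    """Return (estimated source position, boolean mask of high-intensity samples)."""
    luminance = sample_radiance.mean(axis=1)            # crude luminance per sample
    mask = luminance > threshold
    centroid = sample_positions[mask].mean(axis=0) if mask.any() else None
    return centroid, mask

# Toy example: a dim background sample plus one bright cluster of samples near x = 2.
pos = np.array([[0.0, 0.0, 0.0], [2.0, 0.1, 0.0], [2.1, -0.1, 0.0]])
rad = np.array([[0.2, 0.2, 0.2], [50.0, 48.0, 45.0], [52.0, 49.0, 47.0]])
print(split_light_source(pos, rad, threshold=10.0))
```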

Place, publisher, year, edition, pages
Wiley InterScience, 2008
Keyword
Three-Dimensional Graphics and Realism, Digitization and Image Capture
National Category
Engineering and Technology
Identifiers
urn:nbn:se:liu:diva-16286 (URN)
10.1111/j.1467-8659.2008.01268.x (DOI)
Available from: 2009-01-13 Created: 2009-01-13 Last updated: 2017-12-14. Bibliographically approved.

Open Access in DiVA

Incident Light Fields (2228 kB), 1715 downloads
File information
File name: FULLTEXT01.pdf. File size: 2228 kB. Checksum: SHA-512
3f6bbba34f6e90da8b578255339537086aff32906a4a948ec25454d1c49c9508feb2ab5af4c2059a294c284b64b6c43df4a230851daf4a5b9bd8ccd27130ae88
Type: fulltext. Mimetype: application/pdf
Cover (135 kB), 98 downloads
File information
File name: COVER01.pdf. File size: 135 kB. Checksum: SHA-512
32eb1af675e6a07774795dbc54d44d014458a8a9860ee7b16e1cf8faca139ce3abce5fb99bbd8d0bc482413c6afdfd7fa234fc02ffff96af7f187de19d30d964
Type: cover. Mimetype: application/pdf

Authority records

Unger, Jonas
