liu.se - Search for publications in DiVA
Results 51 - 62 of 62
  • 51.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Gustavson, Stefan
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Ynnerman, Anders
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Densely Sampled Light Probe Sequences for Spatially Variant Image Based Lighting, 2006. In: The 4th International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, Kuala Lumpur, Malaysia, 2006, p. 341-347. Conference paper (Refereed)
    Abstract [en]

    We present a novel technique for capturing spatially and temporally resolved light probe sequences, and using them for rendering. For this purpose we have designed and built a Real Time Light Probe: a catadioptric imaging system that can capture the full dynamic range of the lighting incident at each point in space at video frame rates, while being moved through a scene. The Real Time Light Probe uses a digital imaging system which we have programmed to capture high-quality, photometrically accurate color images with a dynamic range of 10,000,000:1 at 25 frames per second.

    By tracking the position and orientation of the light probe, it is possible to transform each light probe into a common frame of reference in world coordinates, and map each point in space along the path of motion to a particular frame in the light probe sequence. We demonstrate our technique by rendering synthetic objects illuminated by complex real world lighting, using both traditional image based lighting methods with temporally varying light probe illumination and an extension to handle spatially varying lighting conditions across large objects.
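
    The paper itself contains no code, but the mapping it describes can be sketched. The following Python snippet illustrates rotating a probe-local direction into the common world frame and mapping a shading point to the nearest captured frame along the path; all function and variable names are hypothetical, and a real implementation would interpolate between frames rather than pick the nearest one.

```python
import numpy as np

def nearest_probe_frame(query_point, probe_positions):
    """Map a world-space point to the index of the light probe frame
    captured closest to it along the path of motion (nearest-neighbor
    selection is an assumption of this sketch)."""
    dists = np.linalg.norm(probe_positions - query_point, axis=1)
    return int(np.argmin(dists))

def probe_dir_to_world(R_probe_to_world, d_local):
    """Rotate a direction sampled in the probe's local frame into the
    common world frame, using the tracked orientation of the probe."""
    return R_probe_to_world @ d_local

# Example: three tracked probe positions along the path of motion.
probe_positions = np.array([[0.0, 0.0, 0.0],
                            [0.1, 0.0, 0.0],
                            [0.2, 0.0, 0.0]])
frame = nearest_probe_frame(np.array([0.12, 0.0, 0.0]), probe_positions)
d_world = probe_dir_to_world(np.eye(3), np.array([0.0, 0.0, 1.0]))
```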

  • 52.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Gustavson, Stefan
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Ynnerman, Anders
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    High Dynamic Range Video for Photometric Measurement of Illumination, 2007. In: Sensors, Cameras, and Systems for Scientific/Industrial Applications VIII / [ed] Morley M. Blouke, Bellingham, Washington/Springfield, Virginia, USA: SPIE (The International Society for Optical Engineering) & IS&T (The Society for Imaging Science and Technology), 2007, p. 65010E-1-65010E-10. Conference paper (Refereed)
    Abstract [en]

    We describe the design and implementation of a high dynamic range (HDR) imaging system capable of capturing RGB color images with a dynamic range of 10,000,000:1 at 25 frames per second. We use a highly programmable camera unit with high throughput A/D conversion, data processing and data output. HDR acquisition is performed by multiple exposures in a continuous rolling shutter progression over the sensor. All the different exposures for one particular row of pixels are acquired head to tail within the frame time, which means that the time disparity between exposures is minimal, the entire frame time can be used for light integration, and the longest exposure is almost the entire frame time. The system is highly configurable, and trade-offs are possible between dynamic range, precision, number of exposures, image resolution and frame rate.
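
    As a rough illustration of the multi-exposure merge described above, the following sketch fuses the readings of one pixel from several exposure times into a single radiance estimate. The weighting scheme (exposure time as weight, saturated and noise-dominated samples discarded) is a common choice rather than necessarily the exact one used by the system; the thresholds and the fallback are assumptions.

```python
import numpy as np

def fuse_exposures(values, times, noise_floor=0.05, saturation=0.95):
    """Merge raw readings of one pixel, normalized to [0, 1], from several
    exposure times into a single radiance estimate:
        E = sum_i w_i * v_i / t_i  /  sum_i w_i.
    Longer exposures get higher weight; saturated and noise-dominated
    samples are discarded."""
    v = np.asarray(values, dtype=float)
    t = np.asarray(times, dtype=float)
    w = ((v > noise_floor) & (v < saturation)) * t
    if w.sum() == 0.0:                      # every sample unusable
        return float(v.max() / t.min())     # crude saturation fallback
    return float((w * v / t).sum() / w.sum())

# Three rolling-shutter exposures of one pixel, shortest to longest:
radiance = fuse_exposures([0.01, 0.30, 0.98], [0.001, 0.010, 0.030])
```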

  • 53.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Gustavson, Stefan
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Ynnerman, Anders
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Spatially Varying Image Based Lighting by Light Probe Sequences, Capture, Processing and Rendering, 2007. In: The Visual Computer, ISSN 0178-2789, E-ISSN 1432-2315, Vol. 23, no 7, p. 453-465. Article in journal (Refereed)
    Abstract [en]

    We present a novel technique for capturing spatially or temporally resolved light probe sequences, and using them for image based lighting. For this purpose we have designed and built a real-time light probe, a catadioptric imaging system that can capture the full dynamic range of the lighting incident at each point in space at video frame rates, while being moved through a scene. The real-time light probe uses a digital imaging system which we have programmed to capture high-quality, photometrically accurate color images of 512×512 pixels with a dynamic range of 10,000,000:1 at 25 frames per second.

    By tracking the position and orientation of the light probe, it is possible to transform each light probe into a common frame of reference in world coordinates, and map each point and direction in space along the path of motion to a particular frame and pixel in the light probe sequence. We demonstrate our technique by rendering synthetic objects illuminated by complex real world lighting, first by using traditional image based lighting methods and temporally varying light probe illumination, and second by an extension to handle spatially varying lighting conditions across large objects and object motion along an extended path.
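
    The (point, direction) to (frame, pixel) lookup described above can be sketched as follows, using Debevec's angular-map parameterization as a stand-in for the calibrated mapping of the actual catadioptric optics; the names and the nearest-frame selection are illustrative assumptions only.

```python
import numpy as np

def direction_to_pixel(d, size):
    """Map a world-space direction to integer (u, v) pixel coordinates in a
    square light-probe image, using the angular-map parameterization as a
    stand-in for the real system's calibrated optical mapping."""
    dx, dy, dz = d / np.linalg.norm(d)
    r = np.arccos(np.clip(dz, -1.0, 1.0)) / (np.pi * max(np.hypot(dx, dy), 1e-9))
    u = (dx * r + 1.0) * 0.5 * (size - 1)
    v = (dy * r + 1.0) * 0.5 * (size - 1)
    return int(round(u)), int(round(v))

def lookup_radiance(point, direction, probe_positions, probe_images):
    """Map a (point, direction) query to a frame and pixel in the light
    probe sequence and return the stored radiance sample."""
    frame = int(np.argmin(np.linalg.norm(probe_positions - point, axis=1)))
    u, v = direction_to_pixel(direction, probe_images[frame].shape[0])
    return probe_images[frame][v, u]
```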

  • 54.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
    Hajisharif, Saghi
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
    Kronander, Joel
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, Faculty of Science & Engineering.
    Unified reconstruction of RAW HDR video data, 2016. In: High dynamic range video: from acquisition to display and applications / [ed] Frédéric Dufaux, Patrick Le Callet, Rafal K. Mantiuk, Marta Mrak, London, United Kingdom: Academic Press, 2016, 1st ed., p. 63-82. Chapter in book (Other academic)
    Abstract [en]

    Traditional HDR capture has mostly relied on merging images captured with different exposure times. While this works well for static scenes, dynamic scenes pose difficult challenges, as registration of differently exposed images often leads to ghosting and other artifacts. This chapter reviews methods which capture HDR-video frames within a single exposure time, using either multiple synchronised sensors or multiplexing of the sensor response spatially across the sensor. Most previous HDR reconstruction methods perform demosaicing, noise reduction, resampling (registration), and HDR-fusion in separate steps. This chapter presents a framework for unified HDR-reconstruction, including all steps in the traditional imaging pipeline in a single adaptive filtering operation, and describes an image formation model and a sensor noise model applicable to both single- and multi-sensor systems. The benefits of using raw data directly are demonstrated with examples using input data from multiple synchronized sensors, and single images with varying per-pixel gain.
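
    A minimal sketch of the unified idea, reduced to one pixel's raw samples under a simple Poisson-plus-Gaussian noise model: each sample gives a radiance estimate, weighted by the inverse of its modelled variance. The chapter's actual method is a spatially adaptive filter that also handles demosaicing and resampling; the noise constants, the 12-bit saturation threshold, and the function name below are assumptions.

```python
import numpy as np

def ml_radiance(y, t, g, sigma_read=1.0, full_well=2**12):
    """Maximum-likelihood-style radiance estimate from raw samples y_i of
    one pixel, captured with exposure times t_i and gains g_i, weighting
    each per-sample estimate by its inverse modelled variance."""
    y, t, g = (np.asarray(a, dtype=float) for a in (y, t, g))
    x = y / (g * t)                                 # per-sample estimates
    var = (g * y + sigma_read**2) / (g * t) ** 2    # noise-model variance
    w = 1.0 / np.maximum(var, 1e-12)
    w[y > 0.95 * full_well] = 0.0                   # drop saturated samples
    if w.sum() == 0.0:
        return float(x.min())                       # all saturated: lower bound
    return float((w * x).sum() / w.sum())

# One pixel observed with spatially multiplexed per-pixel gain (two gains):
x_hat = ml_radiance(y=[120.0, 1800.0], t=[0.01, 0.01], g=[1.0, 16.0])
```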

  • 55.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Kronander, Joel
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Larsson, Per
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Gustavson, Stefan
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Löw, Joakim
    Linköping University, Department of Science and Technology. Linköping University, The Institute of Technology.
    Ynnerman, Anders
    Linköping University, Center for Medical Image Science and Visualization (CMIV). Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Spatially varying image based lighting using HDR-video, 2013. In: Computers & Graphics, ISSN 0097-8493, E-ISSN 1873-7684, Vol. 37, no 7, p. 923-934. Article in journal (Refereed)
    Abstract [en]

    Illumination is one of the key components in the creation of realistic renderings of scenes containing virtual objects. In this paper, we present a set of novel algorithms and data structures for visualization, processing and rendering with real world lighting conditions captured using High Dynamic Range (HDR) video. The presented algorithms enable rapid construction of general and editable representations of the lighting environment, as well as extraction and fitting of sampled reflectance to parametric BRDF models. For efficient representation and rendering of the sampled lighting environment function, we consider an adaptive (2D/4D) data structure for storage of light field data on proxy geometry describing the scene. To demonstrate the usefulness of the algorithms, they are presented in the context of a fully integrated framework for spatially varying image based lighting. We show reconstructions of example scenes and resulting production quality renderings of virtual furniture with spatially varying real world illumination including occlusions.
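
    As a toy example of the reflectance-fitting step mentioned above, the sketch below fits a single Lambertian albedo to sampled outgoing radiance by least squares, assuming L_out = (rho/pi) * E. The paper fits richer parametric BRDF models to its sampled data; this only illustrates the idea, and the names are hypothetical.

```python
import numpy as np

def fit_lambertian_albedo(radiance, irradiance):
    """Least-squares fit of a diffuse albedo rho to reflectance samples,
    under the Lambertian model L_out = (rho / pi) * E:
        rho = pi * sum(E_i * L_i) / sum(E_i^2)."""
    L = np.asarray(radiance, dtype=float)
    E = np.asarray(irradiance, dtype=float)
    return float(np.pi * (E * L).sum() / (E * E).sum())

# Samples of outgoing radiance and the irradiance that produced them:
rho = fit_lambertian_albedo([0.10, 0.21, 0.30], [0.5, 1.0, 1.5])
```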

  • 56.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Kronander, Joel
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Larsson, Per
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Gustavson, Stefan
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Ynnerman, Anders
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Temporally and Spatially Varying Image Based Lighting using HDR-video, 2013. In: Proceedings of the 21st European Signal Processing Conference (EUSIPCO), 2013: Special Session on HDR-video, IEEE, 2013, p. 1-5. Conference paper (Refereed)
    Abstract [en]

    In this paper we present novel algorithms and data structures for capturing, processing and rendering with real world lighting conditions based on high dynamic range video sequences. Based on the captured HDR video data we show how traditional image based lighting can be extended to include illumination variations in both the temporal as well as the spatial domain. This enables highly realistic renderings where traditional IBL techniques using a single light probe fail to capture important details in the real world lighting environment. To demonstrate the usefulness of our approach, we show examples of both off-line and real-time rendering applications.
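
    As a minimal illustration of the temporal extension, the sketch below blends the two captured HDR probe frames that bracket the render time, so the environment lighting follows the captured sequence. The linear blend and all names are assumptions of this sketch, not the paper's algorithm.

```python
import numpy as np

def probe_for_time(t, timestamps, frames):
    """Blend the two HDR probe frames bracketing render time t, so the
    environment lighting varies smoothly over the animation."""
    timestamps = np.asarray(timestamps, dtype=float)
    i = min(max(int(np.searchsorted(timestamps, t)), 1), len(timestamps) - 1)
    a = (t - timestamps[i - 1]) / (timestamps[i] - timestamps[i - 1])
    a = min(max(a, 0.0), 1.0)
    return (1.0 - a) * frames[i - 1] + a * frames[i]

# Two tiny probe frames captured at t = 0.0 s and t = 0.04 s:
frames = [np.zeros((4, 4, 3)), np.ones((4, 4, 3))]
env = probe_for_time(0.02, timestamps=[0.0, 0.04], frames=frames)
```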

  • 57.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Kronander, Joel
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Larsson, Per
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Gustavson, Stefan
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Ynnerman, Anders
    Linköping University, Center for Medical Image Science and Visualization (CMIV). Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Image Based Lighting using HDR-video, 2013. In: Eurographics 24th Symposium on Rendering: Posters, 2013. Conference paper (Other academic)
    Abstract [en]

    It has been widely recognized that lighting plays a key role in the realism and visual interest of computer graphics renderings. This has led to research and development of image based lighting (IBL) techniques where the illumination conditions in real world scenes are captured as high dynamic range (HDR) image panoramas and used as lighting information during rendering. Traditional IBL, where the lighting is captured at a single position in the scene, has now become a widely used tool in most production pipelines. In this poster, we give an overview of a system pipeline where we use HDR-video cameras to extend traditional IBL techniques to capture real world lighting that may include variations in the spatial or temporal domains. We also describe how the capture systems and algorithms for processing and rendering have been incorporated into a robust systems pipeline for production of highly realistic renderings. High dynamic range video based scene capture thus enables highly realistic renderings where traditional image based lighting, using a single light probe, fails to capture important details.

  • 58.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Ropinski, Timo
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Proceedings of SIGRAD 2013, Visual Computing, June 13-14, 2013, Norrköping, Sweden, 2013. Conference proceedings (editor) (Refereed)
    Abstract [en]

    SIGRAD 2013

    We are happy to announce the 12th SIGRAD Conference Proceedings. SIGRAD 2013 will be held in Norrköping, Sweden, on June 13 and 14, 2013. SIGRAD 2013 focuses on visual computing, and solicits the submission of original research papers that advance the state-of-the-art of one of the subareas of visual computing, ranging from computer graphics and visualization to human-computer interaction.

    SIGRAD 2013 is the premier Nordic forum for computer graphics and visualization advances for academia and industry. This annual event brings together researchers and practitioners with interest in techniques, tools, and technology from various fields such as computer graphics, visualization, visual analytics, or human-computer interaction. Each paper in this conference proceedings was peer-reviewed by at least three reviewers from the international program committee consisting of 26 experts listed below. Based on this set of reviews, the conference co-chairs accepted 9 papers in total and compiled the final program.

    Jonas Unger and Timo Ropinski

  • 59.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Wenger, Andreas
    University of Southern California Institute for Creative Technologies, United States.
    Hawkins, Tim
    University of Southern California Institute for Creative Technologies, United States.
    Gardner, Andrew
    University of Southern California Institute for Creative Technologies, United States.
    Debevec, Paul
    University of Southern California Institute for Creative Technologies, United States.
    Capturing and Rendering with Incident Light Fields, 2003. In: EGSR'03, The 14th Eurographics Symposium on Rendering 2003, Leuven, Belgium, 2003. Conference paper (Refereed)
    Abstract [en]

    This paper presents a process for capturing spatially and directionally varying illumination from a real-world scene and using this lighting to illuminate computer-generated objects. We use two devices for capturing such illumination. In the first we photograph an array of mirrored spheres in high dynamic range to capture the spatially varying illumination. In the second, we obtain higher resolution data by capturing images with a high dynamic range omnidirectional camera as it traverses across a plane. For both methods we apply the light field technique to extrapolate the incident illumination to a volume. We render computer-generated objects as illuminated by this captured illumination using a custom shader within an existing global illumination rendering system. To demonstrate our technique we capture several spatially-varying lighting environments with spotlights, shadows, and dappled lighting and use them to illuminate synthetic scenes. We also show comparisons to real objects under the same illumination.
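
    The two-plane parameterization commonly used for light fields can illustrate how a ray query into the captured illumination volume might look; the plane placement and names below are assumptions of this sketch, not the paper's exact scheme.

```python
import numpy as np

def ray_to_st_uv(origin, direction, z_s=0.0, z_u=1.0):
    """Intersect a ray with the two parallel planes z = z_s and z = z_u of a
    two-plane light field parameterization, returning the (s, t, u, v)
    coordinates used to look up incident radiance. Assumes the ray is not
    parallel to the planes (direction[2] != 0)."""
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    s, t = (o + d * (z_s - o[2]) / d[2])[:2]
    u, v = (o + d * (z_u - o[2]) / d[2])[:2]
    return float(s), float(t), float(u), float(v)

# A ray through the captured volume, heading roughly in +z:
coords = ray_to_st_uv(origin=[0.2, 0.3, -0.5], direction=[0.0, 0.1, 1.0])
```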

  • 60.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Wrenninge, Magnus
    Linköping University, Department of Science and Technology. Linköping University, The Institute of Technology.
    Ollila, Mark
    Linköping University, Department of Science and Technology. Linköping University, The Institute of Technology.
    Real-time Image Based Lighting in Software using HDR Panoramas, 2003. In: Proceedings of the 1st International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, 2003. Conference paper (Refereed)
    Abstract [en]

    We present a system allowing real-time image based lighting based on HDR panoramic images [Debevec 1998]. The system performs time-consuming diffuse light calculations in a pre-processing step, which is key to attaining interactivity. The real-time subsystem processes an image based lighting model in software, which would be simple to implement in hardware. Rendering is handled by OpenGL, but could be substituted for another graphics API, should there be such a need. Applications for the technique presented are discussed, and include methods for realistic outdoor lighting. The system architecture is outlined, describing the algorithms used. Lastly, the ideas for future work that arose during the project are discussed.
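
    The pre-processing step mentioned above can be illustrated by a cosine-lobe convolution of the HDR panorama: for each surface normal, incoming radiance is integrated against the clamped cosine, and the results are tabulated for cheap lookup at render time. A minimal sketch with hypothetical names, not the paper's exact implementation:

```python
import numpy as np

def diffuse_irradiance(env_dirs, env_radiance, solid_angles, normal):
    """Precompute diffuse irradiance for one surface normal by integrating
    the HDR panorama against the clamped cosine lobe:
        E(n) = sum_i L_i * max(0, n . d_i) * dOmega_i."""
    w = np.clip(env_dirs @ normal, 0.0, None) * solid_angles
    return (env_radiance * w[:, None]).sum(axis=0)

# Toy panorama: two directional samples with RGB radiance and solid angles.
dirs = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
L = np.array([[1.0, 0.9, 0.8], [0.2, 0.3, 0.5]])
omega = np.array([0.1, 0.1])
E = diffuse_irradiance(dirs, L, omega, normal=np.array([0.0, 0.0, 1.0]))
```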

  • 61.
    Unger, Jonas
    et al.
    Linköping University, Department of Science and Technology, Visual Information Technology and Applications (VITA). Linköping University, The Institute of Technology.
    Wrenninge, Magnus
    Linköping University, Department of Science and Technology. Linköping University, The Institute of Technology.
    Wänström, Filip
    Linköping University, Department of Science and Technology. Linköping University, The Institute of Technology.
    Ollila, Mark
    Linköping University, Department of Science and Technology. Linköping University, The Institute of Technology.
    Implementation of a Real-time Image Based Lighting in Software Using HDR Panoramas, 2002. In: Proceedings of SIGRAD'02, 2002. Conference paper (Refereed)
  • 62.
    Wenger, Andreas
    et al.
    University of Southern California Institute for Creative Technologies.
    Gardner, Andrew
    University of Southern California Institute for Creative Technologies.
    Tchou, Chris
    University of Southern California Institute for Creative Technologies.
    Unger, Jonas
    University of Southern California.
    Hawkins, Tim
    University of Southern California Institute for Creative Technologies.
    Debevec, Paul
    University of Southern California Institute for Creative Technologies.
    Performance Relighting and Reflectance Transformation with Time-Multiplexed Illumination, 2005. In: ACM Transactions on Graphics, ISSN 0730-0301, E-ISSN 1557-7368, Vol. 24, no 3. Article in journal (Refereed)
    Abstract [en]

    We present a technique for capturing an actor’s live-action performance in such a way that the lighting and reflectance of the actor can be designed and modified in postproduction. Our approach is to illuminate the subject with a sequence of time-multiplexed basis lighting conditions, and to record these conditions with a high-speed video camera so that many conditions are recorded in the span of the desired output frame interval. We investigate several lighting bases for representing the sphere of incident illumination using a set of discrete LED light sources, and we estimate and compensate for subject motion using optical flow and image warping based on a set of tracking frames inserted into the lighting basis. To composite the illuminated performance into a new background, we include a time-multiplexed matte within the basis. We also show that the acquired data enables time-varying surface normals, albedo, and ambient occlusion to be estimated, which can be used to transform the actor’s reflectance to produce both subtle and stylistic effects.
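
    Because light transport is linear in the illumination, relighting from a time-multiplexed basis reduces to a weighted sum of the basis images. The sketch below illustrates this core step only, omitting the paper's motion compensation and matting; array shapes and names are assumptions.

```python
import numpy as np

def relight(basis_images, target_weights):
    """Relight one performance frame: the image under any new lighting is a
    weighted sum of the images captured under the time-multiplexed basis
    conditions.
    basis_images: (n, H, W, 3); target_weights: (n,) basis intensities."""
    B = np.asarray(basis_images, dtype=float)
    w = np.asarray(target_weights, dtype=float)
    return np.tensordot(w, B, axes=1)   # -> (H, W, 3)

# Two basis conditions for a tiny 2x2 frame, relit with weights 0.7 and 0.3:
basis = np.random.rand(2, 2, 2, 3)
frame = relight(basis, [0.7, 0.3])
```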
