Image-based Rendering Concepts
Seminar on Computational Photography and Videography, WS 09/10
Stefan Densow, 18.11.2009
Presentation based on:
Light Field Rendering (1996) by Marc Levoy and Pat Hanrahan
The Plenoptic Function and the Elements of Early Vision (1991) by Edward H. Adelson and James R. Bergen
Rendering a 3D Scene
• 3D geometry, some materials, a set of lights
• Projection onto image plane yields 2D image
source: wikipedia.org
Rendering a 3D Scene — Issues
• Cost depends heavily on scene complexity, desired realism, and rendering technique
• Computationally expensive, algorithmically complex
• ‘Just’ viewing the scene may already be quite expensive
source: wikipedia.org
© Egerter Software, 1999
Low-Cost Viewing of a 3D Scene
• Idea: Interpolate new 2D view from a set of pre-acquired 2D images
• “Image-based Rendering”
• Independent of scene complexity, algorithmically / computationally cheap
• Requires exhaustive acquisition of views
• Yields highly redundant imagery, highly compressible
• Restricted to static scenes, with views free of obstruction
• Key advantage: simple and robust viewing of a complex scene
General Idea: Description of the Visual World
• Set of light rays passing through any point: pencil or radiant pyramid
• Plenoptic function returns radiance
• Describes everything that can be seen
• Parameterization in Computer Vision:
P(θ, φ, λ, t, Vx, Vy, Vz)
P(x, y, λ, t, Vx, Vy, Vz)
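The plenoptic function can be sketched as an ordinary function of its seven parameters. A minimal toy example in Python, where the scene model (a single distant directional light) and its direction are assumptions made up purely for illustration:

```python
import numpy as np

def plenoptic(theta, phi, lam, t, Vx, Vy, Vz):
    """Toy 7D plenoptic function P(theta, phi, lam, t, Vx, Vy, Vz):
    radiance seen from viewpoint (Vx, Vy, Vz) in direction (theta, phi),
    at wavelength lam and time t. Hypothetical scene: one distant
    directional light, so the result happens to be independent of
    position, wavelength, and time."""
    sun = np.array([0.0, 0.0, 1.0])               # assumed light direction
    d = np.array([np.sin(theta) * np.cos(phi),    # viewing direction from
                  np.sin(theta) * np.sin(phi),    # the spherical angles
                  np.cos(theta)])
    return max(0.0, float(d @ sun))               # clamped cosine radiance
```

Looking straight at the light, `plenoptic(0.0, 0.0, 550e-9, 0.0, 0, 0, 0)` returns 1.0; a real plenoptic function would of course encode the full visual world rather than a closed-form toy.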
From the Plenoptic Function to Light Field
• Reduction of Plenoptic function ⇒ 5D light field
• Static light field: static scene with fixed lighting
• Radiance is constant along a ray
• ⇒ reduction to 4D light field
• How to parameterize the rays?
L(x, y, Vx, Vy, Vz)
Parameterization of Light Fields
• Line specified by its points of intersection with two planes
• Efficient calculation using homogeneous coordinates
• Camera is at uv-plane, object is at st-plane (focal plane)
• Different positions (u, v) are different perspectives for interpolating new views
• The set of all rays that intersect both planes is called a light slab
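The two-plane parameterization can be sketched directly: intersect a ray with each plane and read off the in-plane coordinates. A minimal sketch, assuming the uv-plane lies at z = 0 and the st-plane at z = 1 (the plane placement is an illustrative assumption):

```python
import numpy as np

def ray_to_uvst(origin, direction, z_uv=0.0, z_st=1.0):
    """Map a ray to light-slab coordinates (u, v, s, t) by intersecting
    it with the uv-plane at z = z_uv and the st-plane at z = z_st."""
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    a_uv = (z_uv - o[2]) / d[2]        # ray parameter at the uv-plane
    a_st = (z_st - o[2]) / d[2]        # ray parameter at the st-plane
    u, v = (o + a_uv * d)[:2]          # in-plane coordinates on uv
    s, t = (o + a_st * d)[:2]          # in-plane coordinates on st
    return u, v, s, t
```

For example, a ray starting at (0, 0, −1) along direction (1, 0, 1) crosses the uv-plane at u = 1 and the st-plane at s = 2. The homogeneous-coordinate formulation mentioned above folds these divisions into a single matrix product.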
Light Slabs
[Figure: a light slab, rays parameterized by their (x, y) intersections with the uv-plane and the st-plane]
Light Slabs — Sampling Uniformity
Line Space
[Figure: slab sampling patterns shown in line space, each line parameterized by (r, θ)]
Light Slab Configurations
• Inward looking light slabs (90° view angle)
• Views from outside the convex hull of objects
• Orthographic views if uv-plane at infinity
• 360° view by 4 x 90° slabs
• Outward looking light slabs (90° view angle)
• View from the ‘inside’, e.g. for architecture
• st-plane at infinity
Creation of Light Fields
• Source: Virtual (rendered images), real (digitized images) or mixed
Visualizing Light Slabs
[Figure: two visualizations (a) and (b) of a light slab as 2D arrays of 2D slices in (u, v) and (s, t)]
Rendered Images — Sampling from the Scene
Rendered Images — Pixel Antialiasing
Rendered Images — Anisotropic Scene Points
Rendered Images — Aperture Antialiasing
Rendered Images — Effect of large Aperture
8×8 light slab (large aperture)
32×32 light slab (small aperture)
Rendered Images — 4D Low Pass Filter
• Pixel and aperture filters average in 2D over the st- and uv-planes
• Idea behind both filters: ‘gapless’ sampling and averaging to avoid aliasing
• Can only be approximated in practical implementations
• Not all high-frequency changes in the light field can be captured
• Their combination yields a 4D low-pass filter, always applied to rendered imagery
• Little blurring requires a high sampling density (with a correspondingly smaller aperture)
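The ‘gapless sampling and averaging’ idea can be approximated discretely by box-averaging over both planes at once. A minimal sketch, assuming the light field is stored as a 4D NumPy array (the storage layout and block size are assumptions for illustration):

```python
import numpy as np

def box_prefilter_4d(L, k=2):
    """Approximate the combined pixel + aperture (4D low-pass) filter by
    averaging k x k x k x k blocks of a discrete light field L[u, v, s, t].
    Dimensions not divisible by k are cropped."""
    nu, nv, ns, nt = (n // k for n in L.shape)
    cropped = L[:nu * k, :nv * k, :ns * k, :nt * k]
    # Split each axis into (blocks, k) and average over the k-axes at once
    return cropped.reshape(nu, k, nv, k, ns, k, nt, k).mean(axis=(1, 3, 5, 7))
```

A true implementation would prefilter during acquisition (rendering with a finite pixel footprint and aperture) rather than averaging after the fact; this sketch only illustrates the 4D averaging itself.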
Light Fields from Digitized Images — Setup
• Inward looking setup with computer-controlled planar camera gantry
• At each position, the camera is panned and tilted to aim at the center of the scene
• Requires re-projection of each image onto common focal plane
• Small aperture is used for large depth-of-field to avoid refocusing
• Small aperture gives little uv-antialiasing (visible as jumps in new views)
• Each light slab gives 90° of angular view
• A rotatable object tripod and light source allow 4 light slabs to yield a 360° view
Generating new Views from Light Fields
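Generating a new view amounts to casting one ray per pixel, mapping it to slab coordinates, and looking up the stored radiance. A ‘flatland’ sketch with a 2D light field L[u, s], assuming the uv-plane at z = 0 and the st-plane at z = 1 with unit-spaced samples (all of these conventions are illustrative assumptions):

```python
import numpy as np

def render_scanline(L, eye, s_coords):
    """Render one scanline of a new view from a flatland light field
    L[u_index, s_index]. For each pixel's s-coordinate on the st-plane
    (z = 1), cast a ray from `eye` = (x, z), find where it crosses the
    uv-plane (z = 0), and take the nearest stored sample."""
    ex, ez = eye
    out = []
    for s in s_coords:
        a = (0.0 - ez) / (1.0 - ez)                    # ray parameter at z = 0
        u = ex + a * (s - ex)                          # crossing point on uv-plane
        ui = int(np.clip(round(u), 0, L.shape[0] - 1))
        si = int(np.clip(round(s), 0, L.shape[1] - 1))
        out.append(L[ui, si])
    return np.array(out)
```

The full 4D case is the same idea with (u, v) and (s, t) pairs; moving `eye` changes which (u, v) perspectives contribute to each pixel.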
Sampling from Light Fields — Nearest Neighbor
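Nearest-neighbor sampling simply rounds the continuous (u, v, s, t) coordinates to the closest stored ray. A minimal sketch, assuming the light field is a 4D array on unit-spaced grids (an assumed layout):

```python
import numpy as np

def nearest_sample(L, u, v, s, t):
    """Return the radiance of the stored ray nearest to continuous
    coordinates (u, v, s, t) in a discrete 4D light field L[u, v, s, t].
    Coordinates are clamped to the array bounds."""
    idx = tuple(
        int(np.clip(round(c), 0, n - 1))
        for c, n in zip((u, v, s, t), L.shape)
    )
    return L[idx]
```

This is the cheapest lookup, but quantizing to the nearest sample is what produces the blocky artifacts that interpolation (next slide) smooths out.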
Sampling from Light Fields — Linear Interpolation
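Quadrilinear interpolation blends the 16 stored rays surrounding a continuous (u, v, s, t) sample, weighting each corner by its fractional distance along every axis. A minimal sketch over the same assumed 4D array layout:

```python
import numpy as np

def quadrilinear_sample(L, u, v, s, t):
    """Quadrilinearly interpolate a discrete 4D light field L[u, v, s, t]
    at continuous coordinates, blending the 16 neighboring samples."""
    coords = (u, v, s, t)
    base = [int(np.floor(c)) for c in coords]          # lower grid corner
    frac = [c - b for c, b in zip(coords, base)]       # fractional offsets
    value = 0.0
    for corner in range(16):                           # 2^4 surrounding samples
        weight, index = 1.0, []
        for dim in range(4):
            bit = (corner >> dim) & 1
            weight *= frac[dim] if bit else 1.0 - frac[dim]
            index.append(min(base[dim] + bit, L.shape[dim] - 1))
        value += weight * L[tuple(index)]
    return value
```

Interpolating in only one plane (bilinear in uv or st alone) is a cheaper middle ground; full quadrilinear interpolation costs 16 lookups per output sample but removes most of the nearest-neighbor blockiness.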
Viewing Light Fields — Demo
Demonstration
Light field viewer and light fields provided by the authors (source code available)
Summary
• Image-based rendering interpolates new 2D views of a 3D scene
• Based on restricted light fields that describe the radiance in a scene
• Imagery acquired in advance by rendering or digitizing
• Antialiasing required on pixel and aperture level
• Simple and robust interpolation of new views
• Low computational/algorithmic complexity, modest memory demands
Discussion