Virtual Environments with Light Fields and Lumigraphs CS 497 16 April 2002 Presented by Mike Pilat


Page 1

Virtual Environments with Light Fields and Lumigraphs
CS 497
16 April 2002
Presented by Mike Pilat

Page 2

Three Papers

Light Field Rendering (Levoy and Hanrahan, SIGGRAPH 1996)

The Lumigraph (Gortler, Grzeszczuk, Szeliski, and Cohen, SIGGRAPH 1996)

Dynamically Reparameterized Light Fields (Isaksen, McMillan, and Gortler, SIGGRAPH 2000)

Page 3

Introduction

Traditional 3D scenes are composed of geometric data and lights.

Rendering new views of such scenes can be costly.

Solution: image-based rendering. Use a set of pre-acquired images to compute new views of the scene.

Page 4

Image-Based Rendering

IBR algorithms require little computation; suitable for PCs and workstations

Cost of interactive viewing is independent of scene complexity

Image sources: real photographs, rendered images, or both!

Page 5

A long, long time ago…

Environment maps
1976; Blinn: “Texture and Reflection in Computer Generated Images”
1986; Greene: “Environment Mapping and Other Applications of World Projections”
They work the other way around too
1995; Chen: “QuickTime VR”
Major limitation: fixed viewpoint

Page 6

… In a galaxy far, far away

View Interpolation
Many people: Chen, Greene, Fuchs, McMillan, Narayanan
Not always practical: requires a depth value for each pixel in the environment maps, and gaps created by occlusions must be filled

Image Interpolation
Laveau, McMillan, Seitz
Need to find correspondences in two images: the stereo vision problem, which is hard
Correspondences don’t always exist

Page 7

Light Fields

Page 8

Light Fields

Radiance at a point, in a particular direction
Term coined by Gershun: 1936; Gershun: “The Light Field”

Equivalent to the plenoptic function
1991; Adelson and Bergen: “The Plenoptic Function and the Elements of Early Vision”

5D light fields
1995; McMillan and Bishop: “Plenoptic Modeling: An Image-Based Rendering System”
Panoramic images in 3D space

Page 9

Light Fields

Can reduce the 5D representation to 4D in free space (no occluders): radiance does not change along a line unless blocked.

A redundant (i.e. 5D) representation is bad: it increases dataset size and complicates reconstruction of radiance samples.

Can interpret 4D light fields as “functions on the space of oriented lines.” This is the representation of Levoy and Hanrahan, who choose to parameterize lines by their intersections with two planes in arbitrary positions.

“Light slab”: restrict the parameters to [0,1].
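The two-plane “light slab” parameterization can be sketched in a few lines. This is a hypothetical illustration rather than the authors’ code; it assumes axis-aligned planes, with the uv-plane at z = 0 and the st-plane at z = 1.

```python
# Hypothetical sketch: parameterize a ray by its intersections with two
# parallel planes, the uv-plane at z = 0 and the st-plane at z = 1 (a "light slab").
def slab_coords(origin, direction):
    """Return (u, v, s, t) for a ray origin + k*direction, assuming dz != 0."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    ku = (0.0 - oz) / dz          # ray parameter where the ray hits z = 0
    ks = (1.0 - oz) / dz          # ray parameter where the ray hits z = 1
    u, v = ox + ku * dx, oy + ku * dy
    s, t = ox + ks * dx, oy + ks * dy
    return u, v, s, t
```

Any line not parallel to the planes maps to a unique 4-tuple, which is why a slab covers only a finite subset of line space.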

Page 10

Parameterization

Want the parameterization to have three qualities:
Efficient calculation: want to use only the view transform and pixel location
Control over the set of lines: only need a finite subset of line space
Uniform sampling: the number of lines in the intervals between samples is uniform

Can generate all views from one light slab, but only if the set of lines includes all lines intersecting the convex hull of the object. Not possible, so multiple slabs must be used.

Page 11

Huh?

Page 12

Creating Light Fields from Rendered Images

Render a 2D array of images

Each image is slice of 4D light slab

Place center of projection of camera at locations on uv-plane

Need to perform sheared perspective projection to force xy samples of the image to correspond exactly with the st samples
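The sheared perspective projection above can be illustrated with a hypothetical pinhole-camera sketch (not the paper’s code): every camera sits at a different (u, v) grid point on the uv-plane at z = 0, but images the same st window on the plane z = 1, so pixel centers coincide with st samples. The window bounds `st_min`/`st_max` are illustrative assumptions.

```python
# Hypothetical sketch of a sheared (off-axis) perspective camera: the center of
# projection sits at (u, v) on the uv-plane (z = 0), but every camera images the
# SAME st window on the plane z = 1, so image pixels line up with st samples.
def ray_through_pixel(u, v, px, py, res, st_min=-1.0, st_max=1.0):
    """Ray from camera (u, v, 0) through pixel (px, py) of a res x res image."""
    s = st_min + (st_max - st_min) * (px + 0.5) / res   # st sample for this pixel
    t = st_min + (st_max - st_min) * (py + 0.5) / res
    origin = (u, v, 0.0)
    direction = (s - u, t - v, 1.0)                     # toward (s, t, 1)
    return origin, direction
```

Moving the camera along the uv-plane shears the ray directions but leaves the st sample positions fixed, which is exactly the correspondence the slab needs.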

Page 13

Creating Light Fields from Digitized Images

Many problems:
Need hundreds or thousands of images
Need to control lighting (must be static)
Must manage optical constraints: angle of view, focal distance, depth of field, aperture

Viewing:
Inward looking: flying around an object
Outward looking: looking out from an object (e.g., inside a house)

Page 14

Creating Light Fields from Digitized Images

Computer-controlled acquisition

Spherical motion: easier to cover the entire range of viewing directions; sampling rate is more uniform

Planar motion: easier to build; closer to the light slab representation

Constructed a planar gantry: x,y movement plus pitch and yaw

Page 15

Looking at a Light Field

Page 16

Compression

Desirable properties:

Data redundancy: remove redundant info without affecting content; redundancy exists in all four dimensions

Random access: samples referenced during rendering are scattered, and the access pattern changes during movement

Asymmetry: assume light fields are pre-computed; a long encoding time is OK if decoding is fast

Computational efficiency: can’t consume full CPU power; CPU time is also needed for the display algorithm

Utilize a two-stage compression scheme

Page 17

Compression

Vector quantization (24:1)
A vector of samples is quantized to a predetermined reproduction vector
A reproduction vector is called a “codeword”
A “codebook” of reproduction vectors is used to encode the data

Entropy coding (Lempel-Ziv) (5:1)
Objects are usually imaged on a constant-intensity background
Many vectors produced by VQ occur with high probability
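The vector-quantization stage can be sketched in miniature. This is a hypothetical illustration of the general VQ idea, not the paper’s implementation (which uses trained codebooks over tiles of light-field samples):

```python
# Hypothetical sketch of vector quantization: each fixed-size vector of samples
# is replaced by the index of its nearest codeword in a shared codebook.
def quantize(vectors, codebook):
    def nearest(vec):
        return min(range(len(codebook)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(vec, codebook[i])))
    return [nearest(v) for v in vectors]

def dequantize(indices, codebook):
    # Decoding is just a table lookup, which is why VQ decode is so fast.
    return [codebook[i] for i in indices]
```

Because many index values repeat (e.g., the constant background), the index stream compresses well again under entropy coding, giving the combined 120:1 ratio.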

Page 18

Compression

Two-stage decompression:

Decode the entropy coding while loading the file; end up with the codebook and code indices packed into 16-bit words.

De-quantization: the engine requests samples of the light field; a subscript index is calculated and looked up in the codebook; this yields a vector of sample values, which is subscripted again for the requested sample.

Example:
Digitization: 4 slabs, 32x16 images/slab, 256x256 pixels/image, 402 MB total
Compression: VQ coding 24:1, entropy coding 5:1, 120:1 total, 3.4 MB compressed

Page 19

Displaying a Light Field

Step 1: Compute the (u,v,s,t) parameters for each image ray

Step 2: Resample the radiance at those line parameters

Page 20

Displaying a Light Field: Step 1

Simple projective map from image coordinates to (u,v) and (s,t)

Can use texture mapping to compute line coordinates: interpolate homogeneous (uw, vw, w), then u = uw/w, v = vw/w

Inverse map to (u,v,s,t) needs only two texture-coordinate calculations per ray

Can be hardware accelerated!

Page 21

Displaying a Light Field: Step 2

Reconstruct the radiance function from the original samples; approximate by interpolating from the nearest samples

Prefilter (4D mip map) if the image is small

Quadrilinear interpolation of the full 4D function works best

Apply a band-pass filter to remove high-frequency noise that may cause aliasing
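The quadrilinear interpolation step can be sketched directly: blend the 16 light-field samples that surround a continuous (u, v, s, t) location, one linear weight per axis. A hypothetical illustration, assuming the discrete samples are stored in a dict keyed by integer 4-tuples:

```python
# Hypothetical sketch of quadrilinear interpolation: blend the 16 light-field
# samples surrounding a continuous (u, v, s, t) location, one lerp per axis.
import itertools
import math

def quadrilinear(field, u, v, s, t):
    """field maps integer (u, v, s, t) tuples to radiance values."""
    coords = (u, v, s, t)
    base = [math.floor(c) for c in coords]
    frac = [c - b for c, b in zip(coords, base)]
    value = 0.0
    for corner in itertools.product((0, 1), repeat=4):
        weight = 1.0
        for f, bit in zip(frac, corner):
            weight *= f if bit else (1.0 - f)   # per-axis linear weight
        value += weight * field[tuple(b + o for b, o in zip(base, corner))]
    return value
```

Nearest-sample lookup is the cheap fallback; the 16-tap quadrilinear blend is what the paper reports as giving the best reconstruction quality.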

Page 22

Results

Page 23

Results

Page 24

Limitations

Need high sampling density to prevent excessive blurriness, so a large number of images is needed. Thanks to coherence, compression is good.

Observer restricted to free space. Can be addressed by stitching together multiple light fields based on a geometric partition.

Requires fixed illumination. If interreflections are ignored, this can be addressed by augmenting the light field with surface normals and optical properties.

Page 25

Future Work

Abstract light representations have not received the same systematic study as other methods of light representation; re-examine from first principles

Design better instrumentation for image acquisition, e.g., a parallel array of cameras

Representing light interactions in high-dimensional matrices, and how to compress them

Page 26

The Lumigraph

Page 27

Lumigraphs

Limit interest to light leaving the convex hull of a bounded object

Only need the values of the plenoptic function on a surrounding surface

A cube is chosen for computational simplicity

Page 28

Parameterized Déjà Vu?

Page 29

Parameterization

Use the same two-plane method as Levoy and Hanrahan

Discrete subdivision of the function space: M in st, N in uv

Basis projection with basis functions B and their duals B̃:

L̃(s,t,u,v) = Σ_{i=0}^{M} Σ_{j=0}^{M} Σ_{p=0}^{N} Σ_{q=0}^{N} x_{i,j,p,q} B_{i,j,p,q}(s,t,u,v)

x_{i,j,p,q} = <L, B̃_{i,j,p,q}>

Page 30

Using Geometric Information

Use geometric information to learn about the coherence of the Lumigraph function

Modify the basis functions to weight for that coherence, using the intersection depth z:

u′ = u + (s − s_i) (1 − z) / z

B′_{i,j,p,q}(s,t,u,v) = B_{i,j,p,q}(s,t,u′,v′)
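The depth-corrected coordinate above can be sketched as a one-liner. A hypothetical illustration under assumed conventions (st-plane at depth 0, uv-plane at depth 1, z the depth of the ray's surface intersection): the u coordinate of a sample is shifted in proportion to how far s sits from the grid point s_i, so samples from nearby cameras line up on the surface.

```python
# Hypothetical sketch of Lumigraph depth correction: shift the u coordinate of a
# sample toward the one the same surface point would have from grid point s_i,
# assuming planes at depth 0 (st) and 1 (uv) and intersection depth z.
def depth_corrected_u(u, s, s_i, z):
    return u + (s - s_i) * (1.0 - z) / z
```

When z = 1 (the surface lies on the uv-plane) no correction is applied; the farther the surface is from that plane, the larger the shift.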

Page 31

Creating a Lumigraph

Page 32

Capturing from Synthetic Scenes

Capture a single sample per Lumigraph coefficient at each grid point

Place the pinhole of the camera at the grid point, pointed down the z axis; pixel values (p,q) are used as the coefficients x_{i,j,p,q}

To integrate against the kernel B̃, shoot multiple rays, jittering the camera and pixel locations, and weight each image using B̃

Page 33

Capturing from Real Images

Use the motion-control platform from Levoy and Hanrahan to capture images at grid points

Would rather use a hand-held camera: place calibration marks in the scene

Obtain a silhouette through blue-screening to get a rough geometric shape; use silhouettes to build a volumetric model

Guide the user interactively in selecting new positions

Page 34

Calibrating for Real Images

Page 35

Rebinning

“The coefficient associated with the basis function B is defined as the integral of the continuous Lumigraph function multiplied by some kernel function B̃”:

x_{i,j,p,q} = ∫∫∫∫ L(s,t,u,v) B̃_{i,j,p,q}(s,t,u,v) ds dt du dv

Three-step approach developed: splat, pull, push

Page 36

Splat, Pull, Push

Splatting estimates the integral: coefficients are computed with Monte Carlo integration

Pulling: coefficients are computed for a hierarchical set of basis functions by linearly summing together the higher-resolution kernels

Pushing: information from lower-resolution grids is “pushed” up to higher-resolution ones to fill in gaps (upsample and convolve)
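The pull-push idea can be sketched in 1D. This is a hypothetical miniature, not the paper's algorithm (which operates on 4D coefficient grids with specific kernels): "pull" builds coarser levels by confidence-weighted averaging, and "push" fills low-confidence samples from the coarser level.

```python
# Hypothetical 1-D sketch of pull-push gap filling. Each sample carries a
# confidence weight in [0, 1]; weight 0 marks a hole.
def pull(values, weights):
    """Build one coarser level by weighted averaging of sample pairs."""
    coarse_v, coarse_w = [], []
    for i in range(0, len(values) - 1, 2):
        w = min(weights[i] + weights[i + 1], 1.0)   # saturate confidence
        v = 0.0
        if weights[i] + weights[i + 1] > 0:
            v = (values[i] * weights[i] + values[i + 1] * weights[i + 1]) / (
                weights[i] + weights[i + 1])
        coarse_v.append(v)
        coarse_w.append(w)
    return coarse_v, coarse_w

def push(values, weights, coarse_v):
    """Fill low-confidence fine samples from the coarser level's estimate."""
    out = list(values)
    for i, w in enumerate(weights):
        if w < 1.0:                                  # blend in the coarse estimate
            out[i] = w * values[i] + (1.0 - w) * coarse_v[i // 2]
    return out
```

Holes inherit the value of their well-sampled neighbors after one pull/push round trip; the full algorithm recurses over a whole pyramid of levels.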

Page 37

Reconstruction

Ray tracing: generate a ray, calculate its (s,t,u,v) coordinates, and interpolate nearby grid points

Texture mapping: one texture is associated with each st grid point; project the pixel onto the uv-plane

Page 38

Results

Page 39

Results

Page 40

Dynamically Reparameterized Light Fields

Page 41

New Parameterizations

Interactive rendering: moderately sampled light fields with significant, unknown depth variation

Passive autostereoscopic viewing with a “fly’s-eye” lens array: the computation for synthesizing a new view comes directly from the optics of the display device

To achieve the desired flexibility, do not perform the aperture filtering from the original paper

Page 42

Light Slabs (Again)

The two-plane parameterization impacts the reconstruction filters

Aperture filtering removes high-frequency data, but if the plane is in the wrong place, only a blurred version of the scene is stored

Page 43

Focal Surface Parameterization

A 2D array of pinhole cameras, treated as a single optical system

Each data camera D_{s,t} can have its own internal orientation and parameters

A focal surface F is parameterized by (f,g)_F

Page 44

Ray Construction

Given a ray r, can find rays (s′,t′,u′,v′) and (s″,t″,u″,v″) in D_{s′,t′} and D_{s″,t″} that intersect at the same (f,g)_F

Combine the values with a filter; in practice, use more rays and weight the filter accordingly
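The ray construction above can be sketched for a planar focal surface. A hypothetical illustration, not the paper's code; it assumes data cameras at (s, t) on the plane z = 0, a shared image plane at z = 1, and a focal plane at depth z_f. The `cameras` list of ((s, t), weight) pairs stands in for the reconstruction filter.

```python
# Hypothetical sketch of dynamically reparameterized ray construction: intersect
# the desired ray with a planar focal surface at depth z_f, then ask each nearby
# data camera which of its rays passes through that same focal point (f, g).
def reconstruct(origin, direction, z_f, cameras, field):
    ox, oy, oz = origin
    dx, dy, dz = direction
    k = (z_f - oz) / dz
    f, g = ox + k * dx, oy + k * dy          # intersection with the focal plane
    total, wsum = 0.0, 0.0
    for (s, t), weight in cameras:           # nearby data cameras + filter weights
        # ray from camera (s, t, 0) through (f, g, z_f): where it crosses z = 1
        u, v = s + (f - s) / z_f, t + (g - t) / z_f
        total += weight * field(s, t, u, v)
        wsum += weight
    return total / wsum
```

Moving the focal plane (changing z_f) refocuses the reconstruction without touching the stored data, which is the "dynamic reparameterization" of the title.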

Page 45

Variable Aperture

A real camera aperture produces depth-of-field effects

Emulate depth-of-field effects by combining rays from several cameras

Combine samples from each D_{s,t} in the synthetic aperture to produce the image

Page 46

Variable Focus

Can also create variable-focus effects

Multiple focal surfaces: can use one or many simultaneously, with a similar ray-construction technique

Page 47

Boring Stuff

Ray-space analysis: examine the effects of dynamic reparameterization on light fields when viewed in ray space

Frequency-domain analysis: shearing can arbitrarily modify the relative sampling frequencies between dimensions in ray space

See paper

Page 48

Rendering

Ray tracing: trace rays as before

Texture mapping: if the desired camera, data cameras, and focal surface are all planar, texture-mapping techniques similar to the Lumigraph method can be applied

Page 49

Autostereoscopic Light Fields

Can think of the parameterization as an integral photograph, or the “fly’s-eye view”

Each lenslet is treated as a single pixel

Page 50

View-Dependent Color

Each lenslet can have view-dependent color

Draw a ray through the principal point of the lenslet; use the paraxial approximation to determine which color will be seen from a given direction

Page 51

Results

Page 52

Results

Page 53

Results

Page 54

Results

Page 55

Results

Page 56

Results

Page 57

Future Work

Develop method to exploit redundancy of light fields

Develop algorithm for automatically selecting the focal plane (like camcorders)

Promise for using reparameterization for depth-from-focus and depth-from-defocus problems

Page 58

Fin

Questions?

Thanks!