02 fall09 lecture sept18web

MIT Media Lab, Camera Culture
Ramesh Raskar
http://CameraCulture.info/
Computational Camera & Photography


TRANSCRIPT

Page 1: 02 Fall09 Lecture Sept18web

MIT Media Lab

Camera Culture

Ramesh Raskar

http://CameraCulture.info/

Computational Camera & Photography

Page 2: 02 Fall09 Lecture Sept18web

Where are the ‘cameras’?

Page 3: 02 Fall09 Lecture Sept18web
Page 4: 02 Fall09 Lecture Sept18web

Poll, Sept 18th 2009

• When will the DSCamera disappear? Why?

Like wristwatches?

Page 5: 02 Fall09 Lecture Sept18web

Taking Notes

• Use slides I post on the site

• Write down anecdotes and stories

• Try to get what is NOT on the slide

• Summarize questions and answers

• Take photos of demos + doodles on board

– Use laptop to take notes
– Send before next Monday

Page 6: 02 Fall09 Lecture Sept18web

Synthetic Lighting, Paul Haeberli, Jan 1992

Page 7: 02 Fall09 Lecture Sept18web

Homework

• Take multiple photos by changing lighting and other parameters. Be creative.

• Mix and match color channels to relight

• Due Sept 25th

• Submit on Stellar (via link):
– Commented source code
– Input images and output images PLUS intermediate results
– CREATE a webpage and send me a link

• Ok to use online software
• Update results on Flickr (group) page
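Haeberli's trick relies on the linearity of light transport: a photo captured with only light i on is a basis image, and any new lighting is a weighted sum of basis images. A minimal NumPy sketch of the idea (the image arrays and weights below are invented; the course itself suggests Matlab, but the math is identical):

```python
import numpy as np

def relight(basis_images, weights):
    """Haeberli-style synthetic relighting: each basis image is a photo
    taken with exactly one light on. Because light transport is linear,
    any combination of light intensities/colors is a weighted sum of
    the basis images."""
    out = np.zeros_like(basis_images[0], dtype=np.float64)
    for img, w in zip(basis_images, weights):
        out += np.asarray(w) * img  # w: scalar or per-channel RGB triple
    return np.clip(out, 0.0, 1.0)

# Two hypothetical single-light captures (float RGB in [0, 1])
rng = np.random.default_rng(0)
left = rng.random((4, 4, 3))
right = rng.random((4, 4, 3))

# Dim the left light; tint the right light warm
result = relight([left, right], [0.5, np.array([1.0, 0.8, 0.6])])
```

Mixing per-channel weights, as the assignment asks, is just passing an RGB triple instead of a scalar.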

Page 8: 02 Fall09 Lecture Sept18web

Debevec et al. 2002: ‘Light Stage 3’

Page 9: 02 Fall09 Lecture Sept18web

Image-Based Actual Re-lighting

Film the background in Milan,
measure incoming light,

light the actress in Los Angeles,

matte the background.

Matched LA and Milan lighting.

Debevec et al., SIGG2001

Page 10: 02 Fall09 Lecture Sept18web

Second Homework

• Extending Andrew Adams' Virtual Optical Bench

Page 11: 02 Fall09 Lecture Sept18web

Dual photography from diffuse reflections: Homework Assignment 2

the camera's view (Sen et al., SIGGRAPH 2005)

Page 12: 02 Fall09 Lecture Sept18web

Beyond Visible Spectrum

Cedip / RedShift

Page 13: 02 Fall09 Lecture Sept18web

• Brief introductions
• Are you a photographer?
• Do you use a camera for vision/image processing? Real-time processing?

• Do you have a background in optics/sensors?

• Name, Dept, Year, why you are here

• Are you on the mailing list? On Stellar?
• Did you get email from me?

Page 14: 02 Fall09 Lecture Sept18web

Mitsubishi Electric Research Laboratories Image Fusion for Context Enhancement Raskar, Ilie, Yu

Page 15: 02 Fall09 Lecture Sept18web


Dark Bldgs

Reflections on bldgs

Unknown shapes

Page 16: 02 Fall09 Lecture Sept18web


‘Well-lit’ Bldgs

Reflections in bldgs windows

Tree, Street shapes

Page 17: 02 Fall09 Lecture Sept18web

Background is captured from the day-time scene using the same fixed camera

Night Image

Day Image

Context Enhanced Image

Page 18: 02 Fall09 Lecture Sept18web


Mask is automatically computed from scene contrast
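The slide leaves the exact recipe open; one plausible sketch (not necessarily the authors' method) computes the mask from local variance as the contrast measure. The window size and threshold here are made-up parameters:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def contrast_mask(img, win=5, thresh=1e-3):
    """Mark pixels where the image has local contrast (variance) above a
    threshold. In day/night fusion, low-contrast night regions would take
    their content from the day image."""
    pad = win // 2
    p = np.pad(img, pad, mode='reflect')
    w = sliding_window_view(p, (win, win))      # (H, W, win, win) windows
    mu = w.mean(axis=(-1, -2))
    var = (w ** 2).mean(axis=(-1, -2)) - mu ** 2
    return (var > thresh).astype(np.float64)

# Toy night image: dark everywhere except one textured patch
night = np.zeros((8, 8))
night[2:6, 2:6] = np.linspace(0.0, 1.0, 16).reshape(4, 4)
mask = contrast_mask(night)   # 1 inside the textured region, 0 in flat dark areas
```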

Page 19: 02 Fall09 Lecture Sept18web


But, Simple Pixel Blending Creates Ugly Artifacts

Page 20: 02 Fall09 Lecture Sept18web

Pixel Blending

Page 21: 02 Fall09 Lecture Sept18web

Pixel Blending

Our Method: Integration of Blended Gradients
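The gradient-domain idea can be seen in a toy 1-D sketch. This simplifies the 2-D method, where the integration step becomes a Poisson solve; the signals, weights, and boundary choice below are invented for illustration:

```python
import numpy as np

def fuse_1d(a, b, w):
    """Gradient-domain fusion sketch in 1-D: blend the *gradients* of two
    signals with per-sample weight w in [0, 1], then integrate. In 1-D the
    integration is a cumulative sum anchored at a blended boundary value."""
    ga, gb = np.diff(a), np.diff(b)
    g = w * ga + (1.0 - w) * gb          # blended gradient field
    x0 = 0.5 * (a[0] + b[0])             # boundary condition (arbitrary choice)
    return np.concatenate([[x0], x0 + np.cumsum(g)])

a = np.array([0.0, 1.0, 2.0, 3.0])       # smooth ramp ("day")
b = np.array([0.0, 0.0, 4.0, 4.0])       # step edge ("night")
f = fuse_1d(a, b, np.array([1.0, 0.0, 1.0]))
# f follows a's gradient at the ends and b's step in the middle: [0, 1, 5, 6]
```

Blending gradients instead of pixel values avoids the seams and ghosting that direct pixel blending produces.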

Page 22: 02 Fall09 Lecture Sept18web


René Magritte, ‘The Empire of Light’

Surrealism

Page 23: 02 Fall09 Lecture Sept18web


Time-lapse Mosaics

Magritte Stripes

time

Page 24: 02 Fall09 Lecture Sept18web


t

Page 25: 02 Fall09 Lecture Sept18web

Range Camera Demo

Page 26: 02 Fall09 Lecture Sept18web
Page 27: 02 Fall09 Lecture Sept18web

http://www.flickr.com/photos/pgoyette/107849943/in/photostream/

Page 28: 02 Fall09 Lecture Sept18web

Scheimpflug principle

Page 29: 02 Fall09 Lecture Sept18web

Plan

• Lenses
– Point spread function

• Lightfields
– What are they?
– What are the properties?
– How to capture?
– What are the applications?

Page 30: 02 Fall09 Lecture Sept18web

• Format
– 4 (3) Assignments
• Hands on with optics, illumination, sensors, masks
• Rolling schedule for overlap
• We have cameras, lenses, electronics, projectors etc
• Vote on best project
– Mid-term exam
• Tests concepts
– 1 Final project
• Should be Novel and Cool
• Conference-quality paper
• Award for best project
– Take notes for 1 class
– Lectures (and guest talks)
– In-class + online discussion

• If you are a listener
– Participate in online discussion, dig up new recent work
– Present one short 15-minute idea or new work

• Credit
• Assignments: 40%
• Project: 30%
• Mid-term: 20%
• Class participation: 10%

• Pre-reqs
• Helpful: linear algebra, image processing, thinking in 3D
• We will try to keep math to essentials, but expect complex concepts

Page 31: 02 Fall09 Lecture Sept18web

Assignments:
You are encouraged to program in Matlab for image analysis.
You may need to use C++/OpenGL/visual programming for some hardware assignments.

Each student is expected to prepare notes for one lecture. These notes should be prepared and emailed to the instructor no later than the following Monday night (midnight EST). Revisions and corrections will be exchanged by email, and after changes the notes will be posted to the website before class the following week. 5 points.

Course mailing list: Please make sure that your email id is on the course mailing list.
Send email to raskar (at) media.mit.edu

Please fill in the email/credit/dept sheet.

Office hours:
Email is the best way to get in touch.
Ramesh: raskar (at) media.mit.edu
Ankit: ankit (at) media.mit.edu

After class:
Muddy Charles Pub (Walker Memorial, next to the tennis courts)

Page 32: 02 Fall09 Lecture Sept18web

2 Sept 18th: Modern Optics and Lenses, Ray-matrix operations

3 Sept 25th: Virtual Optical Bench, Lightfield Photography, Fourier Optics, Wavefront Coding

4 Oct 2nd: Digital Illumination, Hadamard Coded and Multispectral Illumination

5 Oct 9th: Emerging Sensors: High speed imaging, 3D range sensors, Femto-second concepts, Front/back illumination, Diffraction issues

6 Oct 16th: Beyond Visible Spectrum: Multispectral imaging and Thermal sensors, Fluorescent imaging, 'Audio camera'

7 Oct 23rd: Image Reconstruction Techniques: Deconvolution, Motion and Defocus Deblurring, Tomography, Heterodyned Photography, Compressive Sensing

8 Oct 30th: Cameras for Human Computer Interaction (HCI): 0-D and 1-D sensors, Spatio-temporal coding, Frustrated TIR, Camera-display fusion

9 Nov 6th: Useful techniques in Scientific and Medical Imaging: CT-scans, Strobing, Endoscopes, Astronomy and Long range imaging

10 Nov 13th: Mid-term Exam; Mobile Photography, Video Blogging, Life logs and Online Photo collections

11 Nov 20th: Optics and Sensing in Animal Eyes: What can we learn from successful biological vision systems?

12 Nov 27th: Thanksgiving Holiday (No Class)

13 Dec 4th: Final Projects

Page 33: 02 Fall09 Lecture Sept18web

• What are annoyances in photography?

• Why does a CCD camera behave retroreflectively?

• Youtube videos on camera tutorials (DoF etc): http://www.youtube.com/user/MPTutor

Page 34: 02 Fall09 Lecture Sept18web

Anti-Paparazzi Flash

The anti-paparazzi flash: 1. The celebrity prey. 2. The lurking photographer. 3. The offending camera is detected and then bombed with a beam of light. 4. Voila! A blurry image of nothing much.

Page 35: 02 Fall09 Lecture Sept18web

• Anti-Paparazzi Flash

Retroreflective CCD of cellphone camera

Preventing Camera Recording by Designing a Capture-Resistant Environment. Khai N. Truong, Shwetak N. Patel, Jay W. Summet, and Gregory D. Abowd. Ubicomp 2005

Page 36: 02 Fall09 Lecture Sept18web

Auto Focus

• Contrast method: compares the contrast of images at three depths; if in focus, the image will have high contrast, else not

• Phase method: compares two parts of the lens at the sensor plane; if in focus, the entire exit pupil sees a uniform color, else not

– assumes the object has a diffuse BRDF
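The contrast method can be sketched with a simple sharpness score: mean squared gradient magnitude, maximized over a focus sweep. The blur model and image sizes below are hypothetical stand-ins for frames captured at different focus settings:

```python
import numpy as np

def contrast_score(img):
    """Sharpness measure for contrast-based autofocus: mean squared
    gradient magnitude. A sharper (in-focus) image has stronger local
    gradients, so the focus position maximizing this score wins."""
    gy, gx = np.gradient(img.astype(np.float64))
    return float(np.mean(gx ** 2 + gy ** 2))

def box_blur(img, k):
    """Separable box blur as a crude stand-in for defocus."""
    if k <= 1:
        return img
    kernel = np.ones(k) / k
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, 'same'), 0, tmp)

rng = np.random.default_rng(0)
sharp = rng.random((32, 32))                     # textured scene
stack = [box_blur(sharp, k) for k in (7, 1, 5)]  # middle frame is in focus
best = int(np.argmax([contrast_score(im) for im in stack]))
```

A real camera evaluates this score over a window while stepping the lens, stopping at the peak.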

Page 37: 02 Fall09 Lecture Sept18web

Final Project Ideas

• User interaction device
– Camera based
– Illumination based
– Photodetector or line-scan camera

• Capture the invisible
– Tomography for internals
– Structured light for 3D scanning
– Fluorescence for transparent materials

• Cameras in different EM/other spectrum
– Wifi, audio, magnetic, haptic, capacitive
– Visible/Thermal IR segmentation
– Thermal IR (emotion detection, motion detector)
– Multispectral camera, discriminating (camel vs sand)

• Illumination
– Multi-flash with lightfield
– Schlieren photography
– Strobing and colored strobing

• External non-imaging sensor
– Camera with gyro/movement sensors, find identity of user
– Cameras with GPS and online geo-tagged photo collections
– Interaction between two cameras (with lasers on-board)

• Optics
– Lightfield
– Coded aperture
– Bio-inspired vision

• Time
– Time-lapse photos
– Motion blur

Page 38: 02 Fall09 Lecture Sept18web

Kitchen Sink: Volumetric Scattering

Direct Global

Volumetric Scattering:

Chandrasekhar 1950, Ishimaru 1978

Page 39: 02 Fall09 Lecture Sept18web

“Origami Lens”: Thin Folded Optics (2007)

"Ultrathin Cameras Using Annular Folded Optics," E. J. Tremblay, R. A. Stack, R. L. Morrison, J. E. Ford, Applied Optics, 2007 (OSA)

Slides by Shree Nayar

Page 40: 02 Fall09 Lecture Sept18web

Origami Lens

Conventional Lens

Origami Lens

Slides by Shree Nayar

Page 41: 02 Fall09 Lecture Sept18web

Fernald, Science [Sept 2006]

Shadow / Refractive / Reflective

Tools for Visual Computing

Page 42: 02 Fall09 Lecture Sept18web

Photonic Crystals

• 'Routers' for photons instead of electrons

• Photonic Crystal
– Nanostructured material with an ordered array of holes
– A lattice of high-RI material embedded within a lower RI
– High index contrast
– 2D or 3D periodic structure

• Photonic band gap
– Highly periodic structures that block certain wavelengths
– (creates a 'gap' or notch in wavelength)

• Applications
– 'Semiconductors for light': mimics the silicon band gap for electrons
– Highly selective/rejecting narrow wavelength filters (Bayer mosaic?)
– Light-efficient LEDs
– Optical fibers with extreme bandwidth (wavelength multiplexing)
– Hype: future terahertz CPUs via optical communication on chip

Page 43: 02 Fall09 Lecture Sept18web

Schlieren Photography

• Images small index-of-refraction gradients in a gas
• Invisible to the human eye (subtle mirage effect)

Knife edge blocks half the light unless the distorted beam focuses imperfectly

Collimated Light

Camera

Page 44: 02 Fall09 Lecture Sept18web

http://www.mne.psu.edu/psgdl/FSSPhotoalbum/index1.htm

Page 45: 02 Fall09 Lecture Sept18web

Sample Final Projects

• Schlieren Photography
– (Best project award + Prize in 2008)

• Camera array for Particle Image Velocimetry

• BiDirectional Screen

• Looking Around a Corner (theory)

• Tomography machine

• ...

Page 46: 02 Fall09 Lecture Sept18web

Computational Illumination

• Dual Photography

• Direct-Global Separation

• Multi-flash Camera

Page 47: 02 Fall09 Lecture Sept18web

Ramesh Raskar, Computational Illumination

Computational Illumination

Page 48: 02 Fall09 Lecture Sept18web

Computational Photography

Diagram: Illumination, 4D Light Field, Novel Cameras (Generalized Optics, Generalized Sensor, Processing)

Page 49: 02 Fall09 Lecture Sept18web

Computational Illumination

Diagram: Novel Illumination (Light Sources, Modulators, Generalized Optics) produces a Programmable 4D Illumination field + time + wavelength; Novel Cameras (Generalized Optics, Generalized Sensor, Processing) capture the resulting 4D Light Field.

Page 50: 02 Fall09 Lecture Sept18web

Edgerton 1930s

Not Special Cameras but Special Lighting

Page 51: 02 Fall09 Lecture Sept18web

Edgerton 1930s

Multi-flash Sequential Photography

Stroboscope (Electronic Flash)

Shutter Open

Flash Time

Page 52: 02 Fall09 Lecture Sept18web

‘Smarter’ Lighting Equipment

What Parameters Can We Change?

Page 53: 02 Fall09 Lecture Sept18web

Computational Illumination:
Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash

• Light position
– Relighting: Programmable dome
– Shape enhancement: Multi-flash for depth edges

• Light color/wavelength

• Spatial Modulation
– Synthetic Aperture Illumination

• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG

• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

Page 54: 02 Fall09 Lecture Sept18web

Multi-flash Camera for Detecting Depth Edges

Page 55: 02 Fall09 Lecture Sept18web

Ramesh Raskar, Karhan Tan, Rogerio Feris, Jingyi Yu, Matthew Turk

Mitsubishi Electric Research Labs (MERL), Cambridge, MA
U of California at Santa Barbara
U of North Carolina at Chapel Hill

Non-photorealistic Camera: Depth Edge Detection and Stylized Rendering using Multi-Flash Imaging

Page 56: 02 Fall09 Lecture Sept18web
Page 57: 02 Fall09 Lecture Sept18web

Car Manuals

Page 58: 02 Fall09 Lecture Sept18web

What are the problems with a ‘real’ photo in conveying information?

Why do we hire artists to draw what can be photographed?

Page 59: 02 Fall09 Lecture Sept18web

Shadows

Clutter

Many Colors

Highlight Shape Edges

Mark moving parts

Basic colors

Page 60: 02 Fall09 Lecture Sept18web

Shadows

Clutter

Many Colors

Highlight Edges

Mark moving parts

Basic colors

A New Problem

Page 61: 02 Fall09 Lecture Sept18web

Gestures

Input Photo Canny Edges Depth Edges

Page 62: 02 Fall09 Lecture Sept18web

Depth Edges with MultiFlash: Raskar, Tan, Feris, Jingyi Yu, Turk, ACM SIGGRAPH 2004

Page 63: 02 Fall09 Lecture Sept18web
Page 64: 02 Fall09 Lecture Sept18web
Page 65: 02 Fall09 Lecture Sept18web
Page 66: 02 Fall09 Lecture Sept18web
Page 67: 02 Fall09 Lecture Sept18web

Depth Discontinuities

Internal and external shape boundaries, occluding contours, silhouettes

Page 68: 02 Fall09 Lecture Sept18web

Depth Edges

Page 69: 02 Fall09 Lecture Sept18web

Our Method vs. Canny

Page 70: 02 Fall09 Lecture Sept18web

Canny Intensity Edge Detection

Our Method

Photo Result

Page 71: 02 Fall09 Lecture Sept18web
Page 72: 02 Fall09 Lecture Sept18web
Page 73: 02 Fall09 Lecture Sept18web

Mitsubishi Electric Research Labs, Raskar, Tan, Feris, Yu, Turk: MultiFlash NPR Camera

Imaging Geometry

Shadow lies along epipolar ray

Page 74: 02 Fall09 Lecture Sept18web


Shadow lies along epipolar ray,

Epipole and Shadow are on opposite sides of the edge

Imaging Geometry


Page 75: 02 Fall09 Lecture Sept18web


Shadow lies along epipolar ray,

Shadow and epipole are on opposite sides of the edge

Imaging Geometry


Page 76: 02 Fall09 Lecture Sept18web


Depth Edge Camera

Light epipolar rays are horizontal or vertical

Page 77: 02 Fall09 Lecture Sept18web


Normalized

Left / Max

Right / Max

Left Flash

Right Flash

Input U{depth edges}

Page 78: 02 Fall09 Lecture Sept18web


Normalized

Left / Max

Right / Max

Left Flash

Right Flash

Input U{depth edges}

Page 79: 02 Fall09 Lecture Sept18web


Normalized

Left / Max

Right / Max

Left Flash

Right Flash

Input U{depth edges}

Negative transition along epipolar ray is depth edge

Plot

Page 80: 02 Fall09 Lecture Sept18web


Normalized

Left / Max

Right / Max

Left Flash

Right Flash

Input

Negative transition along epipolar ray is depth edge

Plot U{depth edges}

Page 81: 02 Fall09 Lecture Sept18web


% Max composite (Matlab's max takes two arrays at a time)
maximg = max( max(left, right), max(top, bottom) );

% Normalize by computing ratio images
r1 = left ./ maximg;  r2 = top ./ maximg;
r3 = right ./ maximg; r4 = bottom ./ maximg;

% Compute confidence map
v = fspecial( 'sobel' ); h = v';
d1 = imfilter( r1, v ); d3 = imfilter( r3, v ); % vertical sobel
d2 = imfilter( r2, h ); d4 = imfilter( r4, h ); % horizontal sobel

% Keep only the transitions with the correct sign for each flash direction
silhouette1 = d1 .* (d1 > 0);
silhouette2 = abs( d2 .* (d2 < 0) );
silhouette3 = abs( d3 .* (d3 < 0) );
silhouette4 = d4 .* (d4 > 0);

% Pick max confidence at each pixel
confidence = max( max(silhouette1, silhouette2), max(silhouette3, silhouette4) );
imwrite( confidence, 'confidence.bmp' );

No magic parameters!

Page 82: 02 Fall09 Lecture Sept18web

Depth Edges

Left Top Right Bottom

Depth Edges vs. Canny Edges

Page 83: 02 Fall09 Lecture Sept18web

Gestures

Input Photo Canny Edges Depth Edges

Page 84: 02 Fall09 Lecture Sept18web
Page 85: 02 Fall09 Lecture Sept18web
Page 86: 02 Fall09 Lecture Sept18web

Flash Matting

Flash Matting, Jian Sun, Yin Li, Sing Bing Kang, and Heung-Yeung Shum, Siggraph 2006

Page 87: 02 Fall09 Lecture Sept18web
Page 88: 02 Fall09 Lecture Sept18web

Multi-light Image Collection [Fattal, Agrawala, Rusinkiewicz], SIGGRAPH 2007

Input Photos

Shadow-free, enhanced surface detail, but flat look

Some shadows for depth, but lost visibility

Page 89: 02 Fall09 Lecture Sept18web

Multiscale decomposition using the bilateral filter; combine detail at each scale across all the input images.

Fuse the maximum gradient from each photo; reconstruct by 2D integration.

Enhanced shadows

Page 90: 02 Fall09 Lecture Sept18web

Computational Illumination:
Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)

• Light position
– Relighting: Programmable dome
– Shape enhancement: Multi-flash for depth edges

• Light color/wavelength

• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination

• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG

• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

Page 91: 02 Fall09 Lecture Sept18web

Los Angeles, CA August 2, 2005 Pradeep Sen

Dual Photography

Pradeep Sen, Billy Chen, Gaurav Garg, Steve Marschner, Mark Horowitz, Marc Levoy, Hendrik Lensch

Stanford University / Cornell University

Page 92: 02 Fall09 Lecture Sept18web


The card experiment

book

camera

card

projector

primal

Page 93: 02 Fall09 Lecture Sept18web


The card experiment

primal

dual

Page 94: 02 Fall09 Lecture Sept18web


Overview of dual photography

standard photographfrom camera

dual photographfrom projector

Page 95: 02 Fall09 Lecture Sept18web


Outline

1. Introduction to dual photography

2. Application to scene relighting

3. Accelerating acquisition

4. Conclusions

Page 96: 02 Fall09 Lecture Sept18web


Helmholtz reciprocity

Diagram: by Helmholtz reciprocity, the light source and the eye can be interchanged; a ray carries the same intensity I in the primal and dual directions (scene, projector, photosensor).

Page 97: 02 Fall09 Lecture Sept18web


Helmholtz reciprocity

Diagram: swapping the projector and the photosensor (camera) around the scene converts the primal configuration into the dual.

Page 98: 02 Fall09 Lecture Sept18web


Forming a dual image

Diagram: the projector scans the scene while a photosensor records one value C0 … C7 per projector pixel (primal / dual).

Page 99: 02 Fall09 Lecture Sept18web


Forming a dual image

Diagram: the recorded values C0 … C7 are assembled into the dual image, as seen from the projector's point of view.

Page 100: 02 Fall09 Lecture Sept18web


Physical demonstration

Projector was scanned across a scene while a photosensor measured the outgoing light

photosensor

resulting dual image

Page 101: 02 Fall09 Lecture Sept18web


Related imaging methods

Example of a “flying-spot” camera built at the dawn of TV (Baird 1926)

Scanning electron microscope

Velcro® at 35x magnification, Museum of Science, Boston

Page 102: 02 Fall09 Lecture Sept18web


Dual photography for relighting

Diagram: the projector (p x q pixels) and camera (m x n pixels) of the primal setup act as a dual camera and dual projector; with a projector and photosensor the transport becomes 4D.

Page 103: 02 Fall09 Lecture Sept18web


Mathematical notation

Diagram: scene with projector (p x q pixels) and camera (m x n pixels)

Projector pattern P: pq x 1; camera image C: mn x 1; transport matrix T: mn x pq (primal)

Page 104: 02 Fall09 Lecture Sept18web


Mathematical notation

Primal: C (mn x 1) = T (mn x pq) P (pq x 1)

Page 105: 02 Fall09 Lecture Sept18web


Mathematical notation

C = T P with P = [1 0 0 0 0 0]^T: lighting a single projector pixel captures one column of T

Page 106: 02 Fall09 Lecture Sept18web


Mathematical notation

C = T P with P = [0 1 0 0 0 0]^T: the next projector pixel captures the next column of T

Page 107: 02 Fall09 Lecture Sept18web


Mathematical notation

C = T P with P = [0 0 1 0 0 0]^T: scanning all pq projector pixels measures the full transport matrix T

Page 108: 02 Fall09 Lecture Sept18web


Mathematical notation

C (mn x 1) = T (mn x pq) P (pq x 1)

little interreflection → sparse matrix
many interreflections → dense matrix

Page 109: 02 Fall09 Lecture Sept18web


Mathematical notation

Primal space: C (mn x 1) = T (mn x pq) P (pq x 1)

Dual space: P' (pq x 1) = T'' (pq x mn) C' (mn x 1), for some unknown T''

By Helmholtz reciprocity, the transport from projector pixel j to camera pixel i equals the transport in the reverse direction: T''[j,i] = T[i,j], so T'' = T^T

Page 110: 02 Fall09 Lecture Sept18web


Definition of dual photography

Primal space: C (mn x 1) = T (mn x pq) P (pq x 1)

Dual space: P' (pq x 1) = T^T (pq x mn) C' (mn x 1)
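The definition is easy to check numerically on a toy transport matrix (the sizes and values below are arbitrary; a real T would be measured by scanning projector pixels):

```python
import numpy as np

# Toy sketch of the dual-photography relation: if C = T @ P maps a
# projector pattern P to a camera image C, then by Helmholtz reciprocity
# the dual image "photographed from the projector" under dual
# illumination C_d is T.T @ C_d.
rng = np.random.default_rng(1)
pq, mn = 6, 4                      # tiny projector / camera resolutions
T = rng.random((mn, pq)) * 0.1     # assumed (measured) transport matrix

P = np.zeros(pq); P[2] = 1.0       # light a single projector pixel
C = T @ P                          # primal photograph = column 2 of T

C_d = np.zeros(mn); C_d[1] = 1.0   # dual: "light" a single camera pixel
P_d = T.T @ C_d                    # dual photograph = row 1 of T
```

Relighting falls out of the same relation: once T is known, any new projector pattern P gives the camera image T @ P without re-photographing the scene.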

Page 111: 02 Fall09 Lecture Sept18web


Sample results

primal dual

Page 112: 02 Fall09 Lecture Sept18web


Sample results

primal dual

Page 113: 02 Fall09 Lecture Sept18web


Scene relighting

Knowing the pixel-to-pixel transport between the projector and the camera allows us to relight the scene with an arbitrary 2D pattern

primal dual

Page 114: 02 Fall09 Lecture Sept18web


Photosensor experiment

Primal space: C (1 x 1) = T (1 x pq) P (pq x 1), with the camera replaced by a single photosensor

Dual space: P' (pq x 1) = T^T (pq x 1) C' (1 x 1)

Page 115: 02 Fall09 Lecture Sept18web


2D relighting videos

Relighting book scene with animated patterns

Relighting box with animated pattern

Page 116: 02 Fall09 Lecture Sept18web


Relighting with 4D incident light fields

2D 4D

6D Transport

Page 117: 02 Fall09 Lecture Sept18web


From Masselus et al. SIGGRAPH ‘03

Relighting with 4D incident light fields

Page 118: 02 Fall09 Lecture Sept18web


Relighting with 4D incident light fields

2D 4D

6D Transport

Page 119: 02 Fall09 Lecture Sept18web


Relighting with 4D incident light fields

2D 4D

6D Transport

Page 120: 02 Fall09 Lecture Sept18web


Advantages of our dual framework

Acquisition of transport from multiple projectors cannot be parallelized; acquisition of transport using multiple cameras can!

Page 121: 02 Fall09 Lecture Sept18web


“Multiple” cameras with mirror array

Page 122: 02 Fall09 Lecture Sept18web


Conclusions

Dual photography is a novel imaging technique that allows us to interchange the camera and the projector

Dual photography can accelerate acquisition of the 6D transport for relighting with 4D incident light fields

We developed an algorithm to accelerate the acquisition of the transport matrix for dual photography

Page 123: 02 Fall09 Lecture Sept18web


The card experiment

book

camera

card

projector

primal

Page 124: 02 Fall09 Lecture Sept18web


The card experiment

primal

dual

Page 125: 02 Fall09 Lecture Sept18web


Photosensor experiment

Primal space: C (1 x 1) = T (1 x pq) P (pq x 1), with the camera replaced by a single photosensor

Dual space: P' (pq x 1) = T^T (pq x 1) C' (1 x 1)

Page 126: 02 Fall09 Lecture Sept18web

Computational Illumination:
Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)

• Light position
– Relighting: Programmable dome
– Shape enhancement: Multi-flash for depth edges

• Light color/wavelength

• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination

• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG

• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

Page 127: 02 Fall09 Lecture Sept18web

Visual Chatter in the Real World

Shree K. Nayar

Computer Science, Columbia University

With: Guru Krishnan, Michael Grossberg, Ramesh Raskar

Eurographics Rendering Symposium, June 2006, Nicosia, Cyprus

Support: ONR

Page 128: 02 Fall09 Lecture Sept18web

Direct and Global Illumination

A : Direct
B : Interreflection
C : Subsurface scattering
D : Volumetric scattering (participating medium)
E : Diffusion (translucent surface)

Page 129: 02 Fall09 Lecture Sept18web

Direct Global

Shower Curtain: Diffuser

Page 130: 02 Fall09 Lecture Sept18web

L[c, i] = L_d[c, i] + L_g[c, i]

radiance = direct + global

Direct and Global Components: Interreflections

L_g[c, i] = Σ_j A[i, j] L[i, j]

A[i, j] encodes the BRDF and geometry between surface patches i and j.

Page 131: 02 Fall09 Lecture Sept18web

High Frequency Illumination Pattern

α = fraction of activated source elements

Lit patch:  L⁺[c, i] = L_d[c, i] + α L_g[c, i]

Page 132: 02 Fall09 Lecture Sept18web

High Frequency Illumination Pattern

α = fraction of activated source elements

Lit patch:    L⁺[c, i] = L_d[c, i] + α L_g[c, i]
Unlit patch:  L⁻[c, i] = (1 − α) L_g[c, i]

Page 133: 02 Fall09 Lecture Sept18web

Separation from Two Images

For α = 1/2:

L_d = L_max − L_min   (direct)
L_g = 2 L_min         (global)
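The two-image separation can be checked on a synthetic scene. This is a sketch under the stated assumptions (α = 1/2, every scene point directly lit by exactly one of the two complementary checker patterns); all array names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
h, w = 4, 6

# Synthetic ground truth (hypothetical scene; real inputs would be two
# photos taken under a checker pattern and its complement).
L_d_true = rng.random((h, w)) + 0.01   # direct component, strictly positive
L_g_true = rng.random((h, w))          # global component

# High-frequency checkerboard: alpha = 1/2 of the source is active, and
# each scene point receives direct light in exactly one of the two patterns.
checker = (np.indices((h, w)).sum(axis=0) % 2).astype(float)

img1 = checker * L_d_true + 0.5 * L_g_true           # pattern
img2 = (1.0 - checker) * L_d_true + 0.5 * L_g_true   # complement

# Two-image separation: L_d = L_max - L_min, L_g = 2 L_min.
L_max = np.maximum(img1, img2)
L_min = np.minimum(img1, img2)
L_d = L_max - L_min
L_g = 2.0 * L_min
```

Because every pixel is lit in exactly one image, the per-pixel maximum is L_d + L_g/2 and the minimum is L_g/2, so the recovered components match the ground truth.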

Page 134: 02 Fall09 Lecture Sept18web

Other Global Effects: Volumetric Scattering


Page 135: 02 Fall09 Lecture Sept18web

Diffuse Interreflections

SpecularInterreflections

Volumetric Scattering Subsurface

Scattering

Diffusion

Page 136: 02 Fall09 Lecture Sept18web

Scene

Direct Global

Page 137: 02 Fall09 Lecture Sept18web

F G C A D B E

A: Diffuse Interreflection (Board)

B: Specular Interreflection (Nut)

C: Subsurface Scattering (Marble)

D: Subsurface Scattering (Wax)

E: Translucency (Frosted Glass)

F: Volumetric Scattering (Dilute Milk)

G: Shadow (Fruit on Board)

Verification

Page 138: 02 Fall09 Lecture Sept18web
Page 139: 02 Fall09 Lecture Sept18web

V-Grooves: Diffuse Interreflections

Direct Global

concave convex

Psychophysics:

Gilchrist 79, Bloj et al. 04

Page 140: 02 Fall09 Lecture Sept18web

Mirror Ball: Failure Case

Direct Global

Page 141: 02 Fall09 Lecture Sept18web

Real World Examples:

Can You Guess the Images?

Page 142: 02 Fall09 Lecture Sept18web

Eggs: Diffuse Interreflections

Direct Global

Page 143: 02 Fall09 Lecture Sept18web

Stick

Building Corner

Shadow

L_d = L_max − L_min   (direct)
L_g = 2 L_min         (global)

3D from Shadows:

Bouguet and Perona 99

Page 144: 02 Fall09 Lecture Sept18web

Direct Global

Building Corner

Page 145: 02 Fall09 Lecture Sept18web

Shower Curtain: Diffuser

Shadow

Mesh

L_d = L_max − L_min   (direct)
L_g = 2 L_min         (global)

Page 146: 02 Fall09 Lecture Sept18web

Direct Global

Shower Curtain: Diffuser

Page 147: 02 Fall09 Lecture Sept18web
Page 148: 02 Fall09 Lecture Sept18web

Kitchen Sink: Volumetric Scattering

Direct Global

Volumetric Scattering:

Chandrasekhar 50, Ishimaru 78

Page 149: 02 Fall09 Lecture Sept18web

Novel Image

Page 150: 02 Fall09 Lecture Sept18web

Real or Fake ?

Direct Global

Page 151: 02 Fall09 Lecture Sept18web

Hand

Direct Global

Skin: Hanrahan and Krueger 93,

Uchida 96, Haro 01, Jensen et al. 01,

Igarashi et al. 05, Weyrich et al. 05

Page 152: 02 Fall09 Lecture Sept18web

Hands: African American Female, Chinese Male, Spanish Male

Direct Global

Page 153: 02 Fall09 Lecture Sept18web

Pink Carnation

Global Direct

Spectral Bleeding: Funt et al. 91

Page 154: 02 Fall09 Lecture Sept18web
Page 155: 02 Fall09 Lecture Sept18web

Summary

• Fast and Simple Separation Method

• Wide Variety of Global Effects

• No Prior Knowledge of Material Properties

• Implications:

• Generation of Novel Images

• Enhance Computer Vision Methods

• Insights into Properties of Materials

Page 156: 02 Fall09 Lecture Sept18web

(Montage: for each scene, direct + global = original photograph.)

www.cs.columbia.edu/CAVE

Page 157: 02 Fall09 Lecture Sept18web

Computational Illumination:
Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)

• Light position
– Relighting: Programmable dome
– Shape enhancement: Multi-flash for depth edges

• Light color/wavelength

• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination

• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG

• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare

Page 158: 02 Fall09 Lecture Sept18web

Day of the year

Time of day

Page 159: 02 Fall09 Lecture Sept18web

The Archive of Many Outdoor Scenes (AMOS)

Images from ~1000 static webcams, every 30 minutes since March 2006.

Variations over a year and over a day

Jacobs, Roman, and Pless, WUSTL, CVPR 2007

Page 160: 02 Fall09 Lecture Sept18web

Analysing Time Lapse Images

• PCA
– Linear variations due to lighting and seasonal variation

• Decompose (by time scale)
– Hour: haze and cloud for depth
– Day: changing lighting directions for surface orientation
– Year: effects of changing seasons highlight vegetation

• Applications:
– Scene segmentation
– Global webcam localization: correlate time-lapse video over a month from an unknown camera with:
• sunrise + sunset (localization accuracy ~50 miles)
• known nearby cameras (~25 miles)
• satellite images (~15 miles)

Mean image + 3 components from a time lapse of downtown St. Louis over the course of 2 hours
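The PCA step can be sketched on a synthetic time lapse. The frame count, lighting "modes", and variable names below are illustrative stand-ins, not the AMOS webcam data.

```python
import numpy as np

rng = np.random.default_rng(2)
n_frames, n_pixels = 50, 100  # hypothetical time lapse, frames flattened to rows

# Toy time lapse: a mean image plus two lighting modes whose strength
# varies over the day (one oscillating, one drifting).
t = np.linspace(0.0, 1.0, n_frames)
mean_img = rng.random(n_pixels)
mode1, mode2 = rng.standard_normal((2, n_pixels))
frames = mean_img + np.outer(np.sin(2 * np.pi * t), mode1) + np.outer(t, mode2)

# PCA via SVD of the mean-centered frame matrix: rows of Vt are spatial
# components ("eigen-images"); U * S gives each frame's coefficients,
# i.e. how strongly each component appears over time.
X = frames - frames.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
components = Vt[:3]
coeffs = U[:, :3] * S[:3]
```

For this rank-2 toy data the first two components capture essentially all the variation, mirroring how a few components explain most lighting change in real webcam sequences.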

Page 161: 02 Fall09 Lecture Sept18web

2-Hour Time Lapse in St. Louis: Depth from Co-varying Regions

Page 162: 02 Fall09 Lecture Sept18web

Surface Orientation

False Color PCA images

Page 163: 02 Fall09 Lecture Sept18web

Mitsubishi Electric Research Laboratories Image Fusion for Context Enhancement Raskar, Ilie, Yu

Image Fusion for Context Enhancement and Video Surrealism

Ramesh Raskar, Adrian Ilie, Jingyi Yu

Page 164: 02 Fall09 Lecture Sept18web

Dark Bldgs

Reflections on bldgs

Unknown shapes

Page 165: 02 Fall09 Lecture Sept18web

‘Well-lit’ Bldgs

Reflections in bldgs windows

Tree, Street shapes

Page 166: 02 Fall09 Lecture Sept18web

Background is captured from the day-time scene using the same fixed camera

Night Image

Day Image

Context Enhanced Image

http://web.media.mit.edu/~raskar/NPAR04/
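A minimal sketch of day/night fusion: keep the bright, informative night pixels and borrow day-time context elsewhere. The actual paper blends in the gradient domain; this soft per-pixel mask, and all the names below, are a simplified hypothetical stand-in.

```python
import numpy as np

rng = np.random.default_rng(3)
h, w = 4, 5

# Stand-ins for registered day and night frames from the same fixed
# camera (real inputs would be loaded, aligned photographs).
day = rng.random((h, w))
night = rng.random((h, w))

# Importance mask: favor night pixels where they are bright (lit windows,
# streetlights) and fall back on the day image in dark regions.
sigma = 0.1
mask = 1.0 / (1.0 + np.exp(-(night - 0.5) / sigma))  # soft threshold at 0.5

# Per-pixel convex blend of the two exposures.
fused = mask * night + (1.0 - mask) * day
```

Because the mask lies in (0, 1), every fused pixel stays between the day and night values; gradient-domain blending, as in the paper, additionally hides the seams this simple mask can leave.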

Page 167: 02 Fall09 Lecture Sept18web

Ramesh Raskar, CompPhoto

Factored Time Lapse Video

Factor into shadow, illumination, and reflectance. Relight, recover surface normals, reflectance editing.

Sunkavalli, Matusik, Pfister, Rusinkiewicz, SIGGRAPH ’07

Page 168: 02 Fall09 Lecture Sept18web

Page 169: 02 Fall09 Lecture Sept18web

Computational Illumination:
Programmable 4D Illumination Field + Time + Wavelength

• Presence or Absence, Duration, Brightness
– Flash/No-flash (matting for foreground/background)

• Light position
– Relighting: Programmable dome
– Shape enhancement: Multi-flash for depth edges

• Light color/wavelength

• Spatial Modulation
– Dual Photography, Direct/Global Separation, Synthetic Aperture Illumination

• Temporal Modulation
– TV remote, Motion Tracking, Sony ID-cam, RFIG

• Exploiting (uncontrolled) natural lighting conditions
– Day/Night Fusion, Time Lapse, Glare