
Moveable Interactive Projected Displays Using Projector Based Tracking

1 Carnegie Mellon University   2 Mitsubishi Electric Research Labs   3 Georgia Tech University

Seattle, WA UIST 2005

Johnny C. Lee 1,2

Scott E. Hudson 1

Jay W. Summet 3

Paul H. Dietz 2

UIST 2004 – Automatic Projector Calibration

1. Embedded light sensor in surface.

2. Project patterns to find sensor locations.

3. Pre-warp source image to fit surface.

(video clip 1)
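As a rough illustration of step 3, a minimal sketch of the pre-warp, assuming the four light sensors sit at the corners of the surface; OpenCV is used purely for illustration and none of these names come from the talk:

```python
# Minimal sketch of pre-warping the source image to fit the surface.
# Assumes four light sensors at the surface corners whose positions have been
# recovered in projector coordinates by the projected patterns.
import numpy as np
import cv2

def prewarp(source_img, corner_px, projector_size=(800, 600)):
    """corner_px: sensor positions found by the patterns, ordered top-left,
    top-right, bottom-right, bottom-left, in projector pixels."""
    h, w = source_img.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])   # source image corners
    dst = np.float32(corner_px)                          # measured sensor corners
    H = cv2.getPerspectiveTransform(src, dst)            # 4-point homography
    return cv2.warpPerspective(source_img, H, projector_size)
```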

Correspondence between the location data and the projected image is free (i.e. no calibration with an external tracking system is needed)

Transforms passive surfaces into active displays in a practical manner.

A variety of useful applications:

• Touch Calibration

• Everywhere Displays, IBM

• Shader Lamps, MERL/UNC

• Diamond Touch, MERL

• Interactive Whiteboard

• Projector-based AR

Focus on Moveable Projected Displays

Goals of this work:
1. Achieve interactive tracking rates for hand-held surfaces.
2. Reduce the perceptibility of the location discovery patterns.
3. Explore interaction techniques supported by this approach.

Display Surface

Constructed from foam core and paper

Touch-sensitivity is provided by a resistive film

Lighter than a legal pad

Tablet PC-like Interaction

Video clip 2

Talk Outline

• Reducing Perceptibility

• Achieving Interactive Rates

• Pattern Size and Shape

• Tracking Loss

• Interaction Techniques/Demos


Gray Code Patterns

Black and White

Difference is visible to the human eye


Gray Code Patterns

Black and White vs. Frequency Shift Keyed (FSK)

[Figure: the same Gray code stripes rendered as black-and-white regions and as high-frequency (HF) / low-frequency (LF) regions]

Black and white: the difference is visible to the human eye.

FSK: the difference is NOT visible to the human eye.

FSK Transmission of Patterns

FSK transmission of the Gray code patterns makes the striped region boundaries invisible to the human eye.

Patterns appear to be solid grey squares to observers.

The light sensor is able to demodulate the HF and LF regions into 0's and 1's.

This is accomplished using a modified DLP projector.
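As a sketch of what the light sensor does with one FSK-encoded bit (illustrative only; the 180Hz/360Hz carriers come from the modified-projector slide below, while the sample rate and thresholding are assumptions):

```python
# Sketch of demodulating one Gray-code bit from the FSK carrier on the sensor
# side (not the authors' PIC firmware). Assumes the two "grey" levels arrive
# as 180Hz and 360Hz square waves and that the photodetector is sampled well
# above those frequencies.
import numpy as np

SAMPLE_RATE = 10_000   # Hz, assumed ADC rate
FRAME_TIME = 1 / 60    # s, one projected frame per Gray-code bit

def demodulate_bit(samples: np.ndarray, threshold: float) -> int:
    """Classify one bit period as LF (0) or HF (1) by counting edges."""
    binary = (samples > threshold).astype(np.int8)
    edges = np.count_nonzero(np.diff(binary))    # rising + falling edges
    # 360Hz over a 1/60 s frame gives ~12 edges, 180Hz gives ~6;
    # split halfway between the two expected counts.
    return 1 if edges > 9 else 0
```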

Inside a DLP projector

DLP = Digital Light Processing
• Many consumer projectors currently use DLP technology
• "DLP" is Texas Instruments' marketing term for DMD

DMD = Digital Micro-mirror Device
• Each mirror corresponds to a pixel
• Brightness corresponds to the duty cycle of the mirror

Pictures from Texas Instruments literature

Inside a DLP projector

[Diagram: light source → color wheel → DMD → projector lens]

Inside our modified DLP projector

[Diagram: light source → DMD → projector lens; color wheel removed]

FSK Transmission of Location Patterns

• Removing the color wheel flattens the color space of the projector into a monochrome scale

• Multiple points in the former color space now have the same apparent intensity to a human observer, but are manifested by differing signals.

• The patterns formerly known as “red” and “grey” are rendered as 180Hz and 360Hz signals respectively.

• Monochrome projector is not ideal, but is a proof of concept device until we can build a custom DMD projector.

Talk Outline

• Reducing Perceptibility

• Achieving Interactive Rates

• Pattern Size and Shape

• Tracking Loss

• Interaction Techniques/Demos

Projector Specifications

Infocus X1 (~$800 new)
– 800x600 SVGA resolution
– 1 DMD chip
– 60Hz refresh rate

Full-Screen Location Discovery Time:
– 20 images (log2 of the resolution along each axis: 10 for x + 10 for y)
– 333ms at 60Hz
– 3Hz maximum update rate
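A quick sketch of where these numbers come from, assuming one projected image per Gray-code bit of each axis:

```python
# Full-screen location discovery time for the Infocus X1 (arithmetic only).
import math

width, height, refresh_hz = 800, 600, 60
n_images = math.ceil(math.log2(width)) + math.ceil(math.log2(height))  # 10 + 10 = 20
discovery_time = n_images / refresh_hz      # ~0.333 s of patterns per discovery
max_update_rate = 1 / discovery_time        # ~3 Hz
```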

Incremental Tracking

Project small tracking patterns over the last known locations of each sensor for incremental offsets.

Black masks reduce visibility of tracking patterns.

Tracking loss strategies are needed (later).

Smaller area = fewer patterns = faster updates:
– 32x32 unit grid requiring 10 images
– 6Hz update rate
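The same arithmetic for the incremental case, as a sketch:

```python
# Incremental tracking over a 32x32-unit grid (arithmetic only).
import math

grid_side, refresh_hz = 32, 60
n_images = 2 * math.ceil(math.log2(grid_side))   # 5 x-patterns + 5 y-patterns = 10
update_rate = refresh_hz / n_images              # 6 Hz
```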

Tracking Demo

Video clip 3

Latency and Interleaving

Incremental tracking is a tight feedback loop:
project → update → project → update → …

6Hz update rate assumes 100% utilization of the 60 frames/sec the projector can display

System latencies negatively impact channel utilization

Achieving 100% utilization of the projection channel requires taking advantage of the axis-independence of Gray Code patterns.

System Latency – Full X-Y Tracking

[Timeline: the software draws the X-Y patterns, then updates and redraws while the projector displays the X patterns followed by the Y patterns; hardware/OS scheduling and graphics/video latency leave gaps between projection cycles.]

Only 73% utilization of the projection channel.

System Latency - Interleaved Tracking

[Timeline: while the projector displays the X patterns, the software updates and draws the Y patterns, and vice versa, so the projector is never left waiting.]

100% utilization of the projection channel and a 12Hz interleaved update rate.
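A minimal sketch of the interleaved loop; project, draw, and update are hypothetical callbacks standing in for the projector output, pattern rendering, and sensor read-back (in the real system the drawing overlaps the projection; it is shown sequentially here):

```python
# Interleaved tracking sketch: alternate X-axis and Y-axis Gray-code pattern
# sets so the projector is never left waiting for the software.
from itertools import cycle

def interleaved_tracking_loop(project, draw, update, n_cycles=8):
    """With 5 patterns per axis at 60 frames/sec, one per-axis update is
    produced every 5 frames, i.e. a 12Hz interleaved update rate."""
    axes = cycle(["x", "y"])
    axis = next(axes)
    patterns = draw(axis)                  # prepare the first axis's patterns
    for _ in range(n_cycles):
        reading = project(patterns)        # projector busy with this axis...
        next_axis = next(axes)
        next_patterns = draw(next_axis)    # ...while the other axis is drawn
        update(axis, reading)              # apply the decoded per-axis offset
        axis, patterns = next_axis, next_patterns
```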

Talk Outline

• Reducing Perceptibility

• Achieving Interactive Rates

• Pattern Size and Shape

• Tracking Loss

• Interaction Techniques/Demos

Tracking Pattern Size

Tracking Area      Tracking Rate
32x32 grid         12Hz interleaved
16x16 grid         15Hz interleaved
(-75% area)        (+25% rate)

A smaller tracking area increases the risk of losing the sensor (i.e. it lowers the maximum supported velocity).

The log2 relationship makes it hard to gain speed through the use of smaller patterns.
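A sketch of the log2 relationship behind the table (assuming one pattern per Gray-code bit per axis):

```python
# Why shrinking the tracking pattern buys so little speed (arithmetic only).
import math

def interleaved_rate(grid_side, refresh_hz=60):
    patterns_per_axis = math.ceil(math.log2(grid_side))
    return refresh_hz / patterns_per_axis      # one per-axis update per pattern set

interleaved_rate(32)   # 12.0 Hz (5 patterns per axis)
interleaved_rate(16)   # 15.0 Hz (4 patterns per axis): -75% area, only +25% rate
```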

Tracking Pattern Size

[Diagram: pixel density on the surface decreases with distance from the projector; a large, coarse tracking pattern is used up close and a small, fine tracking pattern farther away.]

Preserves the physical size of the tracking pattern (cm).

Preserves the maximum supported velocity (m/s).

Distance is approximated from the on-screen size of the surface.

The scaling factor is adjustable (precision vs. max velocity): ~2.5mm; 25cm/s.
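A minimal sketch of the idea, with the reference sizes chosen arbitrarily (the talk's own numbers are only the ~2.5mm precision and 25cm/s maximum velocity above):

```python
# Distance-adaptive tracking pattern sizing (illustrative only). The apparent
# size of the display surface in projector pixels stands in for distance, and
# the tracking pattern is scaled so its physical size, and therefore the
# supported hand velocity, stays roughly constant.
REFERENCE_SURFACE_PX = 400    # assumed on-screen width at a reference distance
REFERENCE_PATTERN_PX = 64     # assumed pattern side length at that distance

def tracking_pattern_side(surface_width_px: float, scale: float = 1.0) -> int:
    """Pattern side in projector pixels, proportional to the surface's
    apparent width (fewer pixels per cm when the surface is farther away)."""
    side = REFERENCE_PATTERN_PX * (surface_width_px / REFERENCE_SURFACE_PX) * scale
    return int(round(side))   # in practice, clamp so the grid stays decodable
```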

Motion Modeling

[Diagram: predicted sensor position extrapolated from velocity v and acceleration a]

Predicting the motion can be used to increase the range of supported movement (e.g. bounding maximum acceleration rather than maximum velocity).

Much of the existing work on motion modeling is applicable, but no model is perfect, and mis-predictions can lead to tracking loss, potentially yielding poorer overall performance.

Models are likely to be application and implementation specific.
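As one possible model, a constant-velocity predictor sketch for centering the next tracking pattern (the talk itself stops short of prescribing a model):

```python
# Constant-velocity prediction of the next sensor position (illustrative only).
def predict_next_position(prev_pos, curr_pos, dt_prev, dt_next):
    """Extrapolate from the last two measurements; with interleaved tracking
    at 12Hz, dt_prev and dt_next are roughly 1/12 s."""
    vx = (curr_pos[0] - prev_pos[0]) / dt_prev
    vy = (curr_pos[1] - prev_pos[1]) / dt_prev
    return (curr_pos[0] + vx * dt_next, curr_pos[1] + vy * dt_next)
```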

Tracking Pattern Shape

We used square tracking patterns due to the axis-aligned nature of Gray code patterns.

Patterns with high radial symmetry are best for general movement in two dimensions.

Pattern geometry can be optimized for specific applications.

Talk Outline

• Reducing Perceptibility

• Achieving Interactive Rates

• Pattern Size and Shape

• Tracking Loss

• Interaction Techniques/Demos

Detecting Occlusions or Tracking Loss

Causes of tracking loss:

1. occlusions

2. exiting the projection area

3. exceeding the range of motion supported by the tracking patterns

With FSK transmission, tracking loss corresponds to a disappearance of the carrier signal. This allows error detection on a per-bit basis.

Implemented on a low-cost PIC processor as:

1. sudden drop in signal amplitude

2. insufficient amplitude

3. invalid edge count
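A sketch of those three checks on one demodulated bit (the thresholds and expected edge counts are assumptions; the real implementation runs on the PIC):

```python
# Per-bit error detection on the sensor (illustrative only).
import numpy as np

MIN_AMPLITUDE = 0.05          # assumed: below this, no carrier is present
MAX_FRACTIONAL_DROP = 0.5     # assumed: a sudden drop of more than half flags loss
VALID_EDGE_COUNTS = (6, 12)   # ~180Hz and ~360Hz carriers within one 1/60 s frame

def bit_is_valid(samples: np.ndarray, prev_amplitude=None) -> bool:
    amplitude = float(samples.max() - samples.min())
    if amplitude < MIN_AMPLITUDE:                                   # insufficient amplitude
        return False
    if prev_amplitude and amplitude < prev_amplitude * MAX_FRACTIONAL_DROP:
        return False                                                # sudden drop in amplitude
    binary = (samples > samples.mean()).astype(np.int8)
    edges = np.count_nonzero(np.diff(binary))
    return any(abs(edges - e) <= 1 for e in VALID_EDGE_COUNTS)      # edge count check
```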

Lost Tracking Behavior

Single/independent sensors:
1. Discard and hope the sensor has not moved.
2. Perform a full-screen discovery process (+333ms).
3. Grow the tracking pattern around the last known location until reacquired.

Multiple sensors with a known geometric relationship:
1. Try the above three techniques.
2. Compute predicted lost-sensor locations using the locations of the remaining available sensors.

Tracking Loss With Multiple Sensors

video clip 4

Estimating Lost Sensors

Available Sensors – Action
4/4 – No estimation needed; compute the 4-point warping homography directly.
3/4 – Measure 6 offsets; compute an affine transform to estimate the lost sensor from its last known location.
2/4 – Measure 4 offsets; compute a simplified transform to estimate the lost sensor locations.
1/4 – Measure 2 offsets; compute a 2D translation.
0/4 – Try full-screen discovery.

Note: Transformations for each point cannot be implemented as a simple matrix stack because LIFO ordering of sensor loss and re-acquisition is not guaranteed.
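A sketch of that logic (illustrative only; a plain least-squares fit stands in for the affine and simplified transforms, and the 4/4 homography case needs no estimation):

```python
# Estimating lost sensor locations from the remaining sensors (illustrative).
import numpy as np

def estimate_lost_sensors(last_known, current):
    """last_known: 4x2 array of last known sensor positions.
       current: list of four (x, y) tuples, with None for lost sensors."""
    avail = [i for i, p in enumerate(current) if p is not None]
    if not avail:
        return None                         # 0/4: fall back to full-screen discovery
    if len(avail) == 4:
        return list(current)                # 4/4: compute the warping homography directly
    if len(avail) == 1:                     # 1/4: two offsets give a 2D translation
        d = np.subtract(current[avail[0]], last_known[avail[0]])
        est = last_known + d
    else:                                   # 3/4: exact affine; 2/4: degenerates to a
        src = np.hstack([last_known[avail], np.ones((len(avail), 1))])   # simplified fit
        dst = np.array([current[i] for i in avail])
        A, *_ = np.linalg.lstsq(src, dst, rcond=None)   # 3x2 affine matrix
        est = np.hstack([last_known, np.ones((4, 1))]) @ A
    return [current[i] if current[i] is not None else tuple(est[i]) for i in range(4)]
```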

Talk Outline

• Reducing Perceptibility

• Achieving Interactive Rates

• Pattern Size and Shape

• Tracking Loss

• Interaction Techniques/Demos

Supported Interaction Techniques

Video clip 5

Supported Interaction Techniques

• Magic Lens

• Focus + Context

• Location Aware Displays

• Input Pucks

• Simulated Tablet PC

Conclusion

Unifying the tracking and projection technology greatly simplifies the implementation and execution of applications that combine motion tracking with projected imagery.
– Coherence between the location data and the projected image is free.
– Does not require an external tracking system or calibration.
– Simple: demos were created in about a week.

This approach has the potential to change the economics of interactive displays.
– The marginal cost of each display can be as low as $10 USD.
– Museum: wireless displays could be handed out to visitors.
– Medical clinic: physical organization of patient charts/folders.

Future Work

• Removing the color wheel was a proof-of-concept workaround; construct a high-speed projector using a DLP development kit.

• Explore using infrared to project invisible patterns.

• Explore other applications where low-speed positioning is sufficient.

• Achieve +18Hz (+36Hz interleaved) tracking with visible patterns and an unmodified DLP projector using RGB sections.

• Use multiple projectors (or steerable projectors) to increase freedom of movement.

Acknowledgements

Johnny Chung Lee

[email protected]

Funded in part by the National Science Foundation under grants IIS-0121560 and IIS-0325351

Funded in part by Mitsubishi Electric Research Labs

Technical Details

• Infocus X1 ($800), 800x600, 60Hz
• PIC16F819 at 20MHz, 10-bit ADC
• Sensor package < $10 in volume
• 4-wire resistive touch-sensitive film
• IF-D92 fiber optic phototransistors
• 45 bytes/sec for location data
• 25mW during active tracking
• Latency: 77ms – 185ms