
3D Interaction using Hand Motion Tracking

Srinath Sridhar

Antti Oulasvirta

EIT ICT Labs Smart Spaces Summer School

05-June-2013

Speakers

Srinath Sridhar, PhD Student

Supervised by Prof. Dr. Christian Theobalt and Dr. Antti Oulasvirta

Max Planck Institut für Informatik, Saarbrücken, Germany

www.mpi-inf.mpg.de/~ssridhar/

Antti Oulasvirta, Senior Researcher

Max Planck Institut für Informatik, Saarbrücken, Germany

www.mpi-inf.mpg.de/~oantti/


Overview of Today’s Session

• We will have four parts.

– Part I: You are here!

– Part II: Introduction to 3D Interaction using Hand Motion Tracking

– Part III: Introduction to the Leap Motion Sensor and SDK

– Part IV: Hands-on Exercises

• Please feel free to interrupt with questions anytime…

Requirements for Today’s Session

• Requirements

– WiFi-enabled laptop

– Laptop with a WebSocket-compatible web browser (Firefox 6+, Chrome 14+, IE 10)

– Text editor and basic Java/C++ skills

– Google Earth for Windows or Mac

– Cool 3D interaction ideas

• Audience poll

– Requirements

– Teams

Objectives

• Gain the ability to understand and create 3D interactive interfaces using hand motion tracking

– Computer vision techniques for hand motion tracking and their relative performance

– Different sensing devices with emphasis on the Leap Motion sensor and the Leap SDK

– Implement a simple 3D interaction interface for Google Earth

Part II

INTRODUCTION TO 3D INTERACTION USING HAND MOTION TRACKING

Motivation


Motivation – The Human Hand

• Joints – 26 degrees of freedom

• Muscles – fine motor control

• Brain – grasping and gestures

Motivation – Potential HCI Applications

A “Tony Stark”/“Tom Cruise”-esque interface of the future…

Example application areas: retargeting, sign language recognition, 2D/3D UI interaction, musical instruments.

Components of 3D Interaction using Hand Tracking

[Diagram] Human and Computer are connected by two components: Articulated Hand Motion Tracking (output: set of points, skeleton, etc.; Computer Vision, Part II) and a 3D Interaction Interface (3D desktop, Google Earth, etc.; Interaction Design, Parts III & IV).

Requirements for Hand Tracking in HCI

• Interactive: real-time performance and minimal latency

• Markerless: no gloves or markers

• DoF: captures many degrees of freedom or a full hand skeleton

• Occlusions: robust to partial self-occlusions

• Environment: works with general backgrounds and illumination

Leap Motion

• Tracks semantically meaningful parts of the hand (fingertips, palm), each with 6 DoF

• Very high accuracy and low latency

• Internally uses a depth sensor

• No skeleton tracking

Efficient model-based 3D tracking of hand articulations using Kinect (Oikonomidis et al., ICCV 2011, CVPR 2012)

• Captures 26 DoF of the hand using a model composed of geometric primitives

• Performance: 15 Hz; latency due to the Kinect

• Limited to the range of the Kinect

• Skin colour-based segmentation of depth data

6D Hands: Markerless Hand-Tracking for Computer Aided Design (Wang et al., UIST 2011)

• Captures 27 DoF of the hand using a skeleton hand model

• Performance: 17 Hz

• Skin colour-based segmentation of depth data

• Used as a control interface for 3D CAD modelling

Hybrid Hand Tracking using RGB and Depth Data (MPI Informatik)

• Captures 26 DoF of the hand using a kinematic skeleton model

• Performance: 17 Hz; 30-60 ms latency

• Uses colour information from RGB cameras and depth data

• Multi-view camera setup with 5 RGB cameras and 1 depth camera

• Interface for musical expression

Hand Tracking – Approach

[Pipeline diagram with stages: multi-view image sequence, depth data, normalization, feature extraction, voting, database of hand poses, final pose]

Comparison of Hand Motion Tracking Systems

System      | Interactive          | No. of DoF               | Accuracy  | Technology       | Number of Views | HCI Application
Leap Motion | 20 fps, low latency  | 0-36+ (no articulations) | High      | Depth            | 1               | Google Earth, 3D UI, etc.
ICS FORTH   | 15 fps, high latency | 26                       | 10 mm     | Depth + RGB      | 1               | Object interaction
Wang et al. | 15 fps               | 27                       | -         | RGB (also depth) | 2 (also 1)      | 3D CAD modelling
MPI         | 17 fps               | 26                       | 13 mm     | Depth + RGB      | 4-6             | Musical instrument
ETH Zurich  | 2 fpm                | 26+                      | ~10-15 mm | RGB              | 7-8             | Multiple hands
Intel       | 50 fps               | ~26                      | -         | Depth            | 1               | -

Part III

INTRODUCTION TO SENSING DEVICES AND THE LEAP MOTION SDK

What is the Leap Motion controller?

• A close-range depth sensor

– Range < 50 cm

– Similar to the Microsoft Kinect, SoftKinetic DepthSense, etc.

• Bundled API for tracking

– Fingertips

– Hands

– Tools (any pointy object)

• USB 2.0/3.0 input

• Available in June/July for $70

• Airspace app store
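For a first sanity check, the bundled API can be polled directly. Below is a minimal C++ sketch; it assumes the Leap SDK's Leap.h header and C++ bindings and is illustrative only, not part of the course material.

```cpp
// Minimal "is the controller alive?" check against the bundled Leap API.
// Assumes the Leap SDK C++ bindings (Leap.h); link against the Leap library.
#include <iostream>
#include "Leap.h"

int main() {
    Leap::Controller controller;              // connects to the background Leap service

    // Busy-wait until the device is connected (a real app would use a Listener).
    while (!controller.isConnected()) {}

    Leap::Frame frame = controller.frame();   // most recent tracking frame
    std::cout << "hands: "    << frame.hands().count()
              << "  fingers: " << frame.fingers().count()
              << "  tools: "   << frame.tools().count() << std::endl;
    return 0;
}
```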


How does it (most likely) work?

• Possibly time-of-flight with stereo

[Images: structured-light pattern (Kinect) vs. the Leap Motion device]

Functionality Exposed in API

• Hands

– Palm center and orientation

• Fingers

– Fingertip location

– Finger length (not exact)

– Finger pointing direction

• Tools (any pointy object)

– Tooltip location

– Tool length

– Tool pointing direction

https://developer.leapmotion.com/documentation/guide/Leap_Overview
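As a rough orientation, the quantities listed above map onto the Leap C++ API roughly as in the polling sketch below. It assumes the SDK's Leap.h header; the field accessors follow the SDK documentation linked above, but the surrounding program is illustrative only.

```cpp
// Sketch: reading the palm, finger and tool quantities listed above from one frame.
// Assumes the Leap SDK C++ bindings (Leap.h); no error handling or event loop.
#include <iostream>
#include "Leap.h"

static void print(const char* label, const Leap::Vector& v) {
    std::cout << label << " (" << v.x << ", " << v.y << ", " << v.z << ")";
}

int main() {
    Leap::Controller controller;
    Leap::Frame frame = controller.frame();          // latest tracking frame

    // Hands: palm centre and orientation (normal vector).
    for (int i = 0; i < frame.hands().count(); ++i) {
        Leap::Hand hand = frame.hands()[i];
        print("palm centre", hand.palmPosition());
        print(", palm normal", hand.palmNormal());
        std::cout << "\n";
    }

    // Fingers: tip location, (approximate) length, pointing direction.
    for (int i = 0; i < frame.fingers().count(); ++i) {
        Leap::Finger finger = frame.fingers()[i];
        print("fingertip", finger.tipPosition());
        std::cout << ", length " << finger.length() << " mm, ";
        print("direction", finger.direction());
        std::cout << "\n";
    }

    // Tools (any pointy object): tip location, length, pointing direction.
    for (int i = 0; i < frame.tools().count(); ++i) {
        Leap::Tool tool = frame.tools()[i];
        print("tool tip", tool.tipPosition());
        std::cout << ", length " << tool.length() << " mm\n";
    }
    return 0;
}
```

All positions are reported in millimetres in the device's own coordinate frame, so any interaction mapping still has to choose its own scaling and reference point.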


Pros and Cons of the Leap Motion

Pros:

• Jitter-free point tracking

• High frame rate

• Low latency

• Fairly large FOV

Cons:

• No skeleton tracking

– Tracked points have no semantics

• No access to raw data

– Depth data

– RGB data (if available)

• Single viewpoint

Other Depth Sensors


BREAK?

Part IV

HANDS-ON EXERCISES

Information

• Connect to WiFi

– SSID: minerva

– Password: 3dinteraction

• Please install Google Earth if you have not.

• Google Earth API – basics are enough

• Visit: 192.168.1.100:8080

• You should see this: [screenshot]

Exercises Overview

• Implement panning and zooming using one of the following (a rough sketch follows below).

– Data structure from the Leap Motion SDK

– 3D position data from the Intel Depth Tracker

• Implement “flying” at the terrain level using one of the following.

– Data structure from the Leap Motion SDK (Hint: think about the palm)

– 6D position data from the Intel Depth Tracker

• Bonus

– Panning with clutching
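As a starting point for the Leap-based variant, here is a hedged C++ sketch of the mapping logic only: palm translation drives panning, the frame-to-frame scale factor drives zooming, and the number of visible fingers acts as a simple clutch. The Camera struct, the gain constants and the clutch heuristic are assumptions made for illustration; forwarding the resulting values to Google Earth (e.g. through the page served in this session) is not shown.

```cpp
// Sketch of hand-motion-to-camera mapping for the panning/zooming exercise.
// Assumes the Leap SDK C++ bindings (Leap.h). The Camera struct, gains and the
// clutch heuristic are illustrative assumptions, not part of the Leap API or
// of the course material; output is printed instead of sent to Google Earth.
#include <algorithm>
#include <chrono>
#include <iostream>
#include <thread>
#include "Leap.h"

struct Camera {                      // hypothetical Google Earth-style camera state
    double lon = 0.0;                // degrees
    double lat = 0.0;                // degrees
    double alt = 5000.0;             // metres
};

int main() {
    Leap::Controller controller;
    Camera cam;
    Leap::Frame previous;            // reference frame for relative motion

    const double kPanGain  = 0.001;  // assumed degrees per millimetre of palm motion
    const double kZoomGain = 1.0;    // assumed weighting of the Leap scale factor

    while (true) {
        Leap::Frame frame = controller.frame();

        // Clutch heuristic (assumption): engage only while >= 3 fingers are visible;
        // closing the fist hides the fingers and releases the clutch.
        bool clutch = frame.isValid() && frame.fingers().count() >= 3;

        if (clutch && previous.isValid()) {
            // Panning: hand translation since the previous frame, in millimetres
            // (Leap axes: x right, y up, z toward the user).
            Leap::Vector t = frame.translation(previous);
            cam.lon += kPanGain * t.x;
            cam.lat += kPanGain * t.z;

            // Zooming: scale factor > 1 when the hand(s) spread or approach the sensor.
            double s = frame.scaleFactor(previous);
            cam.alt /= std::max(0.1, 1.0 + kZoomGain * (s - 1.0));

            // "Flying" hint: frame.hands()[0].palmNormal() and .direction() could
            // drive tilt and heading in the same incremental way (not shown).
        }

        previous = frame;
        std::cout << "lon " << cam.lon << "  lat " << cam.lat
                  << "  alt " << cam.alt << "\r" << std::flush;
        std::this_thread::sleep_for(std::chrono::milliseconds(15));
    }
}
```

Using relative motion (translation and scale since the previous frame) rather than absolute palm position keeps the mapping well behaved when tracking drops out, and it makes the clutching bonus almost free: motion is simply not accumulated while the clutch is released.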



WRAP-UP


Conclusion

• Feedback

• Contact

– Srinath: [email protected]

– Antti: [email protected]
