
Page 1: EU funded FP7: Oct 11 – Sep 14

EU funded FP7: Oct 11 – Sep 14

Co-evolution of Future AR Mobile Platforms

Paul Chippendale, Bruno Kessler Foundation FBK, Italy

Page 2: EU funded FP7: Oct 11 – Sep 14

Move away from the Augmented Keyhole

Page 3: EU funded FP7: Oct 11 – Sep 14

User centric, not device centric

HMDs lock displays to the viewer

But what about handheld displays?

Page 4: EU funded FP7: Oct 11 – Sep 14

Device-World registration

What is the device’s real-world location? Which direction is it pointing?

Page 5: EU funded FP7: Oct 11 – Sep 14

Device-World registration

What is the device’s real-world location?

GPS, Cell/WiFi tower triangulation (~10m)

Page 6: EU funded FP7: Oct 11 – Sep 14

Device-World registration

Which direction is it pointing?

Magnetometer, gyros, accelerometers (~5-20°)

MEMS production variability

Sensors age

Soft/hard iron influences vary across devices, environments and camera pose
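
As a rough illustration of how orientation is typically derived from these sensors (not taken from the slides), the sketch below computes a tilt-compensated compass heading from one accelerometer and one magnetometer sample; the variable names and example values are hypothetical.

```python
import numpy as np

def tilt_compensated_heading(accel, mag):
    """Estimate heading (degrees from magnetic north) from a single
    accelerometer and magnetometer sample, both 3-vectors in the device
    frame. Real pipelines also calibrate soft/hard iron effects and
    filter over time, which is where much of the 5-20 degree error lives."""
    a = np.asarray(accel, dtype=float)
    m = np.asarray(mag, dtype=float)

    # Roll and pitch from the gravity direction
    roll = np.arctan2(a[1], a[2])
    pitch = np.arctan2(-a[0], np.hypot(a[1], a[2]))

    # Project the magnetic vector onto the horizontal plane
    mx = m[0] * np.cos(pitch) + m[2] * np.sin(pitch)
    my = (m[0] * np.sin(roll) * np.sin(pitch)
          + m[1] * np.cos(roll)
          - m[2] * np.sin(roll) * np.cos(pitch))

    return np.degrees(np.arctan2(-my, mx)) % 360.0

# Hypothetical sample: device lying flat, magnetic field pointing roughly north
print(tilt_compensated_heading([0.0, 0.0, 9.81], [22.0, 0.0, -40.0]))  # ~0 deg
```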

Page 7: EU funded FP7: Oct 11 – Sep 14

Are ±10 m and ±20° sufficient for nailed-down AR?

Page 8: EU funded FP7: Oct 11 – Sep 14

But what about hand-held AR?

The device becomes an augmented window

Page 9: EU funded FP7: Oct 11 – Sep 14

User-Device-World registration

What is the device’s real-world location? Which direction is it pointing?

Where is the user with respect to the screen?
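
A minimal sketch of what user-device-world registration means in practice: composing the device's world pose with the user's head position measured by the front camera. The matrices and numbers below are hypothetical, purely to illustrate the composition.

```python
import numpy as np

def head_in_world(T_world_to_device, head_in_device):
    """Combine the two registrations: T_world_to_device is the device's
    4x4 world pose (world -> device), head_in_device is the user's head
    position in device coordinates (e.g. from the front camera).
    Returns the head position in world coordinates."""
    T_device_to_world = np.linalg.inv(T_world_to_device)
    head_h = np.append(head_in_device, 1.0)          # homogeneous point
    return (T_device_to_world @ head_h)[:3]

# Hypothetical numbers: device 10 m east of the world origin, no rotation,
# user's head 0.4 m behind the screen along the device z-axis.
T = np.eye(4)
T[:3, 3] = [-10.0, 0.0, 0.0]                          # world -> device
print(head_in_world(T, np.array([0.0, 0.0, 0.4])))    # -> [10.  0.  0.4]
```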

Page 10: EU funded FP7: Oct 11 – Sep 14

Just wait for better AR devices!

Surely if we wait, sensor errors will disappear? Unlikely!

Sensor errors are tolerable for non-AR applications; handset manufacturers focus on price, power and form-factor

Can’t we just model the error in software? Not really!

Platform diversity and swift evolution make error modelling expensive and quickly obsolete

Page 11: EU funded FP7: Oct 11 – Sep 14

So what can we do?

The AR community should work with handset manufacturers and make recommendations

Use computer vision to work with sensors

Page 12: EU funded FP7: Oct 11 – Sep 14

The VENTURI project...

Match AR requirements to the platform

Efficiently exploit CPUs & GPUs

Improve sensor-camera fusion by creating a common clock (traditionally only audio/video are considered)

Apply smart power management policies

Optimize the AR chain by exploiting both on-board and cloud processing/storage
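
As an illustration of the common-clock point, a minimal sketch (hypothetical timestamp arrays, not VENTURI code) that pairs each camera frame with the nearest inertial sample once both are expressed on one clock:

```python
import numpy as np

def pair_frames_with_imu(frame_ts, imu_ts, clock_offset=0.0):
    """For each camera frame timestamp, return the index of the closest
    inertial sample after moving the frames onto the IMU clock.
    frame_ts, imu_ts: 1-D timestamp arrays in seconds; clock_offset is the
    known (or estimated) offset between the two clocks."""
    frame_ts = np.asarray(frame_ts, dtype=float) + clock_offset
    imu_ts = np.asarray(imu_ts, dtype=float)
    idx = np.clip(np.searchsorted(imu_ts, frame_ts), 1, len(imu_ts) - 1)
    # choose whichever neighbouring sample is closer in time
    left_closer = (frame_ts - imu_ts[idx - 1]) < (imu_ts[idx] - frame_ts)
    return np.where(left_closer, idx - 1, idx)

frames = [0.000, 0.033, 0.066]            # ~30 fps camera timestamps
imu = np.arange(0.0, 0.1, 0.005)          # 200 Hz inertial samples
print(pair_frames_with_imu(frames, imu))  # -> [ 0  7 13]
```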

Page 13: EU funded FP7: Oct 11 – Sep 14

Seeing the world

Improve device-world pose by:

Matching visual features to 3D models of the world

Matching the camera feed to the visual appearance of the world

Fusing camera and sensors for ambiguity reasoning and tracking

Use the front-facing camera to estimate user-device pose via face tracking
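
The last point can be sketched with a standard perspective-n-point solve on a handful of facial landmarks; the tiny 3D face model below is a rough generic stand-in (not the project's model), and any face tracker could supply the 2D points.

```python
import numpy as np
import cv2

# Rough, generic 3D landmark positions (metres) in a head-centred frame:
# nose tip, chin, left/right eye outer corners, left/right mouth corners.
MODEL_POINTS = np.array([
    [ 0.000,  0.000,  0.000],
    [ 0.000, -0.063, -0.012],
    [-0.043,  0.032, -0.026],
    [ 0.043,  0.032, -0.026],
    [-0.029, -0.029, -0.024],
    [ 0.029, -0.029, -0.024],
])

def user_device_pose(image_points, fx, fy, cx, cy):
    """image_points: 6x2 pixel positions of the same landmarks detected in
    the front-camera frame by any face tracker. Returns the head rotation
    and translation in the camera frame; tvec is effectively the user's
    position with respect to the screen."""
    K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=float)
    ok, rvec, tvec = cv2.solvePnP(
        MODEL_POINTS, np.asarray(image_points, dtype=float), K, None,
        flags=cv2.SOLVEPNP_ITERATIVE)
    return ok, rvec, tvec
```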

Page 14: EU funded FP7: Oct 11 – Sep 14

Urban 3D Model matching

Use high-resolution building models (e.g. laser scanned), globally registered to a geo-referenced coordinate system

Use 3D marker-less tracking to correlate distinctive features to the 3D building models; subsequent tracking uses inertial sensors and visual optical flow
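
A minimal sketch of the "subsequent tracking" step: frame-to-frame feature tracking with pyramidal Lucas-Kanade optical flow, a standard OpenCV routine (frames and points here are placeholders).

```python
import cv2

def track_features(prev_gray, next_gray, prev_pts):
    """Track 2D feature points from one grayscale frame to the next with
    pyramidal Lucas-Kanade optical flow. prev_pts is an Nx1x2 float32
    array (e.g. from cv2.goodFeaturesToTrack). Returns surviving pairs."""
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1
    return prev_pts[good], next_pts[good]

# Typical usage on consecutive camera frames (placeholders):
# prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
#                                    qualityLevel=0.01, minDistance=7)
# old_pts, new_pts = track_features(prev_gray, next_gray, prev_pts)
```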

Page 15: EU funded FP7: Oct 11 – Sep 14

Terrain 3D Model matching

A synthetic model of the world is rendered from Digital Elevation Models; salient features from the camera feed (depth discontinuities) are matched to similar synthetic features.
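
A toy version of that idea, assuming the skyline has already been reduced to a 1D horizon profile for both the camera view and the DEM-rendered synthetic view (the profiles below are synthetic test data):

```python
import numpy as np

def best_azimuth_shift(camera_profile, synthetic_profile):
    """Slide the camera horizon profile along the DEM-rendered synthetic
    profile and return the shift (in samples) with the highest normalized
    correlation. A toy stand-in for silhouette matching; real systems use
    richer depth-discontinuity features and sub-degree refinement."""
    cam = np.asarray(camera_profile, dtype=float)
    syn = np.asarray(synthetic_profile, dtype=float)
    n = len(cam)
    scores = []
    for shift in range(len(syn) - n + 1):
        win = syn[shift:shift + n]
        scores.append(np.dot(cam - cam.mean(), win - win.mean())
                      / (np.std(cam) * np.std(win) * n + 1e-9))
    return int(np.argmax(scores))

# Synthetic test: a 60-sample camera skyline cut from the rendered profile
rng = np.random.default_rng(0)
syn = np.sin(np.linspace(0, 6 * np.pi, 360)) + rng.normal(0, 0.05, 360)
print(best_azimuth_shift(syn[40:100], syn))   # -> 40
```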

Page 16: EU funded FP7: Oct 11 – Sep 14

Appearance matching

Use the approximate location to gather nearby images from the cloud

Exploit sensor data to provide a clue for orientation alignment

Computer vision algorithms match feature descriptors from the camera feed to similar features in the cloud images
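
As a rough sketch of that last step, matching ORB descriptors from the live frame against one geo-tagged image retrieved from the cloud (the file names in the usage note are placeholders):

```python
import cv2

def match_against_reference(frame_gray, reference_gray, max_matches=50):
    """Match ORB descriptors between the live camera frame and one nearby
    geo-referenced image pulled from the cloud. Returns the strongest
    matches, which a later stage would feed into pose refinement."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(frame_gray, None)
    kp2, des2 = orb.detectAndCompute(reference_gray, None)
    if des1 is None or des2 is None:
        return []
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:max_matches]

# Usage with placeholder file names:
# frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
# ref   = cv2.imread("cloud_image.png", cv2.IMREAD_GRAYSCALE)
# good  = match_against_reference(frame, ref)
```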

Page 17: EU funded FP7: Oct 11 – Sep 14

SLAM + Matching

Simultaneous Localization And Mapping: build a map of an unknown environment while at the same time navigating that environment using the map.

The mapped environment has no real-world scale nor absolute geo-coordinates; exploit the prior approaches to complete the registration.
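
Completing that registration amounts to fitting a similarity transform (scale, rotation, translation) between a few SLAM landmarks and their known geo-referenced positions; a minimal Umeyama-style sketch with made-up point sets:

```python
import numpy as np

def similarity_align(slam_pts, world_pts):
    """Closed-form least-squares similarity transform (s, R, t) such that
    world ~ s * R @ slam + t, from N >= 3 corresponding points (Nx3 each).
    Umeyama-style; outlier handling (e.g. RANSAC) is omitted here."""
    X = np.asarray(slam_pts, dtype=float)
    Y = np.asarray(world_pts, dtype=float)
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mx, Y - my
    U, S, Vt = np.linalg.svd(Yc.T @ Xc / len(X))
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:          # enforce a proper rotation
        D[2, 2] = -1.0
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / Xc.var(axis=0).sum()
    t = my - s * R @ mx
    return s, R, t

# Toy check: a known scale and offset are recovered
slam = np.random.default_rng(1).random((5, 3))
world = 7.5 * slam + np.array([100.0, 200.0, 5.0])
s, R, t = similarity_align(slam, world)
print(round(s, 3), np.round(t, 3))         # -> 7.5 [100. 200.   5.]
```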

Page 18: EU funded FP7: Oct 11 – Sep 14

Mobile context understanding

User/environment context estimation:

PDR (pedestrian dead reckoning) enriched with vision

User activity modelling

Sensing geo-objects

Harvest/create geo-social content
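
For the PDR item, the core idea is to advance the position by one stride in the current heading whenever a step is detected; a minimal sketch with a fixed stride length and hypothetical step headings (vision would normally correct the accumulated drift):

```python
import math

def dead_reckon(start_xy, step_headings_deg, stride_m=0.7):
    """Advance a 2-D position by one stride per detected step.
    step_headings_deg: one heading (degrees clockwise from north) per step.
    Vision and map constraints would normally correct the drift this
    accumulates over time."""
    x, y = start_xy
    track = [(x, y)]
    for h in step_headings_deg:
        x += stride_m * math.sin(math.radians(h))   # east component
        y += stride_m * math.cos(math.radians(h))   # north component
        track.append((x, y))
    return track

# Ten steps due east, then five due north
print(dead_reckon((0.0, 0.0), [90.0] * 10 + [0.0] * 5)[-1])  # ~(7.0, 3.5)
```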

Page 19: EU funded FP7: Oct 11 – Sep 14

Context-sensitive AR delivery

Inject AR data in a natural manner according to:

environment

occlusions

lighting and shadows

user activity

Exploit user and environment ‘context’ to select the best delivery modality (text, graphics, audio, etc.), i.e. scalable/simplifiable audio-visual content
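
A toy illustration of context-driven modality selection; the context keys and thresholds below are invented for the example, not part of VENTURI:

```python
def choose_modality(context):
    """Pick a delivery modality from a small context record. The keys
    ('activity', 'lighting' in lux, 'noise_db') and the thresholds are
    invented for this toy rule set."""
    if context.get("activity") == "driving":
        return "audio"                        # eyes are busy: no visual overlay
    if context.get("lighting", 1000) < 50:    # low light: favour non-visual delivery
        return "audio" if context.get("noise_db", 0) < 70 else "text"
    if context.get("activity") == "walking":
        return "simplified graphics"          # reduce visual load on the move
    return "full 3D graphics"

print(choose_modality({"activity": "walking", "lighting": 20000, "noise_db": 55}))
```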

Page 20: EU funded FP7: Oct 11 – Sep 14

User Interactions

Explore evolving AR delivery and interaction

In-air interfaces: device, hand and face tracking

3D audio

Pico-projection for multi-user, social AR

HMDs

Page 21: EU funded FP7: Oct 11 – Sep 14

Prototypes

One consolidated prototype at the end of each year, to be evaluated through use cases

Gaming - VeDi 1.0

Blind assistant - VeDi 2.0

Tourism - VeDi 3.0

Constraints relaxed

Page 22: EU funded FP7: Oct 11 – Sep 14

VeDi 1.0

Objective: Stimulate software and hardware cross-partner integration and showcase state-of-the-art indoor AR registration.

Scenario: A multi-player, table-top AR game resembling a miniature city. Players must accomplish a set of AR missions in the city that adhere to physical constraints.

Software: Sensor-aided marker-less 3D feature tracking. The city is geometrically reconstructed offline for correct occlusion handling and model registration.

Hardware: The demo runs on an experimental ST-Ericsson prototype mobile platform.

Page 23: EU funded FP7: Oct 11 – Sep 14

FP7-ICT-2011-1.5 Networked Media and Search Systems

End-to-end Immersive and Interactive Media Technologies

“creating a pervasive Augmented Reality paradigm, where information is presented in a ‘user’ rather than a ‘device’ centric way”

Co-ordinated by Paul Chippendale, Fondazione Bruno Kessler

https://venturi.fbk.eu