2013 Lecture 8: Mobile AR


DESCRIPTION

A lecture on Mobile Augmented Reality. A lecture given by Mark Billinghurst at the University of Canterbury on Friday September 13th 2013. This is part of the COSC 426 graduate course on Augmented Reality.

TRANSCRIPT

COSC 426: Augmented Reality

Mark Billinghurst

mark.billinghurst@hitlabnz.org

Sept 13th 2013

Lecture 8: Mobile AR

1983 – Star Wars

1999 - HIT Lab US

1998: SGI O2 – CPU: 300 MHz, HDD: 9 GB, RAM: 512 MB, Camera: VGA 30 fps, Graphics: 500K poly/sec

2008: Nokia N95 – CPU: 332 MHz, HDD: 8 GB, RAM: 128 MB, Camera: VGA 30 fps, Graphics: 2M poly/sec

Mobile Phone AR

Mobile Phones
- camera
- processor
- display

AR on Mobile Phones
- Simple graphics
- Optimized computer vision
- Collaborative interaction

2005: Collaborative AR

AR Tennis
- Shared AR content
- Two user game
- Audio + haptic feedback
- Bluetooth networking

Mobile AR History

Evolution of Mobile AR

[Diagram: evolution from wearable computers to camera phones]
- 1995: Handheld AR displays (tethered)
- 1997: Wearable computers – wearable AR
- 2001: PDAs – thin client AR
- 2003: PDAs – self contained AR; camera phone – thin client AR
- 2004: Camera phone – self contained AR

Handheld Displays – Tethered Applications
- Fitzmaurice's Chameleon (1994)
- Rekimoto's Transvision (1995)
- Tethered LCD
- PC processing and tracking

Handheld AR Display - Tethered

1995, 1996 Handheld AR
- AR Pad, Chameleon
- Rekimoto's NaviCam, Transvision
- Tethered LCD
- PC processing and tracking

AR Pad (Mogilev 2002) – Handheld AR Display
- LCD screen
- Camera
- SpaceOrb 3 DOF controller
- Peripheral awareness
- Viewpoint awareness

Mobile AR: Touring Machine (1997)
- Columbia University: Feiner, MacIntyre, Höllerer, Webster
- Combines:
  - See-through head mounted display
  - GPS tracking
  - Orientation sensor
  - Backpack PC (custom)
  - Tablet input

MARS View

- Virtual tags overlaid on the real world
- "Information in place"

Backpack/Wearable AR

1997 Backpack AR
- Feiner's Touring Machine
- AR Quake (Thomas)
- Tinmith (Piekarski)
- MCAR (Reitmayr)
- Bulky, HMD based

Columbia Touring Machine
[Diagram: backpack hardware – CPU, hard drive, serial ports, PCI 3D graphics board, PC104 sound card, PC104 PCMCIA, GPS antenna, GPS RTK correction radio, tracker controller, DC-to-DC converter, battery. An example self-built working solution with PCI-based 3D graphics.]

Mobile AR - Hardware

- 1997: Philippe Kahn invents the camera phone
- 1999: First commercial camera phone

Sharp J-SH04

Millions of Camera Phones

Handheld AR – Thin Client

2001 BatPortal (AT&T Cambridge)
- PDA used as I/O device
- Wireless connection to workstation
- Room-scale ultrasonic tracking (Bat)

2001 AR-PDA (C-LAB)
- PDA as thin graphics client
- Remote image processing
- www.ar-pda.com

2003 ARphone (Univ. of Sydney)
- Transfer images via Bluetooth (slow – 30 sec/image)
- Remote processing – AR server


Mobile Phone AR – Thin Client

Early Phone Computer Vision Apps

2003 – Mozzies
- Best mobile game
- Optical motion flow detecting phone orientation
- Siemens SX1 – Symbian, 120 MHz, VGA camera

2005 – Marble Revolution (Bit-Side GmbH)
- Winner of Nokia's Series 60 Challenge 2005

2005 – SymBall (VTT)

Computer Vision on Mobile Phones
- Cameras and phone CPUs are sufficient for computer vision applications
- Pattern recognition (static processing)
  - QR Code
  - Shotcode (http://www.shotcode.com/)
- Motion flow (2D image processing) – see the sketch after this list
  - GestureTek (http://www.gesturetekmobile.com/)
  - TinyMotion
- 3D pose calculation
  - Augmented Reality
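A minimal sketch of the motion-flow idea referenced above (the kind of 2D block correlation used for TinyMotion-style input, not code from any of the products named): estimate the global shift between two camera frames by minimising the sum of absolute differences over a small search window. Written in Java; all names and parameters are illustrative.

public final class MotionFlow {
    // Estimate the global (dx, dy) shift between two grayscale frames of size w x h.
    public static int[] estimateShift(byte[] prev, byte[] cur, int w, int h, int maxShift) {
        long bestCost = Long.MAX_VALUE;
        int bestDx = 0, bestDy = 0;
        for (int dy = -maxShift; dy <= maxShift; dy++) {
            for (int dx = -maxShift; dx <= maxShift; dx++) {
                long cost = 0;
                // Compare the overlapping region, subsampled for speed on a phone CPU.
                for (int y = maxShift; y < h - maxShift; y += 2) {
                    for (int x = maxShift; x < w - maxShift; x += 2) {
                        int a = prev[y * w + x] & 0xFF;
                        int b = cur[(y + dy) * w + (x + dx)] & 0xFF;
                        cost += Math.abs(a - b);
                    }
                }
                if (cost < bestCost) { bestCost = cost; bestDx = dx; bestDy = dy; }
            }
        }
        // The apparent image shift is roughly opposite to the phone's own motion.
        return new int[] { bestDx, bestDy };
    }
}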

Handheld AR – Self Contained

2003 PDA-based AR
- ARToolKit port to PDA
- Studierstube ported to PDA
- AR Kanji educational app
- Mr Virtuoso AR character
- Wagner's Invisible Train – collaborative AR

Mobile Phone AR – Self Contained

2004 Mobile Phone AR
- Moehring, Bimber
- Henrysson (ARToolKit port)
- Camera, processor and display in one device

AR Advertising

- Text message to download the AR application (200K)
- Virtual content pops out of a real paper advert
- Tested May 2007 by Saatchi & Saatchi

2008 - Location Aware Phones

Nokia Navigator, Motorola Droid

Real World Information Overlay
- Tag real world locations
- GPS + compass input
- Overlay graphics data on live video (a bearing-to-screen sketch follows below)
- Applications: travel guides, advertising, etc.
- E.g. Mobilizy Wikitude (www.mobilizy.com) – Android based, public API released
- Other companies: Layar, Acrossair, Tonchidot, RobotVision, etc.
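As a rough illustration of the GPS + compass overlay described above, the sketch below computes where a geo-tagged point of interest should appear horizontally on the live video: take the great-circle bearing from the user to the POI, subtract the compass heading, and map the result into the camera's horizontal field of view. A Java sketch with illustrative names; real browsers such as Wikitude or Layar add their own filtering and vertical placement.

public final class GeoOverlay {

    // Great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees clockwise from north.
    static double bearingTo(double lat1, double lon1, double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dLon = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dLon) * Math.cos(p2);
        double x = Math.cos(p1) * Math.sin(p2) - Math.sin(p1) * Math.cos(p2) * Math.cos(dLon);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    // Horizontal pixel position of the POI, or -1 if it is outside the camera's field of view.
    static int screenX(double bearingDeg, double headingDeg, double hFovDeg, int screenWidthPx) {
        double delta = ((bearingDeg - headingDeg + 540.0) % 360.0) - 180.0; // signed angle, -180..180
        if (Math.abs(delta) > hFovDeg / 2.0) return -1;
        return (int) ((delta / hFovDeg + 0.5) * screenWidthPx);
    }
}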

Layar – www.layar.com

HIT Lab NZ Android AR Platform
- Architectural application
- Loads 3D models in OBJ/MTL format
- Positions content in space using GPS and compass
- Intuitive user interface; toolkit to modify the model
- Connects to a back-end model database

Architecture
- Android application
- Database server (Postgres)
- Web interface for adding models
- Web application (Java and PHP server)

History of Handheld and Mobile AR
- 1995 Handheld display: NaviCam, AR-PAD, Transvision
- 1997 Wearable AR: Touring Machine, AR Quake
- 2001 Handheld AR – thin client: AR-PDA, BatPortal
- 2003 Handheld AR – self contained: Invisible Train
- 2003 Mobile phone – 2D vision: Mozzies, SymBall
- 2003 Mobile phone – thin client: ARphone
- 2004 Mobile phone – self contained: Moehring, Symbian

Mobile AR by Weight (1996 → 2003 → 2007)
- Backpack + HMD: 5–8 kg
- Scale it down – Vesp'R [Kruijff, ISMAR'07]: Sony UMPC 1.1 GHz, 1.5 kg, still >$5K
- Scale it down more – smartphone: all-in-one, 0.1 kg, ~$500, billions of units

2013 State of the Art

Handheld hardware available
- PDAs, mobile phones, external cameras
- Sensors: GPS, accelerometer, compass

Software tools available
- Tracking: ARToolKitPlus, StbTracker, Vuforia
- Graphics: OpenGL ES
- Authoring: Layar, Wikitude, Metaio Creator

What is needed
- High-level authoring tools
- Content development tools
- Novel interaction techniques
- User evaluation and usability

Mobile AR Companies

Mobile AR (GPS + compass)
- Many companies: Layar, Wikitude, Acrossair, PressLite, Yelp, Robot Vision, etc.

Mobile AR revenues: $2 million USD in 2010, forecast to reach $732 million USD in 2014

Qualcomm

- Acquired Imagination
- October 2010: releases free Android AR SDK
- Computer vision tracking – marker and markerless
- Integrated with Unity 3D renderer
- http://developer.qualcomm.com/ar

Rock-em Sock-em

- Shared AR demo
- Markerless tracking

Mobile AR Technology

Technology Components
- Software platform (e.g. Studierstube ES)
- Developer tools: iOS, Android
- Tracking technology: computer vision, sensor based
- Mobile graphics: OpenGL ES
- Interaction methods: handheld interaction

iPhone 4
- Apple iOS
- Faster CPU (1.2 GHz)
- High screen resolution: 3.5", 960x640
- Camera API
- Multi-touch
- Hardware 3D
- GPS, compass, accelerometer and gyroscope

Hardware Sensors
- Camera (resolution, fps)
  - Marker-based/markerless tracking
  - Video overlay
- GPS (resolution, update rate)
  - Outdoor location
- Compass
  - Indoor/outdoor orientation
- Accelerometer
  - Motion sensing, relative tilt

(An Android sensor-reading sketch follows below.)
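For concreteness, a hedged Android sketch of reading the sensors listed above. The framework calls are standard Android APIs; the surrounding class and how the values are consumed are illustrative only, and GPS use requires the location permission.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class ArSensors implements SensorEventListener, LocationListener {
    private final float[] rotation = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll in radians

    public void start(Context ctx) {
        SensorManager sm = (SensorManager) ctx.getSystemService(Context.SENSOR_SERVICE);
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR),
                SensorManager.SENSOR_DELAY_GAME);
        LocationManager lm = (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 1, this); // outdoor location
    }

    @Override public void onSensorChanged(SensorEvent e) {
        // Fused compass/accelerometer/gyro rotation vector -> device orientation
        SensorManager.getRotationMatrixFromVector(rotation, e.values);
        SensorManager.getOrientation(rotation, orientation);
    }

    @Override public void onLocationChanged(Location loc) {
        // loc.getLatitude() / loc.getLongitude() feed the geo-overlay calculation
    }

    @Override public void onAccuracyChanged(Sensor s, int accuracy) { }
    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
}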

Studierstube ES Framework

- Typical AR application framework
- Developed at TU Graz

[Diagram: the Studierstube ES framework as a software stack – from hardware, through the OS/low-level APIs and programming libraries (tracking, platform, graphics, content, user interface), up to the end-user application.]

Mobile Graphics

Computer Graphics on Mobile Phones
- Small screen, limited input options
- Limited support for accelerated graphics (most phones have no GPU)
- Mobile graphics libraries
  - OpenGL ES (1.0, 2.0)
    - Cross platform, subset of OpenGL
    - C/C++ low-level library
  - Java M3G
    - Mobile 3D graphics API for the J2ME platform
    - Object importer, scene graph library
    - Supported by all major phone manufacturers

OpenGL ES
- Small-footprint subset of OpenGL (OpenGL is too large for embedded devices)
- Powerful, low-level API with full functionality for 3D games
  - Can do almost everything OpenGL can
- Available on all key platforms
- Software and hardware implementations available
- Fully extensible (extensions as in OpenGL)

OpenGL ES on mobile devices

Versions
- Two major tracks: not compatible, parallel rather than competitive
- OpenGL ES 1.x
  - Fixed-function pipeline
  - Suitable for software implementations
  - All 1.x versions are backwards compatible
- OpenGL ES 2.x
  - Vertex and pixel shaders written in GLSL ES (see the shader sketch below)
  - All 2.x versions will be backwards compatible
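To make the 1.x vs 2.x difference concrete, here is a minimal sketch of the 2.x programmable pipeline using the Android GLES20 bindings: a GLSL ES vertex/fragment shader pair compiled and linked into a program object (under 1.x there are no shaders; the equivalent transform and colouring are fixed-function state). The class and shader names are illustrative.

import android.opengl.GLES20;

public final class SimpleShader {
    static final String VERTEX =
            "uniform mat4 uMvpMatrix;\n" +
            "attribute vec4 aPosition;\n" +
            "void main() { gl_Position = uMvpMatrix * aPosition; }\n";

    static final String FRAGMENT =
            "precision mediump float;\n" +
            "uniform vec4 uColor;\n" +
            "void main() { gl_FragColor = uColor; }\n";

    // Compile both shaders and link them into a GLES 2.0 program object.
    static int buildProgram() {
        int vs = compile(GLES20.GL_VERTEX_SHADER, VERTEX);
        int fs = compile(GLES20.GL_FRAGMENT_SHADER, FRAGMENT);
        int program = GLES20.glCreateProgram();
        GLES20.glAttachShader(program, vs);
        GLES20.glAttachShader(program, fs);
        GLES20.glLinkProgram(program);
        return program;
    }

    static int compile(int type, String source) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);
        return shader;
    }
}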

OpenGL ES 1.x vs 2.0
[Pipeline diagrams: fixed function (1.x) vs programmable (2.x) – http://www.khronos.org/opengles/2_X/]

Tracking

Mobile Augmented Reality’s goal

Create an affordable, massively multi-user, widespread platform

© Tinmith, U. of South Australia

How not to do it…

Ka-Ping Yee: Peephole Displays, CHI’03

Tracking on mobile phones
- Vision-based tracking
  - Marker-based tracking
  - Model-based natural feature tracking
  - Natural feature tracking in unknown environments
- Sensor tracking
  - GPS, inertial sensors, compass, gyroscope

Backpack-based

Höllerer et al. (1999), Piekarski & Thomas (2001), Reitmayr & Schmalstieg (2003)
- Laptop, HMD
- Enhanced GPS (DGPS/RTK) + inertial sensor for viewpoint tracking
- Hand tracking with fiducial markers


Backpack-based 2.

Kalkusch et al., 2002
- Video see-through HMD with camera
- Viewpoint tracking with inside-out computer vision using markers
- ARToolKit markers on walls, installed and surveyed manually

Tablet PC / UMPC-based

Schall et al., 2006
- Hybrid tracking on UMPC
- Camera-based fiducial marker tracking
- When no marker is in view: inertial sensor + UWB tracking

PDA-based 1.

BatPortal (Newman et al., 2001)
- PDA as thin client (rendering & tracking on server + VNC)
- Ultrasonic tracking

SHEEP (MacWilliams et al., 2003)
- Tracking by ART (external IR cameras + retroreflective targets)
- Projection-based AR environment

AR-PDA (2003)
- Model-based tracking
- PDA as thin client, tracking on the server
- Not real-time

Non-AR Tracking on Phones
- Mosquito Hunt (2003), Marble Revolution (2004), Pingis (VTT, 2006): game control with optical-flow techniques
- TinyMotion (2006): GUI control and input on cell phones using image differencing and block correlation

History of AR Tracking on Phones (1)
- 2003: ARToolKit on PDA – Wagner et al.
- 2004: 3D marker tracking on phone – Möhring et al.
- 2005: ARToolKit on Symbian – Henrysson et al.


Fiducial marker tracking on handhelds

[Images: Wagner et al., 2003; Möhring et al., 2004; Bucolo et al., 2005; Henrysson et al., 2006; Rohs, 2006]

History of AR Tracking on Phones (2)
- 2005: Visual Codes – Rohs et al.
- 2008: Advanced marker tracking – Wagner et al.
- 2008: Natural feature tracking – Wagner et al.

What can we do on today‘s mobile phones?

- Typical specs
  - 600+ MHz CPU
  - ~5 MB of available RAM
  - 160x120 to 320x240 camera at 15–30 Hz
- Possible to do
  - Marker tracking in 5–15 ms
  - Natural feature tracking in 20–50 ms

Other Mobile Sensors
- Orientation: compass
- Relative movement/rotation: accelerometer, gyroscope
- Context: audio, light sensor, proximity
- Location: GPS, A-GPS, WiFi positioning, cell tower triangulation

Handheld AR Interfaces

Handheld HCI

- Consider your user
- Follow good HCI principles
- Adapt HCI guidelines for handhelds
- Design to device constraints
- Rapid prototyping
- User evaluation

Sample Handheld AR Interfaces
- Clean
- Large video view
- Large icons
- Text overlay

Handheld Display vs Fixed Display
- Experiment comparing a moving handheld display, handheld with button input, a small fixed display, a desktop display, and a large plasma display
- Users performed (1) a navigation task and (2) a selection task
- The moving handheld display provided a greater perceived FOV, a higher degree of presence, and faster completion times

J. Hwang, J. Jung, G. Kim. Hand-held Virtual Reality: A Feasibility Study. In Proceedings of VRST 2006.

[Charts: search task completion time; FOV, presence and immersion]

HMD vs Handheld AR Interface
[Diagram: wearable AR uses the HMD for output (display) with a separate input device; handheld AR combines input and output in one device.]

Handheld Interface Properties
Handheld interface vs. HMD interface:
- Display is handheld rather than head-worn
- Much greater peripheral view of the real world
- Display and input device are connected
- Device can be moved independently of the view

Phone keypad vs touch screen input
- Phone keypad: keypad only, limited number of buttons, one-handed input, bimanual interaction, object-based interaction
- Touch screen: stylus/touch screen, large screen, two-handed input, screen-based input/selection

Handheld Interface Metaphors
- Tangible AR lens viewing
  - Look through the screen into the AR scene
  - Interact with the screen to interact with the AR content
  - E.g. Invisible Train
- Tangible AR lens manipulation (see the attachment sketch below)
  - Select an AR object and attach it to the device
  - Use the motion of the device as input
  - E.g. AR Lego
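A small sketch of the "lens manipulation" metaphor just described: once an object is selected it is rigidly attached to the phone, so each frame its world/marker pose is recomputed from the current camera pose. Java, using android.opengl.Matrix for the 4x4 maths; how the camera pose is obtained (e.g. from a marker tracker) is assumed, not shown.

import android.opengl.Matrix;

public class TangibleLens {
    // Object pose relative to the phone, captured at the moment it is grabbed.
    private final float[] grabOffset = new float[16];

    // cameraInWorld and objectInWorld are 4x4 column-major poses in marker/world coordinates.
    public void grab(float[] cameraInWorld, float[] objectInWorld) {
        float[] worldInCamera = new float[16];
        Matrix.invertM(worldInCamera, 0, cameraInWorld, 0);
        // offset = camera^-1 * object : where the object sits relative to the phone
        Matrix.multiplyMM(grabOffset, 0, worldInCamera, 0, objectInWorld, 0);
    }

    // Called every frame while attached: moving or rotating the phone moves the object with it.
    public void update(float[] cameraInWorld, float[] objectInWorldOut) {
        Matrix.multiplyMM(objectInWorldOut, 0, cameraInWorld, 0, grabOffset, 0);
    }
}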

Translation Study Conditions

- A: Object fixed to the phone (one handed)
- B: Button and keypad input
- C: Object fixed to the phone (bimanual) – one hand rotating the tracking pattern

Results – Translation
- 9 subjects, within-subject design
- Timing: tangible fastest – about twice as fast as keypad
- Survey: tangible easiest (Q1), keypad most accurate (Q2), tangible quickest (Q3), tangible most enjoyable (Q4)
- Ranking: tangible favoured – mean rank A: 1.44, B: 2.56, C: 2.0

Rotation Study Conditions
- A: Arcball
- B: Keypad input for rotation about the object axis
- C: Object fixed to the phone (one handed)
- D: Object fixed to the phone (bimanual)

Results – Rotation
- Timing: keypad (B) and Arcball (A) fastest
- No significant survey differences
- Mean rank: A 3.0, B 2.3, C 2.4, D 2.2

Collaborative AR

AR Tennis
- Virtual tennis court
- Two user game
- Audio + haptic feedback
- Bluetooth messaging

TAT Augmented ID

Design Guidelines

- Apply handheld HCI guidelines for on-screen content: large buttons, little text input, etc.
- Design physical + virtual interface elements
- Pick an appropriate interface metaphor
  - "Handheld lens" approach using handheld motion
  - Tangible AR for AR overlay
- Build prototypes
- Continuously evaluate the application

Mobile AR Browsing

AR Browsers
- AR equivalent of the web browser: request and serve up content
- Commercial outdoor AR applications: Junaio, Layar, Wikitude, etc.
- All have their own language specifications
  - Wikitude – ARML
  - Junaio – XML, AREL

Need for a common standard
- Based on existing standards for geo-located content, etc.
- Support for dynamic/interactive content
- Easier to author mobile AR applications
- Easy to render in AR browsers

Architecture

Common AR Browsers
- Layar – http://www.layar.com/
- Wikitude – http://www.wikitude.com/
- Junaio – http://www.junaio.com
- TagWhat – http://www.tagwhat.com/
- Sekai Camera – http://sekaicamera.com/

Nokia City Lens – a more recent AR browser

Junaio - www.junaio.com

Key Features
- Content provided in information channels (over 2,000 channels available)
- Two types of AR channels
  - GLUE channels – visual tracking
  - Location-based channels – GPS + compass tracking
- Simple interface with multiple views: list, map, AR (live) view
- Point of Interest (POI) based – POIs are geo-located content

QR Code Launch

Glue Tracking – Markerless
- Search for "instant tracker"

Junaio Interface
- List view, map view, AR view

Back-end Servers

Data Flow

search.php

<?xml version="1.0" encoding="UTF-8"?>
<results>
  <poi id="1" interactionfeedback="none">
    <name><![CDATA[Hotel Hello World]]></name>
    <description><![CDATA[This is a beautiful, family hotel and restaurant, just around the corner. Special Dinner and Rooms available.]]></description>
    <l>37.776685,-122.422771,0</l>
    <mime-type>text/plain</mime-type>
    <icon>http://dev.junaio.com/publisherDownload/tutorial/icon_map.png</icon>
    <thumbnail>http://dev.junaio.com/publisherDownload/tutorial/thumb.jpg</thumbnail>
    <phone>555/1234567</phone>
    <homepage>http://www.hotelaroundthecorner.com</homepage>
  </poi>
</results>

AR Outcome

Limitations of Plain XML
- No interactivity – only simple pop-ups
- No user interface customization – can only use the Junaio GUI elements
- No local interactivity – always needs a remote server connection

Junaio AREL

AREL – Augmented Reality Environment Language
- Overcomes the limitations of XML by itself
- Based on web technologies: XML, HTML5, JavaScript
- Core components
  1. AREL XML: static file, specifies scene content
  2. AREL JavaScript: handles all interaction and animation; any user interaction sends an event to the AREL JS
  3. AREL HTML5: GUI elements – buttons, icons, etc.
- Advantages: scripting on the device, more functionality, GUI customization

Adding Interactivity

Basic Interactivity
- Add a button on screen to move the virtual character
- Uses the following:
  - HTML: button specification
  - JavaScript: interaction
  - PHP/XML: 3D model
- Junaio Tutorial 5: http://www.junaio.com/develop/quickstart/advanced-interactions-and-location-based-model-3ds/

Server File Structure
[Screenshot: project file tree – main index, HTML (GUI), JavaScript (interactivity), PHP (content)]

search.php – specify the Lego man

if (!empty($_GET['l']))
    $position = explode(",", $_GET['l']);   // use the local position passed by the client
…
// return the lego man
$oLegoMan = ArelXMLHelper::createLocationBasedModel3D(
    "1",                                                   // id
    "lego man",                                            // title
    WWW_ROOT . "/resources/walking_model3_7fps.md2",       // main resource (Lego model)
    WWW_ROOT . "/resources/walking_model.png",             // resource (texture)
    $position,                                             // location
    array(0.2, 0.2, 0.2),                                  // scale
    new ArelRotation(ArelRotation::ROTATION_EULERRAD, array(1.57, 0, 1.57)) // rotation
);
…

styles.css – HTML GUI

/* Button location */
#buttons { position: absolute; bottom: 44px; right: 44px; }

.ipad div { width: 104px; height: 106px; }

/* Button style */
#buttons div {
    background-image: url("../images/button.png");
    background-repeat: no-repeat;
    background-size: 100%;
    float: left;
}

Logic_LBS5.js – JavaScript
- Create an event listener: setEventListener();
- Add functionality to the model object
  - Load the model from the scene
  - Add model behaviours
- Add functionality to GUI objects
  - Define the event listener
  - Bind model behaviours to GUI objects

Result

Authoring Tools

Metaio Creator

- Drag-and-drop Junaio authoring

BuildAR – buildar.com


Developing Mobile AR Experiences
