2013 Lecture 5: AR Tools and Interaction

COSC 426: Augmented Reality, Lecture 5: AR Tools and Interaction. Mark Billinghurst ([email protected]). August 9th 2013.

Uploaded by mark-billinghurst on 28-Jan-2015


DESCRIPTION

Lecture 5 from the COSC 426 Graduate course on Augmented Reality. This lecture talks about AR development tools and interaction styles. Taught by Mark Billinghurst from the HIT Lab NZ at the University of Canterbury. August 9th 2013

TRANSCRIPT

Page 1: 2013 Lecture 5: AR Tools and Interaction

COSC 426: Augmented Reality

Mark Billinghurst

[email protected]

August 9th 2013

Lecture 5: AR Tools and Interaction

Page 2: 2013 Lecture 5: AR Tools and Interaction

Interaction Design Process

Page 3: 2013 Lecture 5: AR Tools and Interaction

Data Gathering Techniques (1)

- Questionnaires
  - Looking for specific information
  - Qualitative and quantitative results
  - Good for getting data from a large, dispersed group

- Interviews
  - Good for exploring issues, using props
  - Structured, unstructured or semi-structured
  - But time-consuming, and it is difficult to visit everyone

Page 4: 2013 Lecture 5: AR Tools and Interaction

Data Gathering Techniques (2)

- Workshops or focus groups
  - Group interviews/activities
  - Good at gaining a consensus view and/or highlighting areas of conflict

- Observations
  - Spending time with users in day-to-day tasks
  - Good for understanding task context
  - Requires time and can result in a huge amount of data

- Documentation
  - Procedures and rules written down in manuals

Page 5: 2013 Lecture 5: AR Tools and Interaction

Elaboration and Reduction
- Elaborate: generate solutions. These are the opportunities
- Reduce: decide on the ones worth pursuing
- Repeat: elaborate and reduce again on those solutions

Source: Laseau, P. (1980) Graphic Thinking for Architects & Designers. John Wiley and Sons.

Page 6: 2013 Lecture 5: AR Tools and Interaction

Tools for Effective Design

- Personas
- Scenarios
- Storyboards
- Wireframes and Mock-ups
- Prototypes

Page 7: 2013 Lecture 5: AR Tools and Interaction

Persona Technique
- Personas are a design tool to help visualize who you are designing for, and imagine how a person will use the product
- A persona is an archetype that represents the behavior and goals of a group of users
- Based on insights and observations from customer research
- Not real people, but synthesised from real user characteristics
- Bring them to life with a name, characteristics, goals, background
- Develop multiple personas

Page 8: 2013 Lecture 5: AR Tools and Interaction

Gunther the Ad Guy

Gunther is from Germany. He travels extensively for work, and as an advertising executive he needs to present concepts to clients quickly and easily. He is very well-versed in new technologies and wishes he had easier portable solutions for his presentations…

Page 9: 2013 Lecture 5: AR Tools and Interaction

Scenarios

Usage scenarios are narrative descriptions of how the product meets the needs of a persona.

- Short (2 pages max)
- Focus on unmet needs of the persona
- Concrete story
- Set of stories around essential tasks, problems...
- Use to test ideas

Page 10: 2013 Lecture 5: AR Tools and Interaction

Storyboarding

Sequence of sketches showing use of the system in an everyday use context.

- Concrete example
- Easier (faster) to grasp than text-based stories
- Means of communication with users and system developers
- Sketches, not drawings...
- Use to test interaction and make sure the design works

Page 11: 2013 Lecture 5: AR Tools and Interaction

Sketching is about design

Sketching is not about drawing; it is about design.

Sketching is a tool to help you:
- express,
- develop, and
- communicate design ideas

Sketching is part of a process:
- idea generation,
- design elaboration,
- design choices,
- engineering

Page 12: 2013 Lecture 5: AR Tools and Interaction

Sketch vs. Prototype

Sketch                     Prototype
Invite                     Attend
Suggest                    Describe
Explore                    Refine
Question                   Answer
Propose                    Test
Provoke                    Resolve
Tentative, non-committal   Specific depiction

The primary differences are in the intent.

Page 13: 2013 Lecture 5: AR Tools and Interaction

Types of Prototypes

Low Fidelity – quick and dirty, easy access materials like cardboard and paper.

High Fidelity – more involved electronic versions similar in materials to final product.

Page 14: 2013 Lecture 5: AR Tools and Interaction

Rapid Prototyping

- Fast and inexpensive
- Identifies problems before they're coded
- Elicits more and better feedback from users
- Helps developers think creatively
- Gets users and other stakeholders involved early
- Fosters teamwork and communication
- Avoids opinion wars
- Helps decide design directions

Page 15: 2013 Lecture 5: AR Tools and Interaction

Paper Prototyping (Low Fidelity)

- Quick and simple means of sketching interfaces
- Use office materials
- Easier to criticize, quick to change
- Creative process (develop in a team)
- Can also be used for usability tests (focus on flow of interaction rather than visuals)
- Used a lot to test out concepts before real design begins

Page 16: 2013 Lecture 5: AR Tools and Interaction

Paper Prototyping

Page 17: 2013 Lecture 5: AR Tools and Interaction

High-fidelity prototyping
- Uses materials that you would expect to be in the final product.
- Prototype looks more like the final system than a low-fidelity version.
- For a high-fidelity software prototype, common environments include Macromedia Director, Visual Basic, and Smalltalk.
- Danger that users think they have a full system… and may not see the compromises.

Page 18: 2013 Lecture 5: AR Tools and Interaction

Rapid Prototyping

- Speed development time with quick hardware mockups
  - Handheld device connected to PC
  - LCD screen, USB phone keypad, camera
- Can use PC development tools for rapid development
  - Flash, Visual Basic, etc.

Page 19: 2013 Lecture 5: AR Tools and Interaction

AR Tools

Page 20: 2013 Lecture 5: AR Tools and Interaction

Building Compelling AR Experiences (layer diagram, Sony CSL © 2004):

experiences
applications
tools – Authoring
components – Tracking, Display

Page 21: 2013 Lecture 5: AR Tools and Interaction

AR Authoring Tools
- Low Level Software Libraries: osgART, Studierstube, MXRToolKit
- Plug-ins to existing software: DART (Macromedia Director), mARx, Unity
- Stand Alone: AMIRE, BuildAR, Metaio Creator, etc.
- Rapid Prototyping Tools: Flash, OpenFrameworks, Processing, Arduino, etc.
- Next Generation: iaTAR (Tangible AR)

Page 22: 2013 Lecture 5: AR Tools and Interaction

ARToolKit (Kato 1998)

- Open source, computer vision based AR tracking
- http://artoolkit.sourceforge.net/

Page 23: 2013 Lecture 5: AR Tools and Interaction

ARToolKit Structure

Three key libraries:
- AR32.lib – ARToolKit image processing functions
- ARgsub32.lib – ARToolKit graphics functions
- ARvideo.lib – DirectShow video capture class (built on top of DirectShow)

Page 24: 2013 Lecture 5: AR Tools and Interaction

Software

- Cross platform: Windows, Mac, Linux, IRIX, Symbian, iPhone, etc.
- Additional basic libraries
  - Video capture library (Video4Linux, VisionSDK)
  - OpenGL, GLUT
- Requires a rendering library: Open VRML, Open Inventor, osgART, etc.

Page 25: 2013 Lecture 5: AR Tools and Interaction

Additional Software

- ARToolKit just provides tracking
- For an AR application you'll need more software
  - High level rendering library: Open VRML, Open Inventor, osgART, etc.
  - Audio library: Fmod, etc.
  - Peripheral support

Page 26: 2013 Lecture 5: AR Tools and Interaction

What does ARToolKit Calculate?
- Position of markers in the camera coordinates
- Pose of markers in the camera coordinates
- Output format: a 3x4 matrix representing the transformation from the marker coordinates to the camera coordinates

Page 27: 2013 Lecture 5: AR Tools and Interaction

Coordinate Systems

Page 28: 2013 Lecture 5: AR Tools and Interaction

From Marker To Camera
- Rotation & translation
- TCM: 4x4 transformation matrix from marker coordinates to camera coordinates

Page 29: 2013 Lecture 5: AR Tools and Interaction

An ARToolKit Application
- Initialization
  - Load camera and pattern parameters
- Main Loop
  - Step 1. Image capture and display
  - Step 2. Marker detection
  - Step 3. Marker identification
  - Step 4. Getting pose information
  - Step 5. Object interactions/simulation
  - Step 6. Display virtual objects
- End Application
  - Camera shut down

Page 30: 2013 Lecture 5: AR Tools and Interaction

Sample1.c Main Function

main()
{
    init();
    argMainLoop( mouseEvent, keyEvent, mainLoop );
}

Page 31: 2013 Lecture 5: AR Tools and Interaction

Sample1.c - mainLoop Function

if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
    arUtilSleep(2);
    return;
}
argDrawMode2D();
argDispImage( dataPtr, 0, 0 );
arVideoCapNext();
argSwapBuffers();

Page 32: 2013 Lecture 5: AR Tools and Interaction

Ex. 2: Detecting a Marker
- Program: sample2.c
- Key points
  - Threshold value
  - Important external variables
    - arDebug – keep thresholded image
    - arImage – pointer to thresholded image
    - arImageProcMode – use 50% image for image processing
      - AR_IMAGE_PROC_IN_FULL
      - AR_IMAGE_PROC_IN_HALF

Page 33: 2013 Lecture 5: AR Tools and Interaction

Sample2.c – marker detection

/* detect the markers in the video frame */
if( arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0 ) {
    cleanup();
    exit(0);
}
for( i = 0; i < marker_num; i++ ) {
    argDrawSquare(marker_info[i].vertex, 0, 0);
}

Page 34: 2013 Lecture 5: AR Tools and Interaction

Making a pattern template
- Use the utility program mk_patt.exe
- Show the pattern to the camera
- Put the corner of the red line segments on the top-left vertex of the marker
- The pattern is stored as a template in a file
- A 1:2:1 ratio determines the pattern region used

Page 35: 2013 Lecture 5: AR Tools and Interaction

Ex. 4 – Getting 3D information

- Program: sample4.c
- Key points
  - Definition of a real marker
  - Transformation matrix
    - Rotation component
    - Translation component

Page 36: 2013 Lecture 5: AR Tools and Interaction

Sample4.c – Transformation matrix

double marker_center[2] = {0.0, 0.0}; double marker_width = 80.0; double marker_trans[3][4];

arGetTransMat(&marker_info[i], marker_center, marker_width, marker_trans);

Page 37: 2013 Lecture 5: AR Tools and Interaction

Finding the Camera Position

This function sets the transformation matrix from the marker to the camera into marker_trans[3][4]:

arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

The marker's position appears in the last column of marker_trans:

Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];

Page 38: 2013 Lecture 5: AR Tools and Interaction

ARToolKit Coordinate Frame

Page 39: 2013 Lecture 5: AR Tools and Interaction

Ex. 5- Virtual Object Display

- Program: sample5.c
- Key points
  - OpenGL parameter setting
  - Setup of projection matrix
  - Setup of modelview matrix

Page 40: 2013 Lecture 5: AR Tools and Interaction

Appending your own OpenGL code

Set the camera parameters as the OpenGL projection matrix:

argDrawMode3D();
argDraw3dCamera( 0, 0 );

Set the transformation matrix from the marker to the camera as the OpenGL ModelView matrix:

argConvGlpara(marker_trans, gl_para);
glMatrixMode(GL_MODELVIEW);
glLoadMatrixd( gl_para );

After calling these functions, your OpenGL objects are drawn in the real marker coordinates.

Page 41: 2013 Lecture 5: AR Tools and Interaction

ARToolKit in the World

  Hundreds of projects   Large research community

Page 42: 2013 Lecture 5: AR Tools and Interaction

ARToolKit Family

- ARToolKit
- ARToolKit NFT
- ARToolKit (Symbian)
- NyARToolkit (Java, C#, Android, Windows Mobile)
- JARToolKit (Java)
- FLARToolKit (Flash)
- FLARManager (Flash)

Page 43: 2013 Lecture 5: AR Tools and Interaction

FLARToolKit
- Flash AS3 version of ARToolKit (ported from NyARToolkit, the Java version of ARToolKit)
- Enables augmented reality in the browser
- Uses Papervision3D as its 3D engine
- Available at http://saqoosha.net/
- Dual licensed: GPL and commercial license

Page 44: 2013 Lecture 5: AR Tools and Interaction

AR Application Components (stack): FLARToolkit on top of Papervision3D, on top of Adobe Flash.

Page 45: 2013 Lecture 5: AR Tools and Interaction

private function mainEnter(e:Event):void {
    // Capture video frame
    capture.draw(vid);

    // Detect marker
    if (detector.detectMarkerLite(raster, 80) && detector.getConfidence() > 0.5) {
        // Get the transformation matrix for the current marker position
        detector.getTransformMatrix(trans);

        // Translate and rotate the mainContainer so it looks right
        mainContainer.setTransformMatrix(trans);

        // Render the Papervision scene
        renderer.render();
    }
}

Page 46: 2013 Lecture 5: AR Tools and Interaction

FLARToolKit Examples

Page 47: 2013 Lecture 5: AR Tools and Interaction

Papervision 3D
- http://www.papervision3d.org/
- Flash-based 3D engine
- Supports
  - import of 3D models
  - texturing
  - animation
  - scene graph
- Alternatives: Away3d, Sandy, …

Page 48: 2013 Lecture 5: AR Tools and Interaction

Source Packages
- "Original" FLARToolkit (Libspark, Saqoosha): http://www.libspark.org/svn/as3/FLARToolKit/trunk/
- Start-up guides
  - Saqoosha: http://saqoosha.net/en/flartoolkit/start-up-guide/
  - Mikko Haapoja: http://www.mikkoh.com/blog/?p=182
- "Frameworks"
  - Squidder multiple-marker example: http://www.squidder.com/2009/03/06/flar-how-to-multiple-instances-of-multiple-markers/
  - FLARManager: http://words.transmote.com/wp/flarmanager/

Page 49: 2013 Lecture 5: AR Tools and Interaction

Other Languages
- NyARToolKit: http://nyatla.jp/nyartoolkit/wp/
  - AS3, Java, C#, Processing, Unity, etc.
- openFrameworks: http://www.openframeworks.cc/
  - Example: https://sites.google.com/site/ofauckland/examples/8-artoolkit-example
  - Support for other libraries: Kinect, audio, physics, etc.

Page 50: 2013 Lecture 5: AR Tools and Interaction

void testApp::update(){
    // capture video and detect markers
    mov.update();
    if (mov.isFrameNew()) {
        img.setFromPixels(mov.getPixels(), ofGetWidth(), ofGetHeight());
        gray = img;
        tracker.setFromPixels(gray.getPixels());
    }
}

//--------------------------------------------------------------
void testApp::draw(){
    // draw AR objects
    ofSetColor(0xffffff);
    mov.draw(0, 0);
    for (int i = 0; i < tracker.markers.size(); i++) {
        ARMarkerInfo &m = tracker.markers[i];
        tracker.loadMarkerModelViewMatrix(m);
        ofSetColor(255, 255, 0, 100);
        ofCircle(0, 0, 25);
        ofSetColor(0);
        ofDrawBitmapString(ofToString(m.id), 0, 0);
    }
}

Page 51: 2013 Lecture 5: AR Tools and Interaction

Low Level Mobile AR Tools
- Vuforia Tracking Library (Qualcomm)
  - Vuforia.com
  - iOS, Android
  - Computer vision based tracking
  - Marker tracking, 3D objects, frame markers
  - Integration with Unity
  - Interaction, model loading

Page 52: 2013 Lecture 5: AR Tools and Interaction

OSGART Programming Library
- Integration of ARToolKit with a high-level rendering engine (OpenSceneGraph): osgART = OpenSceneGraph + ARToolKit
- Supports geometric + photometric registration

Page 53: 2013 Lecture 5: AR Tools and Interaction

osgART: Features
- C++ (but also Python, Lua, etc.)
- Multiple video input supports
  - Direct (FireWire/USB camera), files, network
  - Via ARvideo, PtGrey, CVCam, VideoWrapper, etc.
- Benefits of Open Scene Graph
  - Rendering engine, plug-ins, etc.

Page 54: 2013 Lecture 5: AR Tools and Interaction

mARx Plug-in

- 3D Studio Max plug-in
- Can model and view AR content at the same time

Page 55: 2013 Lecture 5: AR Tools and Interaction

BuildAR

- http://www.buildar.co.nz/
- Stand alone application
- Visual interface for an AR model viewing application
- Enables non-programmers to build AR scenes

Page 56: 2013 Lecture 5: AR Tools and Interaction

Metaio Creator

- Drag and drop AR
- http://www.metaio.com/creator/

Page 57: 2013 Lecture 5: AR Tools and Interaction

Total Immersion D'Fusion Studio
- Complete commercial authoring platform: http://www.t-immersion.com/
- Multi-platform
- Markerless tracking
- Scripting
- Face tracking
- Finger tracking
- Kinect support

Page 58: 2013 Lecture 5: AR Tools and Interaction

Others
- AR-Media: http://www.inglobetechnologies.com/
  - Google SketchUp plug-in
- LinceoVR: http://linceovr.seac02.it/
  - AR/VR authoring package
- Libraries: JARToolKit, MXRToolKit, ARLib, Goblin XNA

Page 59: 2013 Lecture 5: AR Tools and Interaction

More Libraries

- JARToolKit
- MRToolKit, MXRToolKit, ARLib, OpenVIDIA
- DWARF, Goblin XNA
- AMIRE
- D'Fusion

Page 60: 2013 Lecture 5: AR Tools and Interaction

Advanced Authoring: iaTAR (Lee 2004)

- Immersive AR authoring
- Using real objects to create AR applications

Page 61: 2013 Lecture 5: AR Tools and Interaction

osgART

Developing Augmented Reality Applications with osgART

Page 62: 2013 Lecture 5: AR Tools and Interaction

What is a Scene Graph?
- Tree-like structure for organising a virtual world (e.g. VRML)
- Hierarchy of nodes that define:
  - Groups (and Switches, Sequences, etc.)
  - Transformations
  - Projections
  - Geometry
  - …
- And states and attributes that define:
  - Materials and textures
  - Lighting and blending
  - …

Page 63: 2013 Lecture 5: AR Tools and Interaction

Scene Graph Example

Page 64: 2013 Lecture 5: AR Tools and Interaction

Benefits of a Scene Graph
- Performance
  - Structuring data facilitates optimization: culling, state management, etc.
- Abstraction
  - Underlying graphics pipeline is hidden
  - Low-level programming ("how do I display this?") replaced with high-level concepts ("what do I want to display?")

Image: sgi

Page 65: 2013 Lecture 5: AR Tools and Interaction

About Open Scene Graph
- http://www.openscenegraph.org/
- Open-source scene graph implementation
- Based on OpenGL
- Object-oriented C++ following design pattern principles
- Used for simulation, games, research, and industrial projects
- Active development community
  - Maintained by Robert Osfield
  - ~2000 mailing list subscribers
  - Documentation project: www.osgbooks.com
- Uses the OSG Public License (similar to LGPL)

Page 66: 2013 Lecture 5: AR Tools and Interaction

About Open Scene Graph (2)

Projects built with OSG: Pirates of the XXI Century, Flightgear, 3DVRII Research Institute, EOR, SCANeR, VRlab Umeå University.

Page 67: 2013 Lecture 5: AR Tools and Interaction

Open Scene Graph Features
- Plugins for loading and saving
  - 3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj), …
  - 2D: .png, .jpg, .bmp, QuickTime movies
- NodeKits to extend functionality, e.g. osgShadow
- Cross-platform support for:
  - Window management (osgViewer)
  - Threading (OpenThreads)

Page 68: 2013 Lecture 5: AR Tools and Interaction

Open Scene Graph Architecture
- Plugins read and write 2D image and 3D model files
- NodeKits extend core functionality, exposing higher-level node types
- Core: scene graph and rendering functionality
- Wrappers: inter-operability with other environments, e.g. Python

Page 69: 2013 Lecture 5: AR Tools and Interaction

Some Open Scene Graph Demos
- You may want to get the OSG data package
  - Via SVN: http://www.openscenegraph.org/svn/osg/OpenSceneGraph-Data/trunk
- Examples: osgviewer, osgmotionblur, osgparticle, osgreflect, osgdistortion, osgfxbrowser

Page 70: 2013 Lecture 5: AR Tools and Interaction

Learning OSG
- Check out the Quick Start Guide: free PDF download at http://osgbooks.com/, physical copy US$13
- Join the mailing list: http://www.openscenegraph.org/projects/osg/wiki/MailingLists
- Browse the website: http://www.openscenegraph.org/projects/osg
- Use the forum: http://forum.openscenegraph.org
- Study the examples
- Read the source?

Page 71: 2013 Lecture 5: AR Tools and Interaction

osgART

Page 72: 2013 Lecture 5: AR Tools and Interaction

What is osgART?
- osgART adds AR to Open Scene Graph
- Further developed and enhanced by: Julian Looser, Hartmut Seichter, Raphael Grasset
- Current version 2.0, open source: http://www.osgart.org

Page 73: 2013 Lecture 5: AR Tools and Interaction

osgART Approach: Basic Scene Graph

Root
  Transform
    3D Object

Example transform matrix:
[  0.988  -0.031   -0.145  0 ]
[ -0.048   0.857   -0.512  0 ]
[  0.141   0.513    0.846  0 ]
[ 10.939  29.859 -226.733  1 ]

To add video see-through AR:
- Integrate live video
- Apply the correct projection matrix
- Update tracked transformations in realtime

Page 74: 2013 Lecture 5: AR Tools and Interaction

osgART Approach: AR Scene Graph

Root
  Transform
    3D Object

Page 75: 2013 Lecture 5: AR Tools and Interaction

osgART Approach: AR Scene Graph

Root
  Video Layer
    Video Geode
  Virtual Camera
    Transform
      3D Object

Page 76: 2013 Lecture 5: AR Tools and Interaction

osgART Approach: AR Scene Graph

Root
  Video Layer – orthographic projection
    Video Geode – full-screen quad with live texture updated from the video source
  Virtual Camera – projection matrix from tracker calibration
    Transform – transformation matrix updated from marker tracking in realtime
      3D Object


Page 78: 2013 Lecture 5: AR Tools and Interaction

osgART Architecture

Like any video see-through AR library, osgART requires video input and tracking capabilities:

Application
  AR Library
    Video Source (e.g. DirectShow)
    Tracking Module (libAR.lib)

Page 79: 2013 Lecture 5: AR Tools and Interaction

osgART Architecture

osgART uses a plugin architecture so that video sources and tracking technologies can be plugged in as necessary.

- Tracker plugins: ARToolKit4, ARToolkitPlus, MXRToolKit, ARLib, bazAR (work in progress), ARTag (work in progress)
- Video plugins: OpenCVVideo, VidCapture, CMU1394, PointGrey SDK, VidereDesign, VideoWrapper, VideoInput, VideoSource, DSVL, Intranel RTSP

Page 80: 2013 Lecture 5: AR Tools and Interaction

Basic osgART Tutorial
- Develop a working osgART application from scratch
- Use the ARToolKit 2.72 library for tracking and video capture

Page 81: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 1: Basic OSG Viewer
- Start with the standard Open Scene Graph viewer
- We will modify this to do AR!

Page 82: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 1: Basic OSG Viewer

The basic osgViewer:

#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>

int main(int argc, char* argv[]) {

    // Create a viewer
    osgViewer::Viewer viewer;

    // Create a root node
    osg::ref_ptr<osg::Group> root = new osg::Group;

    // Attach root node to the viewer
    viewer.setSceneData(root.get());

    // Add relevant event handlers to the viewer
    viewer.addEventHandler(new osgViewer::StatsHandler);
    viewer.addEventHandler(new osgViewer::WindowSizeHandler);
    viewer.addEventHandler(new osgViewer::ThreadingHandler);
    viewer.addEventHandler(new osgViewer::HelpHandler);

    // Run the viewer and exit the program when the viewer is closed
    return viewer.run();
}

Page 83: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 2: Adding Video
- Add a video plugin: load, configure, start video capture…
- Add a video background: create, link to video, add to scene graph

Page 84: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 2: Adding Video

New code to load and configure a video plugin:

// Preload the video plugin
int _video_id = osgART::PluginManager::getInstance()->load("osgart_video_artoolkit2");

// Load a video plugin.
osg::ref_ptr<osgART::Video> video =
    dynamic_cast<osgART::Video*>(osgART::PluginManager::getInstance()->get(_video_id));

// Check if an instance of the video stream could be created
if (!video.valid()) {
    // Without video an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
    exit(-1);
}

// Open the video. This will not yet start the video stream but will
// get information about the format of the video which is essential
// for the connected tracker.
video->open();

Page 85: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 2: Adding Video

New code to add a live video background. In the main function:

osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());
videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");
root->addChild(videoBackground.get());
video->start();

And the helper function:

osg::Group* createImageBackground(osg::Image* video) {
    osgART::VideoLayer* _layer = new osgART::VideoLayer();
    _layer->setSize(*video);
    osgART::VideoGeode* _geode = new osgART::VideoGeode(osgART::VideoGeode::USE_TEXTURE_2D, video);
    addTexturedQuad(*_geode, video->s(), video->t());
    _layer->addChild(_geode);
    return _layer;
}

Page 86: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 3: Tracking
- Add a tracker plugin: load, configure, link to video
- Add a marker to track: load, activate
- Tracked node: create, link with marker via tracking callbacks
- Print out the tracking data

Page 87: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 3: Tracking

Load a tracking plugin and associate it with the video plugin:

int _tracker_id = osgART::PluginManager::getInstance()->load("osgart_tracker_artoolkit2");

osg::ref_ptr<osgART::Tracker> tracker =
    dynamic_cast<osgART::Tracker*>(osgART::PluginManager::getInstance()->get(_tracker_id));

if (!tracker.valid()) {
    // Without tracker an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize tracker plugin!" << std::endl;
    exit(-1);
}

// get the tracker calibration object
osg::ref_ptr<osgART::Calibration> calibration = tracker->getOrCreateCalibration();

// load a calibration file
if (!calibration->load("data/camera_para.dat")) {
    // the calibration file was non-existing or couldn't be loaded
    osg::notify(osg::FATAL) << "Non existing or incompatible calibration file" << std::endl;
    exit(-1);
}

// set the image source for the tracker
tracker->setImage(video.get());

osgART::TrackerCallback::addOrSet(root.get(), tracker.get());

// create the virtual camera and add it to the scene
osg::ref_ptr<osg::Camera> cam = calibration->createCamera();
root->addChild(cam.get());

Page 88: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 3: Tracking

Load a marker and activate it, associate it with a transformation node (via event callbacks), and add the transformation node to the virtual camera node:

osg::ref_ptr<osgART::Marker> marker = tracker->addMarker("single;data/patt.hiro;80;0;0");
if (!marker.valid()) {
    // Without marker an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not add marker!" << std::endl;
    exit(-1);
}

marker->setActive(true);

osg::ref_ptr<osg::MatrixTransform> arTransform = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransform.get(), marker.get());
cam->addChild(arTransform.get());

Add a debug callback to print out information about the tracked marker:

osgART::addEventCallback(arTransform.get(), new osgART::MarkerDebugCallback(marker.get()));

Page 89: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 3: Tracking

  Tracking information is output to console

Page 90: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 4: Adding Content
- Now put the tracking data to use!
- Add content to the tracked transform
- Basic cube code:

arTransform->addChild(osgART::testCube());
arTransform->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");

Page 91: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 5: Adding 3D Model
- Open Scene Graph can load some 3D formats directly: e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA
- Others need to be converted
- Support for some formats is much better than others: e.g. OpenFlight good, 3ds hit and miss
- Recommend the native .osg and .ive formats
  - .osg – ASCII representation of the scene graph
  - .ive – binary OSG file; can contain textures
- osgExp, the exporter for 3DS Max, is a good choice: http://sourceforge.net/projects/osgmaxexp
- Otherwise .3ds files from TurboSquid can work

Page 92: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 5: Adding 3D Model

- Replace the simple cube with a 3D model
- Models are loaded using the osgDB::readNodeFile() function:

std::string filename = "media/hollow_cube.osg";
arTransform->addChild(osgDB::readNodeFile(filename));

- Note: scale is important; units are in mm
- Pipeline: 3D Studio Max → export to .osg → osgART

Page 93: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 6: Multiple Markers
- Repeat the process so far to track more than one marker simultaneously

Page 94: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 6: Multiple Markers

Load and activate two markers:

osg::ref_ptr<osgART::Marker> markerA = tracker->addMarker("single;data/patt.hiro;80;0;0");
markerA->setActive(true);

osg::ref_ptr<osgART::Marker> markerB = tracker->addMarker("single;data/patt.kanji;80;0;0");
markerB->setActive(true);

Create two transformations, attach callbacks, and add models:

osg::ref_ptr<osg::MatrixTransform> arTransformA = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformA.get(), markerA.get());
arTransformA->addChild(osgDB::readNodeFile("media/hitl_logo.osg"));
arTransformA->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformA.get());

osg::ref_ptr<osg::MatrixTransform> arTransformB = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformB.get(), markerB.get());
arTransformB->addChild(osgDB::readNodeFile("media/gist_logo.osg"));
arTransformB->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformB.get());

Page 95: 2013 Lecture 5: AR Tools and Interaction

osgART Tutorial 6: Multiple Markers

Page 96: 2013 Lecture 5: AR Tools and Interaction

Basic osgART Tutorial: Summary
1. Standard OSG viewer
2. Addition of video
3. Addition of tracking
4. Addition of basic 3D graphics
5. Addition of 3D model
6. Multiple markers

Page 97: 2013 Lecture 5: AR Tools and Interaction

AR Interaction

Page 98: 2013 Lecture 5: AR Tools and Interaction

Building Compelling AR Experiences (layer diagram, Sony CSL © 2004):

experiences
applications – Interaction
tools – Authoring
components – Tracking, Display

Page 99: 2013 Lecture 5: AR Tools and Interaction

AR Interaction

- Designing an AR system = interface design
- Using different input and output technologies
- Objective is a high quality of user experience
  - Ease of use and learning
  - Performance and satisfaction

Page 100: 2013 Lecture 5: AR Tools and Interaction

User Interface and Tool
- Human ↔ User Interface / Tool ↔ Machine/Object
- Human Machine Interface

© Andreas Dünser

Page 101: 2013 Lecture 5: AR Tools and Interaction

User Interface: Characteristics
- Input: mono or multimodal
- Output: mono or multisensorial
- Technique/Metaphor/Paradigm

Example: output includes a sensation of movement; metaphor: "push" to accelerate, "turn" to rotate.

© Andreas Dünser

Page 102: 2013 Lecture 5: AR Tools and Interaction

Human Computer Interface
- Human ↔ User Interface ↔ Computer System
- Human Computer Interface = Hardware + Software
- The computer is everywhere now: HCI in electronic devices, home automation, transport vehicles, etc.

© Andreas Dünser

Page 103: 2013 Lecture 5: AR Tools and Interaction

More terminology
- Interaction Device = input/output of the user interface
- Interaction Style = category of similar interaction techniques
- Interaction Paradigm
- Modality (human sense)
- Usability

Page 104: 2013 Lecture 5: AR Tools and Interaction

Back to AR

You can see spatially registered AR… how can you interact with it?

Page 105: 2013 Lecture 5: AR Tools and Interaction

Interaction Tasks   2D (from [Foley]):   Selection, Text Entry, Quantify, Position

  3D (from [Bowman]):  Navigation (Travel/Wayfinding)   Selection  Manipulation   System Control/Data Input

  AR: 2D + 3D tasks and... more specific tasks?

[Foley] Foley, J. D., Wallace, V. L., & Chan, P. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications, Nov. 1984, pp. 13-48. [Bowman] Bowman, D., Kruijff, E., LaViola, J., & Poupyrev, I. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2005.

Page 106: 2013 Lecture 5: AR Tools and Interaction

AR Interfaces as Data Browsers

  2D/3D virtual objects are registered in 3D   “VR in Real World”

  Interaction   2D/3D virtual viewpoint control

  Applications   Visualization, training

Page 107: 2013 Lecture 5: AR Tools and Interaction

AR Information Browsers   Information is registered to real-world context   Hand-held AR displays

  Interaction   Manipulation of a window into information space

  Applications   Context-aware information displays

Rekimoto, et al. 1997

Page 108: 2013 Lecture 5: AR Tools and Interaction

Architecture

Page 109: 2013 Lecture 5: AR Tools and Interaction

Current AR Information Browsers   Mobile AR

  GPS + compass

  Many Applications   Layar   Wikitude   Acrossair   PressLite   Yelp   AR Car Finder   …
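The GPS + compass combination these browsers rely on reduces to simple geometry: compute the bearing from the user to each point of interest and compare it with the compass heading to decide where (or whether) to draw the label. A minimal Python sketch; the function names and the flat field-of-view mapping are illustrative assumptions, not code from any of the browsers listed.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from the user (lat1, lon1) to a POI (lat2, lon2),
    in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, heading, fov_deg=60, width_px=640):
    """Horizontal pixel position of a POI, or None if it lies
    outside the camera's horizontal field of view."""
    # Signed angular offset of the POI from the view direction, in (-180, 180]
    offset = (poi_bearing - heading + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None
    return int(width_px * (0.5 + offset / fov_deg))
```

For example, a POI due east of the user appears centered when the compass heading is 90°, and is culled when the user faces north.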

Page 110: 2013 Lecture 5: AR Tools and Interaction

Junaio   AR Browser from Metaio

  http://www.junaio.com/

  AR browsing  GPS + compass   2D/3D object placement   Photos/live video  Community viewing

Page 111: 2013 Lecture 5: AR Tools and Interaction
Page 112: 2013 Lecture 5: AR Tools and Interaction

Web Interface

Page 113: 2013 Lecture 5: AR Tools and Interaction

Adding Models in Web Interface

Page 114: 2013 Lecture 5: AR Tools and Interaction

Advantages and Disadvantages

  Important class of AR interfaces  Wearable computers   AR simulation, training

  Limited interactivity  Modification of virtual content is difficult

Rekimoto, et al. 1997

Page 115: 2013 Lecture 5: AR Tools and Interaction

3D AR Interfaces

  Virtual objects displayed in 3D physical space and manipulated   HMDs and 6DOF head-tracking   6DOF hand trackers for input

  Interaction   Viewpoint control   Traditional 3D user interface interaction: manipulation, selection, etc.

Kiyokawa, et al. 2000

Page 116: 2013 Lecture 5: AR Tools and Interaction

AR 3D Interaction

Page 117: 2013 Lecture 5: AR Tools and Interaction

AR Graffiti

www.nextwall.net

Page 118: 2013 Lecture 5: AR Tools and Interaction

Advantages and Disadvantages   Important class of AR interfaces   Entertainment, design, training

  Advantages   User can interact with 3D virtual objects everywhere in space   Natural, familiar interaction

  Disadvantages   Usually no tactile feedback   User has to use different devices for virtual and physical objects

Oshima, et al. 2000

Page 119: 2013 Lecture 5: AR Tools and Interaction

Augmented Surfaces and Tangible Interfaces   Basic principles

  Virtual objects are projected on a surface

  Physical objects are used as controls for virtual objects

  Support for collaboration

Page 120: 2013 Lecture 5: AR Tools and Interaction

Augmented Surfaces

  Rekimoto, et al. 1998   Front projection  Marker-based tracking  Multiple projection surfaces

Page 121: 2013 Lecture 5: AR Tools and Interaction

Tangible User Interfaces (Ishii 97)   Create digital shadows for physical objects

  Foreground   graspable UI

  Background   ambient interfaces

Page 122: 2013 Lecture 5: AR Tools and Interaction

Tangible Interfaces - Ambient   Dangling String   Jeremijenko 1995   Ambient ethernet monitor   Relies on peripheral cues

  Ambient Fixtures   Dahley, Wisneski, Ishii 1998   Use natural material qualities for information display

Page 123: 2013 Lecture 5: AR Tools and Interaction

Tangible Interface: ARgroove   Collaborative Instrument   Exploring Physically Based Interaction

  Map physical actions to MIDI output -  Translation, rotation -  Tilt, shake
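The kind of pose-to-MIDI mapping this implies can be sketched in a few lines: marker rotation, tilt, and height are normalized into 7-bit MIDI controller values (0-127). The ranges and function names below are illustrative assumptions, not the actual ARgroove code.

```python
def pose_to_midi(rotation_deg, tilt_deg, height_mm,
                 max_tilt=45.0, max_height=300.0):
    """Map a tracked marker pose to three MIDI controller values (0-127).

    Rotation wraps around 360 degrees; tilt and height are clamped
    to their assumed working ranges before scaling.
    """
    def clamp01(v):
        return max(0.0, min(1.0, v))

    rot_cc = int((rotation_deg % 360) / 360 * 127)
    tilt_cc = int(clamp01(abs(tilt_deg) / max_tilt) * 127)
    height_cc = int(clamp01(height_mm / max_height) * 127)
    return rot_cc, tilt_cc, height_cc
```

Each frame, the three values would then be sent as continuous-controller messages to a synthesizer, so turning or tilting the record sleeve changes the sound continuously.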

Page 124: 2013 Lecture 5: AR Tools and Interaction

ARgroove in Use

Page 125: 2013 Lecture 5: AR Tools and Interaction

Visual Feedback   Continuous Visual Feedback is Key   Single Virtual Image Provides:   Rotation   Tilt  Height

Page 126: 2013 Lecture 5: AR Tools and Interaction

i/O Brush (Ryokai, Marti, Ishii)

Page 127: 2013 Lecture 5: AR Tools and Interaction

Other Examples   Triangles (Gorbet 1998)   Triangle-based storytelling

  ActiveCube (Kitamura 2000-)  Cubes with sensors

Page 128: 2013 Lecture 5: AR Tools and Interaction

Lessons from Tangible Interfaces   Physical objects make us smart  Norman’s “Things that Make Us Smart”   encode affordances, constraints

  Objects aid collaboration   establish shared meaning

  Objects increase understanding   serve as cognitive artifacts

Page 129: 2013 Lecture 5: AR Tools and Interaction

TUI Limitations

  Difficult to change object properties   can’t tell state of digital data

  Limited display capabilities   projection screen = 2D   dependent on physical display surface

  Separation between object and display   ARgroove

Page 130: 2013 Lecture 5: AR Tools and Interaction

Advantages and Disadvantages

  Advantages  Natural - users’ hands are used for interacting with both virtual and real objects -  No need for special-purpose input devices

  Disadvantages   Interaction is limited to the 2D surface -  Full 3D interaction and manipulation is difficult

Page 131: 2013 Lecture 5: AR Tools and Interaction

Orthogonal Nature of AR Interfaces

Page 132: 2013 Lecture 5: AR Tools and Interaction

Back to the Real World

  AR overcomes limitation of TUIs   enhance display possibilities  merge task/display space   provide public and private views

  TUI + AR = Tangible AR   Apply TUI methods to AR interface design

Page 133: 2013 Lecture 5: AR Tools and Interaction

  Space-multiplexed   Many devices, each with one function -  Quicker to use, more intuitive, but more clutter -  Real toolbox

  Time-multiplexed   One device with many functions -  Space efficient -  mouse
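The distinction can be made concrete in a few lines: a time-multiplexed device needs an explicit mode switch before each new task, while a space-multiplexed toolbox selects the function simply by picking the tool. A toy Python sketch; class and method names are illustrative:

```python
class TimeMultiplexedTool:
    """One physical device whose function depends on the current mode
    (like a mouse with a tool palette)."""
    def __init__(self, modes):
        self.modes = modes               # mode name -> action function
        self.current = next(iter(modes)) # start in the first mode

    def switch(self, mode):
        # Extra step required before the device can perform a new task
        self.current = mode

    def use(self, target):
        return self.modes[self.current](target)


class SpaceMultiplexedToolbox:
    """Many single-purpose devices; picking one up selects the
    function directly (like a real toolbox)."""
    def __init__(self, tools):
        self.tools = tools               # tool name -> action function

    def use(self, tool, target):
        return self.tools[tool](target)  # no mode switch needed
```

The trade-off shows up directly: the toolbox needs one object per function (clutter), while the single device saves space at the cost of a mode-switching step.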

Page 134: 2013 Lecture 5: AR Tools and Interaction

Tangible AR: Tiles (Space Multiplexed)

  Tiles semantics   data tiles   operation tiles

  Operation on tiles   proximity   spatial arrangements   space-multiplexed

Page 135: 2013 Lecture 5: AR Tools and Interaction

Space-multiplexed Interface

Data authoring in Tiles

Page 136: 2013 Lecture 5: AR Tools and Interaction

Proximity-based Interaction
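Proximity-based interaction as used in Tiles can be implemented by thresholding the distance between tracked markers: when an operation tile comes within a few centimetres of a data tile, the operation fires. A minimal sketch under assumed names and units (millimetres); the tile model is illustrative, not the Tiles system's actual code.

```python
import math

def proximity_events(tiles, threshold=50.0):
    """Return (operation, data) tile id pairs whose tracked positions
    are within `threshold` mm of each other.

    `tiles` maps tile id -> (kind, (x, y, z)); kind is 'op' or 'data'.
    """
    ops = [(tid, pos) for tid, (kind, pos) in tiles.items() if kind == "op"]
    data = [(tid, pos) for tid, (kind, pos) in tiles.items() if kind == "data"]
    # A pair triggers when the two markers are close enough in space
    return [(o, d) for o, op_pos in ops for d, d_pos in data
            if math.dist(op_pos, d_pos) < threshold]
```

Each triggered pair would then invoke the operation tile's semantics (copy, delete, help, etc.) on the nearby data tile.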

Page 137: 2013 Lecture 5: AR Tools and Interaction

Object-Based Interaction: MagicCup   Intuitive Virtual Object Manipulation on a Table-Top Workspace

  Time-multiplexed  Multiple Markers -  Robust Tracking

  Tangible User Interface -  Intuitive Manipulation

  Stereo Display -  Good Presence

Page 138: 2013 Lecture 5: AR Tools and Interaction

Our system

 Main table, Menu table, Cup interface

Page 139: 2013 Lecture 5: AR Tools and Interaction
Page 140: 2013 Lecture 5: AR Tools and Interaction

Tangible AR: Time-multiplexed Interaction

  Use of natural physical object manipulations to control virtual objects

  VOMAR Demo  Catalog book: -  Turn over the page

  Paddle operation: -  Push, shake, incline, hit, scoop
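Paddle gestures like these are typically recognized from the tracked paddle pose: incline from the angle of the paddle's normal vector, shake from rapid direction reversals of its position over recent frames. A heuristic Python sketch; the thresholds and function names are assumptions, not VOMAR's actual recognizers.

```python
import math

def tilt_deg(normal):
    """Angle in degrees between the paddle's normal vector and
    vertical (0 = paddle lying flat)."""
    nx, ny, nz = normal
    mag = math.sqrt(nx * nx + ny * ny + nz * nz)
    return math.degrees(math.acos(nz / mag))

def is_shake(positions, min_reversals=3, min_amplitude=20.0):
    """Heuristic shake detector: count direction reversals along x
    in a short window of recent paddle positions (mm)."""
    xs = [p[0] for p in positions]
    reversals = 0
    for a, b, c in zip(xs, xs[1:], xs[2:]):
        if (b - a) * (c - b) < 0:   # velocity changed sign
            reversals += 1
    return reversals >= min_reversals and max(xs) - min(xs) >= min_amplitude
```

"Incline" could then fire when `tilt_deg` exceeds some angle (dropping the object held on the paddle), while back-and-forth motion within the window registers as a shake.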

Page 141: 2013 Lecture 5: AR Tools and Interaction

VOMAR Interface

Page 142: 2013 Lecture 5: AR Tools and Interaction

Advantages and Disadvantages

  Advantages   Natural interaction with virtual and physical tools -  No need for special-purpose input devices

  Spatial interaction with virtual objects -  3D manipulation of virtual objects anywhere in physical space

  Disadvantages   Requires a head-mounted display

Page 143: 2013 Lecture 5: AR Tools and Interaction

Wrap-up   Browsing Interfaces

  simple (conceptually!), unobtrusive

  3D AR Interfaces   expressive, creative, require attention

  Tangible Interfaces   Embedded into conventional environments

  Tangible AR  Combines TUI input + AR display

Page 144: 2013 Lecture 5: AR Tools and Interaction

AR User Interface: Categorization

  Traditional Desktop: keyboard, mouse, joystick (with or without 2D/3D GUI)

  Specialized/VR Device: 3D VR devices, specially designed devices

Page 145: 2013 Lecture 5: AR Tools and Interaction

AR User Interface: Categorization   Tangible Interface: using physical objects

  Hand/Touch Interface: using pose and gesture of hands, fingers

  Body Interface: using movement of the body

Page 146: 2013 Lecture 5: AR Tools and Interaction

AR User Interface: Categorization   Speech Interface: voice, speech control   Multimodal Interface: gesture + speech   Haptic Interface: haptic feedback   Eye Tracking, Physiological, Brain-Computer Interfaces...

Page 147: 2013 Lecture 5: AR Tools and Interaction

Resources

Page 148: 2013 Lecture 5: AR Tools and Interaction

Websites

  Software Download   http://artoolkit.sourceforge.net/

  ARToolKit Documentation   http://www.hitl.washington.edu/artoolkit/

  ARToolKit Forum   http://www.hitlabnz.org/wiki/Forum

  ARToolworks Inc   http://www.artoolworks.com/

Page 149: 2013 Lecture 5: AR Tools and Interaction

  ARToolKit Plus   http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php

  osgART   http://www.osgart.org/

  FLARToolKit   http://www.libspark.org/wiki/saqoosha/FLARToolKit/

  FLARManager   http://words.transmote.com/wp/flarmanager/

Page 150: 2013 Lecture 5: AR Tools and Interaction

Project Assignment   Design/related-work exercise   Individual

  Each person finds 2 relevant papers/videos/websites  Write a two-page literature review

  As a team: prototype design   Sketch out the user interface of the application  Design the interaction flow/screen mockups   3-minute presentation in class, August 16th