2013 Lecture 5: AR Tools and Interaction
DESCRIPTION
Lecture 5 from the COSC 426 Graduate course on Augmented Reality. This lecture covers AR development tools and interaction styles. Taught by Mark Billinghurst from the HIT Lab NZ at the University of Canterbury. August 9th, 2013.
TRANSCRIPT
COSC 426: Augmented Reality
Mark Billinghurst
August 9th 2013
Lecture 5: AR Tools and Interaction
Interaction Design Process
Data Gathering Techniques (1)
Questionnaires Looking for specific information Qualitative and quantitative results Good for getting data from a large, dispersed group
Interviews Good for exploring issues, using props Structured, unstructured or semi-structured But they are time consuming, and it is difficult to visit everyone
Data Gathering Techniques (2)
Workshops or focus groups Group interviews/activities Good at gaining a consensus view and/or highlighting areas of conflict
Observations Spending time with users in day-to-day tasks Good for understanding task context, but requires time and can result in a huge amount of data
Documentation Procedures and rules written down in manuals
Elaboration and Reduction Elaborate - generate solutions. These are the opportunities Reduce - decide on the ones worth pursuing Repeat - elaborate and reduce again on those solutions
Source: Laseau,P. (1980) Graphic Thinking for Architects & Designers. John Wiley and Sons
Tools for Effective Design
Personas Scenarios Storyboards Wireframes and Mock-ups Prototypes
Persona Technique
• Personas are a design tool to help visualize who you are designing for and imagine how a person will use the product
• A persona is an archetype that represents the behavior and goals of a group of users
• Based on insights and observations from customer research
• Not real people, but synthesised from real user characteristics
• Bring them to life with a name, characteristics, goals, background
• Develop multiple personas
Gunther the Ad Guy
Gunther is from Germany. He travels extensively for work and, as he is an advertising executive, he needs to present concepts to clients quickly and easily. He is very well-versed in new technologies and wishes he had easier portable solutions for his presentations…
Scenarios
Usage scenarios are narrative descriptions of how the product meets the needs of a persona
Short (2 pages max) Focus on unmet needs of persona Concrete story Set of stories around essential tasks, problems... Use to test ideas
Storyboarding
Sequence of sketches showing use of system in everyday use context
Concrete example Easier (faster) to grasp than text-based stories Means of communication with users and system developers Sketches, not drawings... Use to test interaction and make sure design works
Sketching is about design
Sketching is not about drawing; it is about design.
Sketching is a tool to help you: - express - develop, and - communicate design ideas
Sketching is part of a process: - idea generation, - design elaboration - design choices, - engineering
Sketch vs. Prototype
Sketch | Prototype
Invite | Attend
Suggest | Describe
Explore | Refine
Question | Answer
Propose | Test
Provoke | Resolve
Tentative, non-committal | Specific depiction
The primary differences are in the intent
Types of Prototypes
Low Fidelity – quick and dirty, easy access materials like cardboard and paper.
High Fidelity – more involved electronic versions similar in materials to final product.
RAPID Prototyping
Fast and inexpensive Identifies problems before they’re coded Elicits more and better feedback from users Helps developers think creatively Gets users and other stakeholders involved early
Fosters teamwork and communication Avoids opinion wars Helps decide design directions
Paper Prototyping (Low Fidelity)
Quick and simple means of sketching interfaces Use office materials Easier to criticize, quick to change Creative process (develop in team) Can also use for usability test (focus on flow of interaction rather than visuals) Used a lot to test out concepts before real design begins.
Paper Prototyping
High-fidelity prototyping
• Uses materials that you would expect to be in the final product.
• Prototype looks more like the final system than a low-fidelity version.
• For a high-fidelity software prototype, common environments include Macromedia Director, Visual Basic, and Smalltalk.
• Danger that users think they have a full system… (see compromises)
Rapid Prototyping
Speed development time with quick hardware mockups: handheld device connected to PC, LCD screen, USB phone keypad, camera
Can use PC development tools for rapid development: Flash, Visual Basic, etc
AR Tools
(Layer diagram: experiences built on applications, built on tools, built on components. Image: Sony CSL © 2004)
Building Compelling AR Experiences
Tracking, Display
Authoring
AR Authoring Tools
Low Level Software Libraries: osgART, Studierstube, MXRToolKit
Plug-ins to existing software: DART (Macromedia Director), mARx, Unity
Stand Alone: AMIRE, BuildAR, Metaio Creator, etc
Rapid Prototyping Tools: Flash, OpenFrameworks, Processing, Arduino, etc
Next Generation: iaTAR (Tangible AR)
ARToolKit (Kato 1998)
Open source – computer vision based AR tracking http://artoolkit.sourceforge.net/
ARToolKit Structure
Three key libraries:
AR32.lib – ARToolKit image processing functions
ARgsub32.lib – ARToolKit graphics functions
ARvideo.lib – DirectShow video capture class
Software
Cross platform: Windows, Mac, Linux, IRIX, Symbian, iPhone, etc
Additional basic libraries: video capture library (Video4Linux, VisionSDK), OpenGL, GLUT
Requires a rendering library: Open VRML, Open Inventor, osgART, etc
Additional Software
ARToolKit just provides tracking; for an AR application you'll need more software:
High level rendering library: Open VRML, Open Inventor, osgART, etc
Audio library: Fmod, etc
Peripheral support
What does ARToolKit Calculate?
Position of markers in the camera coordinates
Pose of markers in the camera coordinates
Output format: a 3x4 matrix representing the transformation from the marker coordinates to the camera coordinates
Coordinate Systems
From Marker To Camera Rotation & Translation
TCM : 4x4 transformation matrix from marker coord. to camera coord.
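Written out, the relation between marker and camera coordinates is (using the homogeneous 4x4 form of TCM; ARToolKit returns its top three rows, the 3x4 [R | t] block, in marker_trans):

\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = T_{CM} \begin{bmatrix} X_m \\ Y_m \\ Z_m \\ 1 \end{bmatrix}, \qquad T_{CM} = \begin{bmatrix} R_{11} & R_{12} & R_{13} & t_x \\ R_{21} & R_{22} & R_{23} & t_y \\ R_{31} & R_{32} & R_{33} & t_z \\ 0 & 0 & 0 & 1 \end{bmatrix}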
An ARToolKit Application
Initialization: load camera and pattern parameters
Main Loop:
Step 1. Image capture and display
Step 2. Marker detection
Step 3. Marker identification
Step 4. Getting pose information
Step 5. Object interactions/simulation
Step 6. Display virtual objects
End Application: camera shut down
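Put together, a minimal skeleton of the main loop might look like the sketch below. It only uses ARToolKit calls shown in the samples that follow; thresh, marker_center, marker_width and marker_trans are assumed to be globals set up as in those samples, and error handling and application-specific drawing are omitted.

/* skeleton mainLoop following steps 1-6 (sketch, not the full sample code) */
static void mainLoop(void)
{
    ARUint8      *dataPtr;
    ARMarkerInfo *marker_info;
    int           marker_num;

    /* Step 1: capture and display a video frame */
    if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) { arUtilSleep(2); return; }
    argDrawMode2D();
    argDispImage( dataPtr, 0, 0 );

    /* Steps 2-3: detect and identify markers in the frame */
    arDetectMarker( dataPtr, thresh, &marker_info, &marker_num );
    arVideoCapNext();

    /* Step 4: get pose information for the first detected marker */
    if( marker_num > 0 ) {
        arGetTransMat( &marker_info[0], marker_center, marker_width, marker_trans );
        /* Steps 5-6: update interactions/simulation and draw virtual objects here */
    }

    argSwapBuffers();
}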
Sample1.c Main Function
main()
{
    init();
    argMainLoop( mouseEvent, keyEvent, mainLoop );
}
Sample1.c - mainLoop Function
if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
    arUtilSleep(2);
    return;
}
argDrawMode2D();
argDispImage( dataPtr, 0, 0 );
arVideoCapNext();
argSwapBuffers();
Ex. 2: Detecting a Marker
Program: sample2.c
Key points:
Threshold value
Important external variables:
arDebug – keep thresholded image
arImage – pointer to thresholded image
arImageProcMode – use 50% image for image processing
- AR_IMAGE_PROC_IN_FULL
- AR_IMAGE_PROC_IN_HALF
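For example, these externals are typically set once at initialization (a small sketch, using the values named above):

/* keep the thresholded image around for debugging (available via arImage) */
arDebug = 1;
/* run marker detection on a half-resolution image for speed */
arImageProcMode = AR_IMAGE_PROC_IN_HALF;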
Sample2.c – marker detection
/* detect the markers in the video frame */
if( arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0 ) {
    cleanup();
    exit(0);
}
for( i = 0; i < marker_num; i++ ) {
    argDrawSquare(marker_info[i].vertex, 0, 0);
}
Making a pattern template
Use the mk_patt.exe utility program:
Show the pattern to the camera
Put the corner of the red line segments on the top-left vertex of the marker
Pattern stored as a template in a file
The 1:2:1 ratio determines the pattern region used
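Once trained with mk_patt, the pattern file is loaded at initialization and matched against detected markers. A small sketch using the standard arLoadPatt call and the marker_info fields from the samples (patt.hiro is just an example filename):

int patt_id;

/* at init time: load the trained pattern template */
if( (patt_id = arLoadPatt("Data/patt.hiro")) < 0 ) {
    printf("Pattern load error!\n");
    exit(1);
}

/* after arDetectMarker: pick the best (highest-confidence) match for this pattern */
int k = -1;
for( int i = 0; i < marker_num; i++ ) {
    if( marker_info[i].id == patt_id ) {
        if( k == -1 || marker_info[k].cf < marker_info[i].cf ) k = i;
    }
}
/* k now indexes the detected marker matching the pattern, or -1 if it is not visible */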
Ex. 4 – Getting 3D information
Program: sample4.c
Key points:
Definition of a real marker
Transformation matrix
- Rotation component
- Translation component
Sample4.c – Transformation matrix
double marker_center[2] = {0.0, 0.0}; double marker_width = 80.0; double marker_trans[3][4];
arGetTransMat(&marker_info[i], marker_center, marker_width, marker_trans);
Finding the Camera Position
This function sets the transformation matrix from marker to camera into marker_trans[3][4]:
arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);
You can read the position information from the values of marker_trans[3][4]:
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
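The values above are the marker's position in camera coordinates. If the camera's position in marker coordinates is wanted instead, one sketch is to invert the transform with ARToolKit's arUtilMatInv utility:

double cam_trans[3][4];

/* invert the marker->camera transform to get camera->marker */
arUtilMatInv( marker_trans, cam_trans );

/* camera position expressed in marker coordinates */
double camX = cam_trans[0][3];
double camY = cam_trans[1][3];
double camZ = cam_trans[2][3];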
ARToolKit Coordinate Frame
Ex. 5- Virtual Object Display
Program: sample5.c
Key points:
OpenGL parameter setting
Setup of projection matrix
Setup of modelview matrix
Appending your own OpenGL code
Set the camera parameters into the OpenGL projection matrix:
argDrawMode3D();
argDraw3dCamera( 0, 0 );
Set the transformation matrix from the marker to the camera as the OpenGL ModelView matrix:
argConvGlpara(marker_trans, gl_para);
glMatrixMode(GL_MODELVIEW);
glLoadMatrixd( gl_para );
After calling these functions, your OpenGL objects are drawn in the real marker coordinates.
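Put together, a minimal draw routine might look like the sketch below (based on the calls above; a GLUT cube stands in for real content, and lighting/depth-test setup is omitted):

static void draw( double trans[3][4] )
{
    double gl_para[16];

    /* set the camera parameters as the OpenGL projection matrix */
    argDrawMode3D();
    argDraw3dCamera( 0, 0 );

    /* load the marker-to-camera transform as the ModelView matrix */
    argConvGlpara( trans, gl_para );
    glMatrixMode( GL_MODELVIEW );
    glLoadMatrixd( gl_para );

    /* draw in marker coordinates: a 40 mm cube resting on the marker */
    glTranslatef( 0.0f, 0.0f, 20.0f );
    glutSolidCube( 40.0 );
}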
ARToolKit in the World
Hundreds of projects Large research community
ARToolKit Family
ARToolKit ARToolKit NFT
ARToolKit (Symbian)
NyToolKit - Java, C#, - Android, WM
JARToolKit (Java)
FLARToolKit (Flash)
FLARManager (Flash)
FLARToolKit Flash AS3 version of the ARToolKit
(ported from NyARToolkit, the Java version of the ARToolKit)
Enables augmented reality in the browser
Uses Papervision3D as its 3D engine
Available at http://saqoosha.net/
Dual license: GPL and commercial
(Stack: FLARToolkit on top of Papervision 3D on top of Adobe Flash)
AR Application Components
private function mainEnter(e:Event):void {
    /* Capture video frame */
    capture.draw(vid);

    /* Detect marker */
    if (detector.detectMarkerLite(raster, 80) && detector.getConfidence() > 0.5) {
        // Get the transformation matrix for the current marker position
        detector.getTransformMatrix(trans);

        // Translate and rotate the mainContainer so it looks right
        mainContainer.setTransformMatrix(trans);

        // Render the Papervision scene
        renderer.render();
    }
}
FLARToolKit Examples
Papervision 3D http://www.papervision3d.org/
Flash-based 3D engine
Supports: import of 3D models, texturing, animation, scene graph
Alternatives: Away3D, Sandy, …
Source Packages „Original“ FLARToolkit (Libspark, Saqoosha) (
http://www.libspark.org/svn/as3/FLARToolKit/trunk/ )
Start-up-guides Saqoosha (http://saqoosha.net/en/flartoolkit/start-up-guide/ )
Miko Haapoja (http://www.mikkoh.com/blog/?p=182 )
„Frameworks“ Squidder MultipleMarker – Example (
http://www.squidder.com/2009/03/06/flar-how-to-multiple-instances-of-multiple-markers/ )
FLARManager (http://words.transmote.com/wp/flarmanager/ )
Other Languages NyARToolKit
http://nyatla.jp/nyartoolkit/wp/ AS3, Java, C#, Processing, Unity, etc
openFrameworks http://www.openframeworks.cc/ https://sites.google.com/site/ofauckland/examples/8-artoolkit-example
Support for other libraries - Kinect, Audio, Physics, etc
void testApp::update(){
    // capture video and detect markers
    mov.update();
    if (mov.isFrameNew()) {
        img.setFromPixels(mov.getPixels(), ofGetWidth(), ofGetHeight());
        gray = img;
        tracker.setFromPixels(gray.getPixels());
    }
}

//--------------------------------------------------------------
void testApp::draw(){
    // draw AR objects
    ofSetColor(0xffffff);
    mov.draw(0, 0);
    for (int i = 0; i < tracker.markers.size(); i++) {
        ARMarkerInfo &m = tracker.markers[i];
        tracker.loadMarkerModelViewMatrix(m);
        ofSetColor(255, 255, 0, 100);
        ofCircle(0, 0, 25);
        ofSetColor(0);
        ofDrawBitmapString(ofToString(m.id), 0, 0);
    }
}
Low Level Mobile AR Tools
Vuforia Tracking Library (Qualcomm)
Vuforia.com
iOS, Android
Computer vision based tracking: marker tracking, 3D objects, frame markers
Integration with Unity: interaction, model loading
OSGART Programming Library
Integration of ARToolKit with a high-level rendering engine (OpenSceneGraph)
osgART = OpenSceneGraph + ARToolKit
Supporting geometric + photometric registration
osgART: Features
C++ (but also Python, Lua, etc.)
Multiple video input support: direct (FireWire/USB camera), files, network; via ARvideo, PtGrey, CVCam, VideoWrapper, etc.
Benefits of Open Scene Graph: rendering engine, plug-ins, etc.
mARx Plug-in
3D Studio Max Plug-in Can model and view AR content at the same time
BuildAR
http://www.buildar.co.nz/ Stand alone application Visual interface for AR model viewing application Enables non-programmers to build AR scenes
Metaio Creator
Drag and drop AR http://www.metaio.com/creator/
Total Immersion D'Fusion Studio
Complete commercial authoring platform
http://www.t-immersion.com/
Multi-platform, markerless tracking, scripting, face tracking, finger tracking, Kinect support
Others
AR-Media (http://www.inglobetechnologies.com/): Google SketchUp plug-in
LinceoVR (http://linceovr.seac02.it/): AR/VR authoring package
Libraries: JARToolKit, MXRToolKit, ARLib, Goblin XNA
More Libraries
JARToolKit, MRToolKit, MXRToolKit, ARLib, OpenVIDIA, DWARF, Goblin XNA, AMIRE, D'Fusion
Advanced Authoring: iaTAR (Lee 2004)
Immersive AR Authoring Using real objects to create AR applications
osgART
Developing Augmented Reality Applications with osgART
What is a Scene Graph?
Tree-like structure for organising a virtual world, e.g. VRML
Hierarchy of nodes that define: groups (and switches, sequences etc…), transformations, projections, geometry, … (a minimal code sketch follows)
And states and attributes that define: materials and textures, lighting and blending, …
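As a concrete illustration, a minimal OpenSceneGraph hierarchy built with the standard osg::Group, osg::MatrixTransform and osgDB::readNodeFile APIs (the model filename is only a placeholder):

#include <osg/Group>
#include <osg/MatrixTransform>
#include <osgDB/ReadFile>

osg::ref_ptr<osg::Group> buildScene()
{
    // root group node of the scene graph
    osg::ref_ptr<osg::Group> root = new osg::Group;

    // transformation node: everything below it is shifted 100 units along X
    osg::ref_ptr<osg::MatrixTransform> xform =
        new osg::MatrixTransform(osg::Matrix::translate(100.0, 0.0, 0.0));
    root->addChild(xform.get());

    // geometry leaf node loaded through an osgDB plugin (placeholder filename)
    xform->addChild(osgDB::readNodeFile("cow.osg"));

    return root;
}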
Scene Graph Example
Benefits of a Scene Graph
Performance: structuring data facilitates optimization (culling, state management, etc…)
Abstraction: the underlying graphics pipeline is hidden; low-level programming ("how do I display this?") is replaced with high-level concepts ("what do I want to display?")
Image: sgi
About Open Scene Graph http://www.openscenegraph.org/ Open-source scene graph implementation Based on OpenGL Object-oriented C++ following design pattern principles Used for simulation, games, research, and industrial projects Active development community
Maintained by Robert Osfield ~2000 mailing list subscribers Documentation project: www.osgbooks.com
Uses the OSG Public License (similar to LGPL)
About Open Scene Graph (2)
Example projects: Pirates of the XXI Century, Flightgear, 3DVRII Research Institute, EOR, SCANeR, VRlab Umeå University
Open Scene Graph Features Plugins for loading and saving
3D: 3D Studio (.3ds), OpenFlight (.flt), Wavefront (.obj)… 2D: .png, .jpg, .bmp, QuickTime movies
NodeKits to extend functionality e.g. osgShadow
Cross-platform support for: Window management (osgViewer) Threading (OpenThreads)
Open Scene Graph Architecture
Plugins read and write 2D image and 3D model files
NodeKits extend core functionality, exposing higher-level node types
Core scene graph and rendering functionality
Inter-operability with other environments, e.g. Python
Some Open Scene Graph Demos
You may want to get the OSG data package: Via SVN: http://www.openscenegraph.org/svn/osg/OpenSceneGraph-Data/trunk
osgviewer osgmotionblur osgparticle
osgreflect osgdistortion osgfxbrowser
Learning OSG Check out the Quick Start Guide
Free PDF download at http://osgbooks.com/, Physical copy $13US
Join the mailing list: http://www.openscenegraph.org/projects/osg/wiki/MailingLists
Browse the website: http://www.openscenegraph.org/projects/osg Use the forum: http://forum.openscenegraph.org
Study the examples Read the source?
osgART
What is osgART? osgART adds AR to Open Scene Graph Further developed and enhanced by:
Julian Looser Hartmut Seichter Raphael Grasset
Current version 2.0, Open Source http://www.osgart.org
osgART Approach: Basic Scene Graph
Root
Transform
3D Object
[  0.988  -0.031   -0.145  0 ]
[ -0.048   0.857   -0.512  0 ]
[  0.141   0.513    0.846  0 ]
[ 10.939  29.859 -226.733  1 ]
To add video see-through AR:
Integrate live video
Apply correct projection matrix
Update tracked transformations in realtime
osgART Approach: AR Scene Graph Root
Transform
3D Object
osgART Approach: AR Scene Graph
Video Geode
Root
Transform
3D Object
Virtual Camera
Video Layer
osgART Approach: AR Scene Graph
Video Geode
Root
Transform
3D Object
Virtual Camera
Projection matrix from tracker calibration
Transformation matrix updated from marker tracking in realtime
Video Layer: full-screen quad with live texture updated from the video source; orthographic projection
osgART Architecture
Like any video see-through AR library, osgART requires video input and tracking capabilities.
(Diagram: the application sits on top of the AR library, which draws on a video source, e.g. DirectShow, and a tracking module, e.g. libAR.lib)
osgART Architecture
osgART uses a plugin architecture so that video sources and tracking technologies can be plugged in as necessary.
(Diagram: the application sits on top of osgART; video and tracker plugins plug in underneath)
Tracker plugins: ARToolKit4, ARToolKitPlus, MXRToolKit, ARLib, bazAR (work in progress), ARTag (work in progress)
Video plugins: OpenCVVideo, VidCapture, CMU1394, PointGrey SDK, VidereDesign, VideoWrapper, VideoInput, VideoSource, DSVL, Intranel RTSP
Basic osgART Tutorial
Develop a working osgART application from scratch.
Use the ARToolKit 2.72 library for tracking and video capture
osgART Tutorial 1: Basic OSG Viewer
Start with the standard Open Scene Graph viewer
We will modify this to do AR!

The basic osgViewer…

#include <osg/Group>
#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>

int main(int argc, char* argv[]) {

    // Create a viewer
    osgViewer::Viewer viewer;

    // Create a root node
    osg::ref_ptr<osg::Group> root = new osg::Group;

    // Attach root node to the viewer
    viewer.setSceneData(root.get());

    // Add relevant event handlers to the viewer
    viewer.addEventHandler(new osgViewer::StatsHandler);
    viewer.addEventHandler(new osgViewer::WindowSizeHandler);
    viewer.addEventHandler(new osgViewer::ThreadingHandler);
    viewer.addEventHandler(new osgViewer::HelpHandler);

    // Run the viewer and exit the program when the viewer is closed
    return viewer.run();
}
osgART Tutorial 2: Adding Video Add a video plugin
Load, configure, start video capture…
Add a video background Create, link to video, add to scene-graph
osgART Tutorial 2: Adding Video
New code to load and configure a video plugin:

// Preload the video plugin
int _video_id = osgART::PluginManager::getInstance()->load("osgart_video_artoolkit2");

// Load a video plugin.
osg::ref_ptr<osgART::Video> video =
    dynamic_cast<osgART::Video*>(osgART::PluginManager::getInstance()->get(_video_id));

// Check if an instance of the video stream could be created
if (!video.valid()) {
    // Without video an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize video plugin!" << std::endl;
    exit(-1);
}

// Open the video. This will not yet start the video stream but will
// get information about the format of the video which is essential
// for the connected tracker.
video->open();
osgART Tutorial 2: Adding Video
New code to add a live video background

In the main function…

osg::ref_ptr<osg::Group> videoBackground = createImageBackground(video.get());
videoBackground->getOrCreateStateSet()->setRenderBinDetails(0, "RenderBin");
root->addChild(videoBackground.get());

video->start();

The helper that builds the video layer:

osg::Group* createImageBackground(osg::Image* video) {
    osgART::VideoLayer* _layer = new osgART::VideoLayer();
    _layer->setSize(*video);
    osgART::VideoGeode* _geode = new osgART::VideoGeode(osgART::VideoGeode::USE_TEXTURE_2D, video);
    addTexturedQuad(*_geode, video->s(), video->t());
    _layer->addChild(_geode);
    return _layer;
}
osgART Tutorial 3: Tracking Add a Tracker plugin
Load, configure, link to video Add a Marker to track
Load, activate Tracked node
Create, link with marker via tracking callbacks Print out the tracking data
osgART Tutorial 3: Tracking
Load a tracking plugin and associate it with the video plugin:

int _tracker_id = osgART::PluginManager::getInstance()->load("osgart_tracker_artoolkit2");

osg::ref_ptr<osgART::Tracker> tracker =
    dynamic_cast<osgART::Tracker*>(osgART::PluginManager::getInstance()->get(_tracker_id));

if (!tracker.valid()) {
    // Without tracker an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not initialize tracker plugin!" << std::endl;
    exit(-1);
}

// get the tracker calibration object
osg::ref_ptr<osgART::Calibration> calibration = tracker->getOrCreateCalibration();

// load a calibration file
if (!calibration->load("data/camera_para.dat")) {
    // the calibration file was non-existing or couldn't be loaded
    osg::notify(osg::FATAL) << "Non existing or incompatible calibration file" << std::endl;
    exit(-1);
}

// set the image source for the tracker
tracker->setImage(video.get());

osgART::TrackerCallback::addOrSet(root.get(), tracker.get());

// create the virtual camera and add it to the scene
osg::ref_ptr<osg::Camera> cam = calibration->createCamera();
root->addChild(cam.get());
osgART Tutorial 3: Tracking
Load a marker and activate it:

osg::ref_ptr<osgART::Marker> marker = tracker->addMarker("single;data/patt.hiro;80;0;0");
if (!marker.valid()) {
    // Without marker an AR application can not work. Quit if none found.
    osg::notify(osg::FATAL) << "Could not add marker!" << std::endl;
    exit(-1);
}

marker->setActive(true);

Associate it with a transformation node (via event callbacks) and add the transformation node to the virtual camera node:

osg::ref_ptr<osg::MatrixTransform> arTransform = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransform.get(), marker.get());
cam->addChild(arTransform.get());

Add a debug callback to print out information about the tracked marker:

osgART::addEventCallback(arTransform.get(), new osgART::MarkerDebugCallback(marker.get()));
osgART Tutorial 3: Tracking
Tracking information is output to console
osgART Tutorial 4: Adding Content
Now put the tracking data to use!
Add content to the tracked transform
Basic cube code:
arTransform->addChild(osgART::testCube()); arTransform->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
osgART Tutorial 5: Adding 3D Model
Open Scene Graph can load some 3D formats directly: e.g. Wavefront (.obj), OpenFlight (.flt), 3D Studio (.3ds), COLLADA
Others need to be converted
Support for some formats is much better than others, e.g. OpenFlight good, 3ds hit and miss.
Recommended native formats: .osg and .ive
.osg – ASCII representation of scene graph
.ive – binary OSG file; can hold textures
osgExp: exporter for 3DS Max is a good choice http://sourceforge.net/projects/osgmaxexp
Otherwise .3ds files from TurboSquid can work
osgART Tutorial 5: Adding 3D Model
Replace the simple cube with a 3D model
Models are loaded using the osgDB::readNodeFile() function:

std::string filename = "media/hollow_cube.osg";
arTransform->addChild(osgDB::readNodeFile(filename));

Note: Scale is important. Units are in mm.
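If a model was authored in other units, one option is to wrap it in a scale transform before attaching it to the marker transform (a sketch using the standard osg::Matrix::scale; the factor of 10 is only illustrative, e.g. cm to mm):

// scale the loaded model (e.g. from cm to mm) before adding it to the tracked transform
osg::ref_ptr<osg::MatrixTransform> scale =
    new osg::MatrixTransform(osg::Matrix::scale(10.0, 10.0, 10.0));
scale->addChild(osgDB::readNodeFile("media/hollow_cube.osg"));
arTransform->addChild(scale.get());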
(Content pipeline: 3D Studio Max, export to .osg, load in osgART)
osgART Tutorial 6: Multiple Markers
Repeat the process so far to track more than one marker simultaneously
osgART Tutorial 6: Multiple Markers
Load and activate two markers:

osg::ref_ptr<osgART::Marker> markerA = tracker->addMarker("single;data/patt.hiro;80;0;0");
markerA->setActive(true);

osg::ref_ptr<osgART::Marker> markerB = tracker->addMarker("single;data/patt.kanji;80;0;0");
markerB->setActive(true);

Create two transformations, attach callbacks, and add models:

osg::ref_ptr<osg::MatrixTransform> arTransformA = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformA.get(), markerA.get());
arTransformA->addChild(osgDB::readNodeFile("media/hitl_logo.osg"));
arTransformA->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformA.get());

osg::ref_ptr<osg::MatrixTransform> arTransformB = new osg::MatrixTransform();
osgART::attachDefaultEventCallbacks(arTransformB.get(), markerB.get());
arTransformB->addChild(osgDB::readNodeFile("media/gist_logo.osg"));
arTransformB->getOrCreateStateSet()->setRenderBinDetails(100, "RenderBin");
cam->addChild(arTransformB.get());
osgART Tutorial 6: Multiple Markers
Basic osgART Tutorial: Summary
Standard OSG Viewer
Addition of Video
Addition of Tracking
Addition of basic 3D graphics
Addition of 3D Model
Multiple Markers
AR Interaction
(Layer diagram: experiences built on applications, built on tools, built on components. Image: Sony CSL © 2004)
Building Compelling AR Experiences
Tracking, Display
Authoring
Interaction
AR Interaction
Designing an AR System = Interface Design
Using different input and output technologies
Objective is a high quality of user experience: ease of use and learning, performance and satisfaction
User Interface and Tool
Human <-> User Interface/Tool <-> Machine/Object: the Human Machine Interface
© Andreas Dünser
Tools
User Interface
User Interface: Characteristics Input: mono or multimodal Output: mono or multisensorial Technique/Metaphor/Paradigm
© Andreas Dünser
Input
Output: sensation of movement
Metaphor: "Push" to accelerate, "Turn" to rotate
Human Computer Interface
Human <-> User Interface <-> Computer System
Human Computer Interface = Hardware + Software
Computers are everywhere now: HCI with electronic devices, home automation, transport vehicles, etc
© Andreas Dünser
More terminology
Interaction Device = input/output of the user interface
Interaction Style = category of similar interaction techniques
Interaction Paradigm
Modality (human sense)
Usability
Back to AR
You can see spatially registered AR… how can you interact with it?
Interaction Tasks
2D (from [Foley]): Selection, Text Entry, Quantify, Position
3D (from [Bowman]): Navigation (Travel/Wayfinding), Selection, Manipulation, System Control/Data Input
AR: 2D + 3D tasks and… more specific tasks?
[Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.
AR Interfaces as Data Browsers
2D/3D virtual objects are registered in 3D ("VR in the Real World")
Interaction: 2D/3D virtual viewpoint control
Applications: visualization, training
AR Information Browsers
Information is registered to real-world context
Hand held AR displays
Interaction: manipulation of a window into information space
Applications: context-aware information displays
Rekimoto, et al. 1997
Architecture
Current AR Information Browsers
Mobile AR: GPS + compass
Many applications: Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, …
Junaio AR Browser from Metaio
http://www.junaio.com/
AR browsing: GPS + compass, 2D/3D object placement, photos/live video, community viewing
Web Interface
Adding Models in Web Interface
Advantages and Disadvantages
Important class of AR interfaces: wearable computers, AR simulation, training
Limited interactivity: modification of virtual content is difficult
Rekimoto, et al. 1997
3D AR Interfaces
Virtual objects displayed in 3D physical space and manipulated
HMDs and 6DOF head-tracking, 6DOF hand trackers for input
Interaction: viewpoint control, traditional 3D user interface interaction (manipulation, selection, etc.)
Kiyokawa, et al. 2000
AR 3D Interaction
AR Graffiti
www.nextwall.net
Advantages and Disadvantages
Important class of AR interfaces: entertainment, design, training
Advantages: user can interact with 3D virtual objects everywhere in space; natural, familiar interaction
Disadvantages: usually no tactile feedback; user has to use different devices for virtual and physical objects
Oshima, et al. 2000
Augmented Surfaces and Tangible Interfaces Basic principles
Virtual objects are projected on a surface
Physical objects are used as controls for virtual objects
Support for collaboration
Augmented Surfaces
Rekimoto, et al. 1998: front projection, marker-based tracking, multiple projection surfaces
Tangible User Interfaces (Ishii 97)
Create digital shadows for physical objects
Foreground: graspable UI
Background: ambient interfaces
Tangible Interfaces - Ambient
Dangling String (Jeremijenko 1995): ambient ethernet monitor, relies on peripheral cues
Ambient fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display
Tangible Interface: ARgroove
Collaborative instrument exploring physically based interaction
Map physical actions to MIDI output: translation, rotation, tilt, shake
ARgroove in Use
Visual Feedback
Continuous visual feedback is key
A single virtual image provides: rotation, tilt, height
i/O Brush (Ryokai, Marti, Ishii)
Other Examples
Triangles (Gorbet 1998): triangle-based storytelling
ActiveCube (Kitamura 2000-): cubes with sensors
Lessons from Tangible Interfaces
Physical objects make us smart (Norman's "Things That Make Us Smart"): encode affordances, constraints
Objects aid collaboration: establish shared meaning
Objects increase understanding: serve as cognitive artifacts
TUI Limitations
Difficult to change object properties: can't tell the state of digital data
Limited display capabilities: projection screen = 2D, dependent on physical display surface
Separation between object and display (e.g. ARgroove)
Advantages and Disadvantages
Advantages: natural (users' hands are used for interacting with both virtual and real objects); no need for special purpose input devices
Disadvantages: interaction is limited to the 2D surface; full 3D interaction and manipulation is difficult
Orthogonal Nature of AR Interfaces
Back to the Real World
AR overcomes limitations of TUIs: enhance display possibilities, merge task/display space, provide public and private views
TUI + AR = Tangible AR
Apply TUI methods to AR interface design
Space-multiplexed: many devices, each with one function
- Quicker to use, more intuitive, but more clutter (e.g. a real toolbox)
Time-multiplexed: one device with many functions
- Space efficient (e.g. the mouse)
Tangible AR: Tiles (Space Multiplexed)
Tiles semantics: data tiles, operation tiles
Operations on tiles: proximity, spatial arrangements, space-multiplexed
Space-multiplexed Interface
Data authoring in Tiles
Proximity-based Interaction
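As an illustration only (not the actual Tiles implementation), proximity between two tracked markers can be estimated from the translation parts of their ARToolKit transforms; the indices a and b and the 100 mm threshold are hypothetical:

#include <math.h>

double transA[3][4], transB[3][4];

/* poses of the two tiles, as returned by arGetTransMat for markers a and b */
arGetTransMat( &marker_info[a], marker_center, marker_width, transA );
arGetTransMat( &marker_info[b], marker_center, marker_width, transB );

/* distance between the two markers in camera space (mm) */
double dx = transA[0][3] - transB[0][3];
double dy = transA[1][3] - transB[1][3];
double dz = transA[2][3] - transB[2][3];
double dist = sqrt( dx*dx + dy*dy + dz*dz );

if( dist < 100.0 ) {
    /* tiles are close together: trigger a proximity-based operation, e.g. copy data */
}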
Object Based Interaction: MagicCup
Intuitive virtual object manipulation on a table-top workspace
Time multiplexed
Multiple markers - robust tracking
Tangible user interface - intuitive manipulation
Stereo display - good presence
Our system: main table, menu table, cup interface
Tangible AR: Time-multiplexed Interaction
Use of natural physical object manipulations to control virtual objects
VOMAR Demo
Catalog book: turn over the page
Paddle operation: push, shake, incline, hit, scoop
VOMAR Interface
Advantages and Disadvantages
Advantages: natural interaction with virtual and physical tools; no need for special purpose input devices; spatial interaction with virtual objects (3D manipulation anywhere in physical space)
Disadvantages: requires a head mounted display
Wrap-up
Browsing interfaces: simple (conceptually!), unobtrusive
3D AR interfaces: expressive, creative, require attention
Tangible interfaces: embedded into conventional environments
Tangible AR: combines TUI input + AR display
AR User Interface: Categorization
Traditional Desktop: keyboard, mouse, joystick (with or without 2D/3D GUI)
Specialized/VR Device: 3D VR devices, specially designed devices
AR User Interface: Categorization
Tangible Interface: using physical objects
Hand/Touch Interface: using pose and gesture of hand, fingers
Body Interface: using movement of body
AR User Interface: Categorization
Speech Interface: voice, speech control
Multimodal Interface: gesture + speech
Haptic Interface: haptic feedback
Eye Tracking, Physiological, Brain Computer Interface…
Resources
Websites
Software Download http://artoolkit.sourceforge.net/
ARToolKit Documentation http://www.hitl.washington.edu/artoolkit/
ARToolKit Forum http://www.hitlabnz.org/wiki/Forum
ARToolworks Inc http://www.artoolworks.com/
ARToolKit Plus http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php
osgART http://www.osgart.org/
FLARToolKit http://www.libspark.org/wiki/saqoosha/FLARToolKit/
FLARManager http://words.transmote.com/wp/flarmanager/
Project Assignment
Design/related work exercise
Individual: each person finds 2 relevant papers/videos/websites and writes a two-page literature review
As a team: prototype design
Sketch out the user interface of the application
Design the interaction flow / screen mockups
3 minute presentation in class, August 16th