426 Lecture 7: Designing AR Interfaces
COSC 426: Augmented Reality
Mark Billinghurst
Sept 5th 2012
Lecture 7: Designing AR Interfaces
AR Interfaces
- Browsing interfaces: simple (conceptually!), unobtrusive
- 3D AR interfaces: expressive, creative, require attention
- Tangible interfaces: embedded into conventional environments
- Tangible AR: combines TUI input + AR display
AR Interfaces as Data Browsers
- 2D/3D virtual objects registered in 3D ("VR in the Real World")
- Interaction: 2D/3D virtual viewpoint control
- Applications: visualization, training
3D AR Interfaces
- Virtual objects displayed and manipulated in 3D physical space
- HMDs with 6DOF head tracking; 6DOF hand trackers for input
- Interaction: viewpoint control; traditional 3D user interface interaction (manipulation, selection, etc.)
Kiyokawa et al., 2000
Augmented Surfaces and Tangible Interfaces
Basic principles:
- Virtual objects are projected on a surface
- Physical objects are used as controls for the virtual objects
- Support for collaboration
Tangible Interfaces - Ambient
- Dangling String (Jeremijenko, 1995): ambient Ethernet monitor; relies on peripheral cues
- Ambient Fixtures (Dahley, Wisneski, Ishii, 1998): use natural material qualities for information display
Back to the Real World
- AR overcomes limitations of TUIs: enhanced display possibilities; merged task/display space; public and private views
- TUI + AR = Tangible AR: apply TUI methods to AR interface design
Space-multiplexed: many devices, each with one function
- Quicker to use, more intuitive, but can clutter (e.g. a real toolbox)
Time-multiplexed: one device with many functions
- Space efficient (e.g. the mouse)
Tangible AR: Tiles (Space-Multiplexed)
- Tiles semantics: data tiles; operation tiles
- Operations on tiles: proximity; spatial arrangements; space-multiplexed
Tangible AR: Time-Multiplexed Interaction
- Use natural physical object manipulations to control virtual objects
- VOMAR demo:
  - Catalog book: turn over the page
  - Paddle operations: push, shake, incline, hit, scoop
![Page 11: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/11.jpg)
experiences
applications
tools
components
Sony CSL © 2004
Building Compelling AR Experiences
Tracking, Display
Authoring
Interaction
Interface Design Path
1/ Prototype demonstration
2/ Adoption of interaction techniques from other interface metaphors
3/ Development of new interface metaphors appropriate to the medium
4/ Development of formal theoretical models for predicting and modeling user actions
(Examples along this path: desktop WIMP, virtual reality, augmented reality)
Interface Metaphors
- Designed to be similar to a physical entity but also have their own properties (e.g. the desktop metaphor, search engines)
- Exploit users' familiar knowledge, helping them understand "the unfamiliar"
- Conjure up the essence of the unfamiliar activity, enabling users to leverage it to understand more of the unfamiliar functionality
- People find it easier to learn and talk about what they are doing at the computer interface in terms familiar to them
Example: The Spreadsheet
- Analogous to a ledger sheet
- Interactive and computational
- Easy to understand
- Greatly extended what accountants and others could do
www.bricklin.com/history/refcards.htm
Why was it so good?
- It was simple, clear, and obvious to users how to use the application and what it could do
- "it is just a tool to allow others to work out their ideas and reduce the tedium of repeating the same calculations."
- Capitalized on users' familiarity with ledger sheets
- Got the computer to perform a range of different calculations in response to user input
Another Classic: the 8010 Star
- Office system targeted at workers not interested in computing per se
- Several person-years were spent at the beginning working out the conceptual model
- Simplified the electronic world, making it seem more familiar, less alien, and easier to learn
Johnson et al. (1989)
The Star interface
Benefits of Interface Metaphors
- Make learning new systems easier
- Help users understand the underlying conceptual model
- Can be innovative, making computers and their applications accessible to a greater diversity of users
Problems with Interface Metaphors (Nielsen, 1990)
- Break conventional and cultural rules (e.g. a recycle bin placed on the desktop)
- Can constrain designers in the way they conceptualize a problem
- Conflict with design principles
- Force users to understand the system only in terms of the metaphor
- Designers can inadvertently carry over the bad parts of existing designs
- Limit designers' imagination in developing new conceptual models
Microsoft Bob
PSDoom – killing processes
Interface Components
- Physical components
- Display elements (visual/audio)
- Interaction metaphors
[Figure: Input (physical elements) → Interaction Metaphor → Output (display elements)]

AR Design Principles
AR Design Space
[Continuum: Reality (physical design) — Augmented Reality — Virtual Reality (virtual design)]
Tangible AR Design Principles
Tangible AR interfaces use TUI principles:
- Physical controllers for moving virtual content
- Support for spatial 3D interaction techniques
- Time- and space-multiplexed interaction
- Support for multi-handed interaction
- Match object affordances to task requirements
- Support parallel activity with multiple objects
- Allow collaboration between multiple users
Space-multiplexed: many devices, each with one function
- Quicker to use, more intuitive, but can clutter (e.g. the Tiles interface, a toolbox)
Time-multiplexed: one device with many functions
- Space efficient (e.g. the VOMAR interface, the mouse)
Design of Objects
Objects can be:
- Purposely built (designed affordances)
- "Found" (repurposed)
- Existing (already in use in the marketplace)
Make affordances obvious (Norman):
- Make object affordances visible
- Give feedback
- Provide constraints
- Use natural mappings
- Use a good cognitive model
Object Design
Affordances: To Give a Clue
- An affordance is an attribute of an object that allows people to know how to use it (e.g. a mouse button invites pushing; a door handle affords pulling)
- Norman (1988) used the term to discuss the design of everyday objects
- It has since been much popularised in interaction design to discuss how to design interface objects (e.g. scrollbars afford moving up and down; icons afford clicking)
"...the term affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things. Plates are for pushing. Knobs are for turning. Slots are for inserting things into. Balls are for throwing or bouncing. When affordances are taken advantage of, the user knows what to do just by looking: no picture, label, or instruction needed." (Norman, The Psychology of Everyday Things 1988, p.9)
Physical Affordances
- How do the following physical objects afford? Are their affordances obvious?
'Affordance' and Interface Design
- Interfaces are virtual and do not have affordances like physical objects
- Norman argues it does not make sense to talk about interfaces in terms of 'real' affordances
- Instead, interfaces are better conceptualized in terms of 'perceived' affordances: learned conventions of arbitrary mappings between action and effect at the interface
- Some mappings are better than others
Virtual Affordances
- How do the following screen objects afford? If you were a novice user, would you know what to do with them?
AR is a mixture of physical and virtual affordances:
- Physical: tangible controllers and objects
- Virtual: virtual graphics and audio
Case Study 1: 3D AR Lens
Goal: develop a lens-based AR interface
MagicLenses:
- Developed at Xerox PARC in 1993
- View a region of the workspace differently from the rest
- Overlap MagicLenses to create composite effects
3D MagicLenses
- MagicLenses extended to 3D (Viega et al., 1996)
- Volumetric and flat lenses
AR Lens Design Principles
- Physical components: lens handle (virtual lens attached to a real object)
- Display elements: lens view (reveals layers in the dataset)
- Interaction metaphor: physically holding the lens
3D AR Lenses: Model Viewer
- Displays models made up of multiple parts
- Each part can be shown or hidden through the lens
- Allows the user to peer inside the model
- Maintains focus + context
AR Lens Demo
AR Lens Implementation
[Figure: stencil buffer masks the lens region - content outside the lens is rendered normally; content inside the lens (the virtual magnifying glass) is rendered only where the stencil test passes]
AR FlexiLens
- Real handles/controllers with a flexible AR lens
Techniques Based on AR Lenses
- Object selection: select objects by targeting them with the lens
- Information filtering: show different representations through the lens; hide certain content to reduce clutter or to look inside things
- Move between AR and VR: transition along the reality-virtuality continuum; change the viewpoint to suit the task
Case Study 2: LevelHead
- Block-based game
Case Study 2: LevelHead
- Physical components: real blocks
- Display elements: virtual person and rooms
- Interaction metaphor: blocks are rooms
Case Study 3: AR Chemistry (Fjeld 2002)
Tangible AR chemistry education
Goal: an AR application to test molecular structure in chemistry
- Physical components: real book, rotation cube, scoop, tracking markers
- Display elements: AR atoms and molecules
- Interaction metaphor: build your own molecule
AR Chemistry Input Devices
Case Study 4: Transitional Interfaces
Goal: an AR interface supporting transitions from reality to virtual reality
- Physical components: real book
- Display elements: AR and VR content
- Interaction metaphor: book pages hold virtual scenes
Milgram's Continuum (1994)
[Continuum from Reality (tangible interfaces) through Augmented Reality (AR) and Augmented Virtuality (AV) - together Mixed Reality (MR) - to Virtuality (virtual reality)]
Central hypothesis: the next generation of interfaces will support transitions along the reality-virtuality continuum
Transitions
- Interfaces of the future will need to support transitions along the RV continuum
- Augmented Reality is preferred for co-located collaboration
- Immersive Virtual Reality is preferred for experiencing a world immersively (egocentric), sharing views, and remote collaboration
The MagicBook
Design goals:
- Allow the user to move smoothly between reality and virtual reality
- Support collaboration
MagicBook Metaphor
Features
- Seamless transition between reality and virtuality: reliance on the real decreases as the virtual increases
- Supports egocentric and exocentric views: the user can pick the appropriate view
- The computer becomes invisible: consistent interface metaphors; virtual content seems real
- Supports collaboration
Collaboration
- Collaboration on multiple levels: physical object, AR object, immersive virtual space
- Egocentric + exocentric collaboration: multiple, multi-scale users
- Independent views: privacy, role division, scalability
Technology
- Reality: no technology
- Augmented Reality: camera (tracking); switch (fly in)
- Virtual Reality: compass (tracking); press pad (move); switch (fly out)
Scientific Visualization
Education
Summary
When designing AR interfaces, think of:
- Physical components: physical affordances
- Virtual components: virtual affordances
- Interface metaphors
OSGART: From Registration to Interaction
Keyboard and Mouse Interaction
- Traditional input techniques
- OSG provides a framework for handling keyboard and mouse input events (osgGA):
  1. Subclass osgGA::GUIEventHandler
  2. Handle events: mouse up/down/move/drag/scroll-wheel; key up/down
  3. Add an instance of the new handler to the viewer
Keyboard and Mouse Interaction
Create your own event handler class:

```cpp
class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
public:
    KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

    virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                        osg::Object* obj, osg::NodeVisitor* nv) {
        switch (ea.getEventType()) {
            // Possible events we can handle
            case osgGA::GUIEventAdapter::PUSH:    break;
            case osgGA::GUIEventAdapter::RELEASE: break;
            case osgGA::GUIEventAdapter::MOVE:    break;
            case osgGA::GUIEventAdapter::DRAG:    break;
            case osgGA::GUIEventAdapter::SCROLL:  break;
            case osgGA::GUIEventAdapter::KEYUP:   break;
            case osgGA::GUIEventAdapter::KEYDOWN: break;
        }
        return false;
    }
};
```

Add it to the viewer to receive events:

```cpp
viewer.addEventHandler(new KeyboardMouseEventHandler());
```
Keyboard Interaction
Handle the W, A, S, D keys to move an object:

```cpp
case osgGA::GUIEventAdapter::KEYDOWN: {
    switch (ea.getKey()) {
        case 'w': // Move forward 5mm
            localTransform->preMult(osg::Matrix::translate(0, -5, 0));
            return true;
        case 's': // Move back 5mm
            localTransform->preMult(osg::Matrix::translate(0, 5, 0));
            return true;
        case 'a': // Rotate 10 degrees left
            localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
            return true;
        case 'd': // Rotate 10 degrees right
            localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
            return true;
        case ' ': // Reset the transformation
            localTransform->setMatrix(osg::Matrix::identity());
            return true;
    }
    break;
}
```

The object is attached under the local transform the keys manipulate:

```cpp
localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
```
Keyboard Interaction Demo
Mouse Interaction
- The mouse is a pointing device: use it to select objects in an AR scene
- OSG provides methods for ray-casting and intersection testing
- These return an osg::NodePath (the path from the hit node all the way back to the root)
[Figure: a ray cast from the projection plane (screen) into the scene]
Mouse Interaction
Compute the list of nodes under the clicked position, then invoke an action on the nodes that are hit (e.g. select, delete):

```cpp
case osgGA::GUIEventAdapter::PUSH: {
    osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
    osgUtil::LineSegmentIntersector::Intersections intersections;

    // Clear previous selections
    for (unsigned int i = 0; i < targets.size(); i++) {
        targets[i]->setSelected(false);
    }

    // Find new selection based on click position
    if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
        for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
             iter != intersections.end(); iter++) {
            if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                std::cout << "HIT!" << std::endl;
                target->setSelected(true);
                return true;
            }
        }
    }
    break;
}
```
Mouse Interaction Demo
Proximity Techniques
Interaction based on:
- the distance between a marker and the camera
- the distance between multiple markers
Single Marker Techniques: Proximity
- Use the distance from the camera to the marker as an input parameter (e.g. lean in close to examine)
- Can use the osg::LOD class to show different content at different depth ranges
(Image: OpenSG Consortium)
Single Marker Techniques: Proximity

```cpp
// Load some models
osg::ref_ptr<osg::Node> farNode    = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode   = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at a different distance range
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(),    500.0f, 10000.0f); // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f);   // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(),   0.0f,   200.0f);   // Show the "near" node from 0cm to 20cm away

arTransform->addChild(lod.get());
```

Define depth ranges for each node; add as many as you want, and ranges can overlap.
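The range-selection rule applied above can be sketched without OSG (a minimal stand-in for illustration only; `LodEntry` and `selectLod` are invented names, and the real osg::LOD also supports pixel-size-based selection):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical stand-in for an LOD child list: each entry pairs a model
// name with the [min, max) camera-distance range in which it is shown.
struct LodEntry {
    std::string name;
    float minRange;
    float maxRange;
};

// Return the first entry whose range contains the given distance,
// or nullptr if no child is visible at that distance.
const LodEntry* selectLod(const std::vector<LodEntry>& entries, float distance) {
    for (const LodEntry& e : entries) {
        if (distance >= e.minRange && distance < e.maxRange) return &e;
    }
    return nullptr;
}
```

With the ranges from the slide, leaning the camera in from 80cm to 10cm would walk the selection from "far" through "closer" to "near".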
Single Marker Proximity Demo
Multiple Marker Concepts
- Interaction based on the relationship between markers: e.g. when the distance between two markers decreases below a threshold, invoke an action (tangible user interface)
- Applications: memory card games, file operations
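The trigger condition is just a distance test between the two marker translations. A minimal sketch (the type and function names are invented for illustration; in osgART the positions would come from each marker's transform):

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D position type standing in for a marker's translation.
struct Vec3 { float x, y, z; };

// Euclidean distance between two marker positions.
float distanceBetween(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// True when the two markers are close enough to invoke an action
// (e.g. flipping a card in a memory game).
bool proximityTriggered(const Vec3& a, const Vec3& b, float threshold) {
    return distanceBetween(a, b) <= threshold;
}
```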
Multiple Marker Proximity
[Scene graph: the virtual camera parents Transform A and Transform B (one per marker); each transform holds a switch (Switch A, Switch B) over two models (A1/A2, B1/B2). State shown: distance > threshold]
Multiple Marker Proximity
[Same scene graph as above (camera → Transform A/B → Switch A/B → models A1/A2, B1/B2). State shown: distance <= threshold, so each switch selects its alternate model]
Multiple Marker Proximity
Use a node callback to test for proximity and update the relevant nodes:

```cpp
virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
    if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
        if (mMarkerA->valid() && mMarkerB->valid()) {
            osg::Vec3 posA = mMarkerA->getTransform().getTrans();
            osg::Vec3 posB = mMarkerB->getTransform().getTrans();
            osg::Vec3 offset = posA - posB;
            float distance = offset.length();

            if (distance <= mThreshold) {
                if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
            } else {
                if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
            }
        }
    }

    traverse(node, nv);
}
```
Multiple Marker Proximity
Paddle Interaction
- Use one marker as a tool for selecting and manipulating objects (tangible user interface)
- Another marker provides a frame of reference
- A grid of markers can alleviate problems with occlusion
Examples: MagicCup (Kato et al.), VOMAR (Kato et al.)
Paddle Interaction
- It is often useful to adopt a local coordinate system
- Allows the camera to move without disrupting Tlocal
- Places the paddle in the same coordinate system as the content on the grid, which simplifies interaction
- osgART computes Tlocal using the osgART::LocalTransformationCallback
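The idea behind Tlocal can be sketched with plain rigid-transform math (an illustration under stated assumptions, not the osgART::LocalTransformationCallback source; the `Rigid` type and function names are invented): if the tracker reports both the grid and the paddle relative to the camera, the paddle's pose in the grid's frame is the inverse of the grid pose composed with the paddle pose, so camera motion cancels out.

```cpp
#include <cassert>
#include <cmath>

// Rigid transform (rotation + translation), standing in for a marker
// pose reported relative to the camera.
struct Rigid {
    double R[3][3];  // rotation matrix
    double t[3];     // translation
};

// Inverse of a rigid transform: R' = R^T, t' = -R^T t.
Rigid inverse(const Rigid& m) {
    Rigid out;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            out.R[i][j] = m.R[j][i];
    for (int i = 0; i < 3; i++) {
        out.t[i] = 0.0;
        for (int j = 0; j < 3; j++) out.t[i] -= out.R[i][j] * m.t[j];
    }
    return out;
}

// Compose a * b (apply b first, then a).
Rigid compose(const Rigid& a, const Rigid& b) {
    Rigid out;
    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 3; j++) {
            out.R[i][j] = 0.0;
            for (int k = 0; k < 3; k++) out.R[i][j] += a.R[i][k] * b.R[k][j];
        }
        out.t[i] = a.t[i];
        for (int k = 0; k < 3; k++) out.t[i] += a.R[i][k] * b.t[k];
    }
    return out;
}

// Paddle pose in the grid's local frame:
// Tlocal = inverse(T_camera_from_grid) * T_camera_from_paddle
Rigid localTransform(const Rigid& camFromGrid, const Rigid& camFromPaddle) {
    return compose(inverse(camFromGrid), camFromPaddle);
}
```

Because both camera-relative poses change together when the camera moves, Tlocal stays fixed as long as the paddle does not move over the grid.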
Tilt and Shake Interaction
Detect types of paddle movement:
- Tilt: gradual change in orientation
- Shake: short, sudden changes in translation
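One way to separate the two movements is to keep a short history of paddle samples: shaking shows up as rapid sign reversals in frame-to-frame translation, tilting as a steady drift in orientation. A minimal sketch (the class, window size, and thresholds are invented for illustration, not taken from any osgART API):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <deque>

// Classify paddle motion from a sliding window of samples:
//  - shake: translation reverses direction rapidly
//  - tilt:  orientation drifts steadily in one direction
class PaddleMotion {
public:
    void addSample(double x, double tiltDeg) {
        xs.push_back(x);
        tilts.push_back(tiltDeg);
        if (xs.size() > kWindow) { xs.pop_front(); tilts.pop_front(); }
    }

    // Shake: count sign reversals of frame-to-frame x motion.
    bool isShaking() const {
        if (xs.size() < 3) return false;
        int reversals = 0;
        for (std::size_t i = 2; i < xs.size(); i++) {
            double d1 = xs[i - 1] - xs[i - 2];
            double d2 = xs[i] - xs[i - 1];
            if (d1 * d2 < 0.0) reversals++;  // direction flipped
        }
        return reversals >= 3;
    }

    // Tilt: total orientation change over the window exceeds a threshold.
    bool isTilting() const {
        if (tilts.size() < 2) return false;
        return std::fabs(tilts.back() - tilts.front()) > 20.0;  // degrees
    }

private:
    static constexpr std::size_t kWindow = 10;
    std::deque<double> xs, tilts;
};
```

A real implementation would track all three translation axes and filter out tracker jitter before counting reversals.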
Building Tangible AR Interfaces with ARToolKit
Required Code
- Calculating camera position: range to marker
- Loading multiple patterns/models
- Interaction between objects: proximity; relative position/orientation
- Occlusion: stencil buffering; multi-marker tracking
Tangible AR Coordinate Frames
Local vs. Global Interactions
- Local: actions determined from a single camera-to-marker transform (shaking, appearance, relative position, range)
- Global: actions determined from two relationships (marker-to-camera and world-to-camera coordinates); the marker transform is determined in world coordinates (object tilt, absolute position, absolute rotation, hitting)
![Page 87: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/87.jpg)
Range-based Interaction
Sample File: RangeTest.c

```c
/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
```
![Page 88: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/88.jpg)
Loading Multiple Patterns
Sample File: LoadMulti.c
Uses object.c to load the objects.

Object structure:

```c
typedef struct {
    char   name[256];
    int    id;
    int    visible;
    double marker_coord[4][2];
    double trans[3][4];
    double marker_width;
    double marker_center[2];
} ObjectData_T;
```
![Page 89: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/89.jpg)
Finding Multiple Transforms
Create object list:

```c
ObjectData_T *object;
```

Read in objects - in init():

```c
read_ObjData( char *name, int *objectnum );
```

Find transform - in mainLoop():

```c
for( i = 0; i < objectnum; i++ ) {
    /* check patterns */
    /* find transforms for each marker */
}
```
![Page 90: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/90.jpg)
Drawing Multiple Objects
Send the object list to the draw function:

```c
draw( object, objectnum );
```

Draw each object individually:

```c
for( i = 0; i < objectnum; i++ ) {
    if( object[i].visible == 0 ) continue;
    argConvGlpara(object[i].trans, gl_para);
    draw_object( object[i].id, gl_para );
}
```
![Page 91: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/91.jpg)
Proximity Based Interaction
Sample File: CollideTest.c
Detect the distance between markers:

```c
checkCollisions(object[0], object[1], DIST);
```

If the distance is less than the collide distance, change the model / perform the interaction.
![Page 92: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/92.jpg)
Multi-marker Tracking
Sample File: multiTest.c
- Multiple markers to establish a single coordinate frame
- Reading in a configuration file
- Tracking from sets of markers
- Careful camera calibration
![Page 93: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/93.jpg)
MultiMarker Configuration File
Sample File: Data/multi/marker.dat
Contains a list of all the patterns and their exact positions:

```
# the number of patterns to be recognized
6

# marker 1
Data/multi/patt.a
40.0
0.0 0.0
1.0000 0.0000 0.0000 -100.0000
0.0000 1.0000 0.0000   50.0000
0.0000 0.0000 1.0000    0.0000
…
```

For each marker the file gives:
- Pattern file
- Pattern width + coordinate origin
- Pattern transform relative to the global origin
![Page 94: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/94.jpg)
Camera Transform Calculation
- Include <AR/arMulti.h>
- Link to libARMulti.lib

In mainLoop(), detect markers as usual:

```c
arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);
```

Use the MultiMarker function:

```c
if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
    argSwapBuffers();
    return;
}
```
![Page 95: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/95.jpg)
Paddle-based Interaction
Tracking a single marker relative to the multi-marker set - the paddle contains a single marker.
![Page 96: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/96.jpg)
Paddle Interaction Code
Sample File: PaddleDemo.c
Get the paddle marker location + draw the paddle before drawing the background model:

```c
paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

/* draw the paddle */
if( paddleInfo->active ) {
    draw_paddle( paddleInfo );
}
```

draw_paddle uses a stencil buffer to increase realism.
![Page 97: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/97.jpg)
Paddle Interaction Code II
Sample File: paddleDrawDemo.c
Finds the paddle position relative to the global coordinate frame:

```c
setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4]);
```

Sample File: paddleTouch.c
Finds the paddle position:

```c
findPaddlePos(&curPaddlePos, paddleInfo->trans, config->trans);
```

Checks for collisions:

```c
checkCollision(&curPaddlePos, myTarget[i].pos, 20.0);
```
![Page 98: 426 lecture 7: Designing AR Interfaces](https://reader034.vdocuments.us/reader034/viewer/2022042607/54c80cd34a7959a1368b45df/html5/thumbnails/98.jpg)
General Tangible AR Library
command_sub.c, command_sub.h
Contains functions for recognizing a range of different paddle motions:

```c
int check_shake( );
int check_punch( );
int check_incline( );
int check_pickup( );
int check_push( );
```

E.g. to check the angle between the paddle and the base:

```c
check_incline(paddle->trans, base->trans, &ang);
```