Transcript
Page 1: Context-aware / Multimodal UI Breakout Summary

Context-aware / Multimodal UI Breakout Summary

James A. Landay et al.

HCC Retreat, July 7, 2000

Page 2: Context-aware / Multimodal UI Breakout Summary

7/5/2000 2

Participants

James Landay, Anoop Sinha, Jimmy Lin, Trevor Perring, Greg Heinzinger, Chris Long, Ed Chi, Christine Halverson, Gian Gonzaga, Ken Fishkin, John Lowe, Adam Janin, Russell Eames, Elin Pedersen

Page 3: Context-aware / Multimodal UI Breakout Summary


Applications

Alert management
* sites beacon context
  + “this is a quiet place, no interruptions please”
  + e.g., movie theater or restaurant
* devices use context to avoid interruptions
* wearable t-shirt that jams local cell phones!

“Elvis has left the meeting”
* easily share documents from meetings
* beam tokens of documents to participants, or
* use shared context to find docs later
  + e.g., I was in a meeting with Ken at Lake Tahoe, find docs
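The “find docs by shared context” idea above can be sketched in a few lines. This is a minimal illustration, not the breakout group’s design: the `Document` record, its context fields, and the `find_docs` helper are all hypothetical names, assuming documents get tagged with the participants and location of the meeting where they were captured.

```python
from dataclasses import dataclass

@dataclass
class Document:
    """A document tagged with the context in which it was captured."""
    name: str
    participants: frozenset  # who was present when it was captured
    location: str            # where it was captured

def find_docs(docs, participant=None, location=None):
    """Return documents whose captured context matches every given attribute."""
    matches = []
    for doc in docs:
        if participant is not None and participant not in doc.participants:
            continue
        if location is not None and doc.location != location:
            continue
        matches.append(doc)
    return matches

# "I was in a meeting with Ken at Lake Tahoe, find docs"
docs = [
    Document("budget.ppt", frozenset({"Ken", "James"}), "Lake Tahoe"),
    Document("notes.txt", frozenset({"Anoop", "James"}), "Berkeley"),
]
print([d.name for d in find_docs(docs, participant="Ken", location="Lake Tahoe")])
# ['budget.ppt']
```

The point of the sketch: the user never names the file, only the shared context, and retrieval falls out of simple attribute matching.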

Page 4: Context-aware / Multimodal UI Breakout Summary


Context Events

Signal changes
* like a windowing system

Can use as triggers to cause other actions
* change my phone forwarding when I change locations

Can be immediate or logged for later tacit information mining
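The slide’s event model can be sketched as a small publish/subscribe bus. This is an illustrative assumption, not an API from the talk: `ContextEventBus`, the `"location"` event type, and the location-to-phone mapping are all made up to show the two delivery paths the slide names (immediate triggers and a log for later mining).

```python
from collections import defaultdict

class ContextEventBus:
    """Context changes delivered like windowing-system events: handlers fire
    immediately, and every event is also logged for later mining."""
    def __init__(self):
        self._handlers = defaultdict(list)
        self.log = []

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type, value):
        self.log.append((event_type, value))        # logged for later mining
        for handler in self._handlers[event_type]:  # immediate triggers
            handler(value)

# Trigger: change phone forwarding when the user's location changes.
bus = ContextEventBus()
forwarding = {"forward_to": None}

def update_forwarding(location):
    nearest_phone = {"office": "x4321", "lab": "x5678"}  # hypothetical mapping
    forwarding["forward_to"] = nearest_phone.get(location)

bus.subscribe("location", update_forwarding)
bus.publish("location", "lab")
print(forwarding["forward_to"])  # x5678
```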

Page 5: Context-aware / Multimodal UI Breakout Summary


Context Implementation Issues

Apps need to share context easily
* built-in to apps, like cut & paste
* context dial tone or infrastructure

Global file system
* easier to share context & not have to transfer it
* just use pointers

How to search / browse
* computers are good at searching large spaces
* humans are good at making associations

Why not search with Google instead of browser history?
* Google is easier to get at & seems to work well
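The “context dial tone” and pointer-sharing ideas above can be combined in one toy service. Everything here is a hedged sketch: `ContextService` and its methods are invented for illustration, standing in for whatever infrastructure the group had in mind.

```python
class ContextService:
    """A shared "context dial tone": apps read and write context through one
    infrastructure service instead of each building its own plumbing."""
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

    def pointer(self, key):
        # Hand out a pointer to the context rather than transferring a copy,
        # in the spirit of sharing through a global file system.
        return lambda: self._store.get(key)

svc = ContextService()
svc.set("location", "meeting room")
loc = svc.pointer("location")   # apps share the pointer, not the value
svc.set("location", "cafe")
print(loc())  # the pointer always reflects the current value: cafe
```

The design choice the slide hints at: because consumers hold pointers, a context update propagates without any app re-transferring data.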

Page 6: Context-aware / Multimodal UI Breakout Summary


Context Toolkits / APIs / Refs

Bill Schilit’s Columbia / PARC Ph.D.
GA Tech GVU (Anind Dey)
IBM (Maria Ebling)
MIT (?)
ESPRIT projects have looked at context
* German? project, according to Elin Pedersen

Page 7: Context-aware / Multimodal UI Breakout Summary


Interface Between Context & Multimodal UIs

“Context is just another kind of input”
* different to the user, but similar to the system
* user input is caused by EXPLICIT user action
* context is IMPLICIT or DERIVED

Context and multimodal UIs have similar privacy problems
* natural inputs are human “readable”
* may not want to share my context or my input

Page 8: Context-aware / Multimodal UI Breakout Summary


Interface Between Context & Multimodal UIs

Context to choose output modality
* e.g., user is in a meeting, don’t use speech

Context to disambiguate input(s)
* help fusion: there is noise, don’t rely on speech
* “the clutching problem” – infer user intent

Modality used to help infer context
* e.g., talking to device -> user is alone?
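The first bullet, context choosing the output modality, reduces to a small decision rule. The context keys (`in_meeting`, `ambient_noise_db`) and thresholds below are assumptions made for illustration, not a schema from the breakout.

```python
def choose_output_modality(context):
    """Pick an output channel from sensed context.
    The context keys and threshold here are illustrative assumptions."""
    if context.get("in_meeting"):
        return "visual"              # user is in a meeting: don't use speech
    if context.get("ambient_noise_db", 0) > 70:
        return "visual"              # too noisy for speech output to be heard
    return "speech"

print(choose_output_modality({"in_meeting": True}))      # visual
print(choose_output_modality({"ambient_noise_db": 40}))  # speech
```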

Page 9: Context-aware / Multimodal UI Breakout Summary


Initial Design for Multimodal UI Design Tool

Create “rough cuts”
* informal (sketching / “Wizard of Oz”)
* iterative design (user testing / fast mods)

Infer models from design
* designer can augment model over time

Generate initial prototypes
* UIs for multiple devices
* designer adds detail / improves UI
  + or even removes detail
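The “generate initial prototypes for multiple devices” step might look like the sketch below. It is purely illustrative: the model (a flat list of command names) and the device profiles are toy assumptions standing in for whatever the real tool would infer from a designer’s rough cuts.

```python
def generate_prototype(model, device):
    """Generate an initial UI for one device from an inferred design model.
    Device profiles and widget names are hypothetical."""
    profiles = {
        "desktop": {"max_elements": 10, "widget": "button"},
        "phone":   {"max_elements": 4,  "widget": "softkey"},
    }
    profile = profiles[device]
    # Smaller devices get less detail; the designer adds (or removes) more later.
    return [f"{profile['widget']}:{name}"
            for name in model[:profile["max_elements"]]]

model = ["open", "save", "share", "search", "settings", "help"]
print(generate_prototype(model, "phone"))
# the first four commands, rendered as softkeys
```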

[Diagram: Model]

