
Page 1

User-Defined Body Gestures for an Interactive Storytelling Scenario

Felix Kistler and Elisabeth André

Human Centered Multimedia
Institute of Computer Science
Augsburg University
Universitätsstr. 6a, 86159 Augsburg, Germany

Page 2: Motivation

• Kinect made integrating full body gestures easy

• But are full body gestures designed by developers really intuitive?

• Solution: involve the user in the design process, reusing the process presented by Wobbrock et al. to create user-defined gestures for surface computing1


1 J. O. Wobbrock, M. R. Morris, and A. D. Wilson. User-defined gestures for surface computing. In Proc. CHI 2009.

Page 3: Adapting the Process for Body Gestures

• User-defined gesture sets have already been used for various scenarios and interaction modalities1,2

• Our scenario:

– Postures and gestures with the whole body

– Trigger in-game actions in a story


1 E. Kurdyukova, M. Redlin, and E. André. Studying user-defined iPad gestures for interaction in multi-display environment. In Proc. IUI 2012.
2 M. Obaid, M. Häring, F. Kistler, R. Bühling, and E. André. User-defined body gestures for navigational control of a humanoid robot. In Social Robotics, Springer Berlin Heidelberg, 2012.

Page 4: Scenario: Traveller

An interactive storytelling scenario for intercultural training

• Developed within the eCute project, which aims at intercultural training of young adults (18-25 years old)1

• Virtual environment populated by virtual characters with culturally adaptive behaviors3

• Full body gestures with Kinect

– Realized with our Full Body Interaction Framework FUBI

– Different gesture symbols represent different in-game actions

– Problem: the first gestures were defined manually, resulting in quite unintuitive mappings

1 http://ecute.eu
2 J. Dias, S. Mascarenhas, and A. Paiva. FAtiMA Modular: Towards an agent architecture with a generic appraisal framework. In Proc. Workshop on Standards for Emotion Modeling, 2011.
3 F. Kistler, B. Endrass, I. Damian, C. Dang, and E. André. Natural interaction with culturally adaptive virtual characters. Journal on Multimodal User Interfaces, 2012.

Page 5: Scenes and In-Game Actions

• At the time of the study, three of the eleven planned scenes of the story were implemented

• These included ten actions for navigation and dialogue tasks:

yes, no, sit at bar and wait, approach group, ask for directions, leave the bar, ask about supervisor, ask guard to talk to supervisor, approach supervisor, ask permission

[Screenshot: in-game dialogue with the options 1) Yes, 2) No, 3) Maybe]

Page 6: Study for Creating User-Defined Gestures within Traveller

1. 22 participants ran through the story without interaction enabled

2. Gesture symbols are not displayed; only the texts of the available in-game actions are shown

3. Users spontaneously invent and perform gestures for each of the displayed interaction options

4. Selection of gesture candidates for each interaction option by investigating the video recordings

5. Implementation of gesture candidates with FUBI using the video recordings


Page 7: Gesture Candidates

• Selection of gesture candidates for each in-game action:

– Group proposed gestures into sets of identical gestures

– Largest group = first candidate

– Second largest group = second candidate

• The second candidate is only considered if its group is at least half the size of the first (see the sketch below)

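A minimal sketch of this selection rule (Python; hypothetical in that the study grouped gestures by manually inspecting the video recordings rather than by string labels):

    from collections import Counter

    def select_candidates(proposals):
        # proposals: one gesture label per participant for a single
        # in-game action; identical gestures share the same label
        groups = Counter(proposals).most_common()
        candidates = [groups[0][0]]  # largest group = first candidate
        # second candidate only if its group is at least half the first
        if len(groups) > 1 and 2 * groups[1][1] >= groups[0][1]:
            candidates.append(groups[1][0])
        return candidates

    # e.g. for leave bar (turn away ~45%, step backward ~27%): both are kept
    print(select_candidates(["turn away"] * 10 + ["step backward"] * 6 + ["shrug"] * 4))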

Page 8: Gesture Taxonomy

• Existing taxonomies do not fit well

• Our taxonomy:

Form
– Static gesture: a static body posture is held.
– Dynamic gesture: the gesture contains movement of one or more body parts.

Body parts
– One hand: the gesture is performed with one hand.
– Two hands: …with two hands.
– Full body: …with at least one body part other than the hands.

Gesture type1
– Deictic: the gesture indicates a position or direction.
– Iconic: the gesture directly depicts the intended in-game action or a part of it.
– Metaphoric: the gesture depicts an icon and describes the in-game action in an abstract way.

1 D. McNeill. Hand and Mind: What Gestures Reveal About Thought. University of Chicago Press, 1992.

Page 9: Gesture Taxonomy

• Many static postures

• Many metaphorics (more complex actions compared to surface computing or robot navigation)

• Many gestures with body parts other than the hands

[Stacked bar chart: distribution of the proposed gestures over the three taxonomy dimensions, approximately: form – static 55%, dynamic 45%; gesture type – metaphoric 50%, iconic 28%, deictic 22%; body parts – one hand 41%, two hands 17%, full body 42%]

Page 10: User Ratings and Agreement


• Users rated how easy it was to invent a gesture directly after performing it

• Agreement scores represent the consensus between participants (see the formula below)

• Strong correlation between easiness ratings and agreement scores (r=0.812): participants proposed more identical gestures for actions rated easier, and more varied ones for actions rated more difficult

• The ask… actions have the lowest agreement and were rated most difficult (significantly more difficult than all other actions except sit at bar and wait)
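The slides do not spell the metric out; presumably it is the agreement score from the cited Wobbrock et al. paper. With R the set of referents (here the ten in-game actions), P_r the set of gesture proposals for referent r, and P_i the groups of identical proposals within P_r:

    A = \frac{1}{|R|} \sum_{r \in R} \sum_{P_i \subseteq P_r} \left( \frac{|P_i|}{|P_r|} \right)^{2}

A referent for which all participants propose the same gesture scores 1; the chart below plots the scores ×10 so they share the 0-6 axis with the easiness ratings.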

[Bar chart: rated easiness (0-6) and agreement score (×10) per in-game action: yes, no, sit at bar and wait, approach group, approach supervisor, leave bar, ask about supervisor, ask guard to talk to supervisor, ask for directions, ask permission]

Page 11: Gesture Candidates

• (Mostly) direct representations for the navigational actions

• More abstract ones for the dialogue actions, with a frequently proposed arms out gesture

• Pointing only for one action

• A second candidate only for three actions

• Reason for the tap on shoulder gesture for ask permission: (shown on the slide)

In-game action                    Gesture candidates              Occurrences
yes                               head nod                        68%
no                                head shake                      68%
sit at bar and wait               sit down                        56%
approach group                    step forward                    56%
ask for directions                arms out                        34%
leave bar                         turn away                       45%
                                  step backward                   27%
ask about supervisor              arms out                        50%
ask guard to talk to supervisor   point at one after the other    38%
                                  point to front                  21%
approach supervisor               step forward                    56%
ask permission                    arms out                        23%
                                  tap on shoulder                 19%

Page 12: Integration with FUBI

• Gestures defined in XML

• Basic recognizer types:

1. Joint orientation recognizer: minimum and maximum angle for a specific joint

2. Joint relation recognizer: joint position in relation to a second joint

3. Linear movement recognizer: direction and minimum/maximum speed for the movement of a joint

• Combined into sequences in combination recognizers (see the sketch below)

– Several states with sets of basic recognizers

– Time constraints for state durations and transitions to define the flow


• Kinect for Windows SDK used for user tracking (kinectforwindows.org)

• First tests of the integrated gesture candidates were promising
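FUBI's actual gesture definitions are XML files interpreted by the framework; the sketch below is only a rough Python illustration of the combination-recognizer idea, with the skeleton format and all names invented for this example rather than taken from the FUBI API:

    import time

    def joint_relation(main, relative, axis, min_diff):
        # basic recognizer sketch: the main joint must lie at least
        # min_diff beyond the relative joint along the given axis
        return lambda skel: skel[main][axis] - skel[relative][axis] >= min_diff

    class CombinationRecognizer:
        # a gesture is a sequence of states; each state holds a set of basic
        # recognizers that must all fire plus a minimum hold time
        # (maximum durations and transition timeouts omitted for brevity)
        def __init__(self, states):
            self.states = states          # list of (checks, min_hold_seconds)
            self.index, self.since = 0, None

        def update(self, skel):
            checks, min_hold = self.states[self.index]
            if all(check(skel) for check in checks):
                if self.since is None:
                    self.since = time.monotonic()
                elif time.monotonic() - self.since >= min_hold:
                    self.index, self.since = self.index + 1, None
                    if self.index == len(self.states):
                        self.index = 0
                        return True       # whole sequence recognized
            else:
                self.since = None         # condition broke: restart hold timer
            return False

    # e.g. a static "hands above head" posture held for half a second
    hands_up = CombinationRecognizer([
        ([joint_relation("left_hand", "head", "y", 0.1),
          joint_relation("right_hand", "head", "y", 0.1)], 0.5),
    ])

Calling update once per tracking frame with the current skeleton advances the state machine and returns True when the whole sequence has been recognized.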

Page 13: Further Development

• Triggering actions with different full body gestures was sufficient for the three scenes implemented at the time of the study

• In the meantime, more scenes and in-game actions have been added

• We ran into problems with complex dialogue options, which could not easily be represented by unambiguous gestures

• Solution: a new dialogue menu with swiping gestures (see the sketch below)

– The menu opens after performing the "ask gesture"

– Stretching the arm to the front activates the menu

– Swiping in the direction of the desired menu option selects it

• Integrates well with the other gestures and stays within the same modality
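One hypothetical way such a swipe selection could work geometrically (the slides do not specify the actual layout or thresholds; all names here are illustrative):

    import math

    def pick_option(dx, dy, options):
        # options are assumed to be laid out evenly on a circle around the
        # activation point; a swipe selects the option whose direction is
        # closest to the swipe vector (dx, dy)
        angle = math.atan2(dy, dx) % (2 * math.pi)
        step = 2 * math.pi / len(options)
        return options[int((angle + step / 2) // step) % len(options)]

    # e.g. four dialogue options; a swipe mostly to the right picks "east"
    print(pick_option(1.0, 0.1, ["east", "north", "west", "south"]))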

Page 14: Conclusion

• User-defined full body gestures help to create more intuitive interaction for interactive storytelling

• Conducting the study is straightforward

• Analysis can take some time: we only used this method for the first three scenes to get a basis; additional gestures were added manually

• A second interaction type, the swipe menu, solved the problems with complex dialogue options

• Gesture implementation with the FUBI framework is feasible…

• …but it is challenging to realize the gestures exactly as the users intended (limitations of the Kinect tracking, but also differences between performances of the same gesture)

Page 15: Future Work

• Evaluation of user experience and recognition accuracy

• …also for the manually added gestures

• …and for the new swipe menu

Acknowledgments: This work was funded by the European Commission within FP7 under grant agreement eCUTE (FP7-ICT-257666). We would like to thank our project partners for the collaboration, and especially Nick Degens, Hélio Mascarenhas, Samuel Mascarenhas, and André Silva for their work on the storyboard, agent architecture, and asset creation that formed the background for our study.

Page 16

R. Bühling - Adaptive Art

THANK YOU FOR YOUR ATTENTION!

QUESTIONS?

DON'T MISS THE DEMO TOMORROW, 10:00-11:00, NEXT TO THE POSTERS

FUBI IS FREELY AVAILABLE AT: http://hcm-lab.de/fubi.html