
Let’s shake hands! On the coordination of gestures of humanoids
Zsofi Ruttkay, Herwin van Welbergen, Balázs Varga


Post on 02-Jan-2016




Let’s shake hands!

On the coordination of gestures of humanoids

Zsofi Ruttkay
Herwin van Welbergen
Balázs Varga

[Speaker note (welberge): 30 min speaking time]

Our goals
- Coordinating gesture to external signals
- Coordinating gesture to other modalities
- Comparison of synch phenomena of rhythmic motion and speech-accompanying gesture
- Define a synchronization language
- Create an adaptive real-time animation engine

Content
- Examples of coordination
- The multimodal coordination problem
- Existing solutions
- Our solution
- Coordinating gesture
  - To what?
  - How to adapt gesture for coordination?
  - How to specify coordination?
- Conclusions
- Open issues
- Questions

Coordination example: Gesture-speech coordination
- Gestures and speech come from a single process of utterance formation (McNeill) => gesture timing is not a slave of speech timing
- Time alignment is achieved while we speak
- The stroke of a gesture precedes or ends at the phonological peak syllable of the speech
- Often we need to adjust the timing of gestures or speech to make the alignment fit:
  - Gestures can be sped up or slowed down
  - Gestures can be ‘paused’ using hold phases
  - Speech can be stopped to finish complex gestures
  - Connection words (uh…) can be used to slow down speech
  - Etc.

[Speaker note (welberge): in human conversation]
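The stroke-to-peak rule can be sketched as a simple timing check and adjustment. This is a minimal illustration, not the presenters' planner; the function names and the shift-only adaptation strategy are hypothetical (holds or speech pauses are equally valid adaptations):

```python
def satisfies_synchrony_rule(stroke_start, stroke_end, peak_time):
    """The gesture stroke must precede or end at the phonological peak syllable."""
    return stroke_end <= peak_time

def adjust_stroke(stroke_start, stroke_end, peak_time):
    """Shift the stroke earlier just enough for it to end at the peak.
    One of several possible adaptations; hypothetical illustration only."""
    if satisfies_synchrony_rule(stroke_start, stroke_end, peak_time):
        return stroke_start, stroke_end
    shift = stroke_end - peak_time
    return stroke_start - shift, stroke_end - shift
```

For example, a stroke planned for 1.0–1.5 s against a peak at 1.2 s would be shifted back by 0.3 s.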

Coordination example: The Virtual Dancer: moving to the music
- ‘Beat moments’ in the animation should be aligned to beats in the music:
  - Annotate all beat moments in the animation
  - Predict the beats in the music
  - Locally speed up or slow down the animation to fit the music
- There is a maximum stretch or skew in the dance motion
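The local speed-up/slow-down with a maximum stretch can be sketched as a piecewise-linear time warp between annotated animation beats and predicted music beats. A sketch under assumed inputs; the function name and the clamping scheme are hypothetical, not the Virtual Dancer's actual implementation:

```python
def warp_animation_time(t, anim_beats, music_beats, max_stretch=1.5):
    """Map animation-local time t to playback time by piecewise-linear
    warping between corresponding beat moments. The local stretch factor
    is clamped to [1/max_stretch, max_stretch] to respect the maximum
    stretch/skew of the dance motion."""
    for (a0, a1), (m0, m1) in zip(zip(anim_beats, anim_beats[1:]),
                                  zip(music_beats, music_beats[1:])):
        if a0 <= t <= a1:
            stretch = (m1 - m0) / (a1 - a0)
            stretch = max(1 / max_stretch, min(max_stretch, stretch))
            return m0 + (t - a0) * stretch
    return t  # outside the annotated range: play unwarped
```

Note that clamping means alignment can drift when the music deviates too far from the animation's preferred tempo, which is exactly the trade-off the maximum stretch expresses.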

Coordination example: The Virtual Trainer: tutoring exercises on music
- Exercises are executed using several modalities:
  - Body movement
  - Speech
  - Music/metronome
  - Sound (clap, foot tap)
- Challenges:
  - Coordination
  - Monitoring the user => real-time (re)planning
  - Exaggeration to point out details
  - Speed up / slow down
  - Feedback/correction
  - …

Coordination example: Handshake: coordination between two humans
- A handshake is used for greeting, agreeing and accepting
- Complex coordination between two persons
- Guided by:
  - Social protocols
  - Haptic feedback
  - Visual feedback

Generalizing: the multimodal coordination problem
- ‘Behaviors’ on different modalities (speech, gesture, dance motion, music)
- Synchronization between behaviors at key time moments
- The timing of flexible behaviors can be adapted to achieve synchronization

Coordination: related work
- Classic approach in speech/gesture coordination: speech leads, gesture follows
- MURML (Kopp et al.):
  - No leading modality
  - Planning in sequential chunks containing one piece of speech and one aligned gesture
  - Co-articulation at the border of chunks
- BML (Kopp, Krenn, Marsella, Marshall, Pelachaud, Pirker, Thórisson, Vilhjalmsson):
  - No leading modality
  - Synchronized alignment points in behavior phases
  - For now, aimed mainly at speech/gesture synchronization
  - In development

Coordination: our previous work
- Virtual Dancer:
  - Synchronization between music (beats) and dance animation
  - Linear time stretching/skewing
- Virtual Presenter:
  - Synchronization between speech, gesture, posture and sheet display
  - Leading modality can change over time
  - GESTYLE markup language with par/seq and wait constructs to define synchronization

Our multimodal synchronization model
- No leading modality, just align key moments
- Every phase of a behavior has a preferred length
- Stretching/skewing/skipping if necessary
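A minimal sketch of the stretch/skew/skip idea: phases with preferred lengths are scaled to meet a key-moment deadline, and phases are skipped when the required compression would be too severe. The function name, the uniform scaling, and the drop-the-last-phase policy are assumptions for illustration, not the presenters' actual planner:

```python
def fit_phases(preferred, available, min_scale=0.5):
    """Scale phase durations so their total equals `available`.
    If that would compress phases below `min_scale` of their preferred
    length, skip trailing phases until the remainder fits."""
    phases = list(preferred)
    while phases:
        scale = available / sum(phases)
        if scale >= min_scale:
            return [d * scale for d in phases]
        phases.pop()  # skip a phase entirely
    return []
```

For example, three phases preferring 1 s, 1 s and 2 s fit a 2 s slot by uniform halving, but a 1 s slot forces the last phase to be skipped.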

Coordination of hand gestures to external signals
- What do we want to coordinate with?
- How do synchronization constraints affect movement? How to stretch/skew?
- How can we define synchronization? Using BML scripts?

Ontology of coordination signals (origin × flexibility):

- World
  - Fixed: pointing at a moving object; clapping to the rhythm of music
  - Flexible: (none)
- Humanoid’s own modality
  - Fixed: gesture aligned to speech, which is taken as the leading signal
  - Flexible: gaze and hand coordination
- Other humanoid’s modality
  - Fixed: back-channeling as listener to a speaker, e.g. by head nods
  - Flexible: hand shake; two hands involved in taking over an object

Clapping experiment
- Clapping and counting:
  - How is the synchrony between clap and count?
  - How do the movement characteristics of clapping change with tempo?
    - Time distribution
    - Amplitude
    - Left/right hand symmetry

Clapping experiment: setup
- Mocap analysis of two subjects
- Instructions:
  - Clap and count from 21 to 31
  - Clap and count to the metronome

Clapping experiment: results
- The phonological synchrony rule was valid for counting while clapping: the clap comes before the phonological peak of the count
- The clapping was sped up by decreasing the path distance of the hand
- A pre-stroke hold can be used to slow down
- For our right-handed subjects, the right hand was moving ahead in phase compared to the left
- The standard deviation of the relative phase between the left and the right hand increased with the clapping frequency
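One way to quantify the left/right phase relationship described above is to express each right-hand clap as a fraction of the left hand's cycle and take a circular standard deviation. This is a generic sketch of such an analysis, not the measure the presenters necessarily used; all names are hypothetical:

```python
import math

def relative_phase(left_peaks, right_peaks):
    """Phase of each right-hand clap within the enclosing left-hand cycle,
    as a fraction in [0, 1); 0 means perfectly in phase."""
    return [((r - l0) / (l1 - l0)) % 1.0
            for l0, l1, r in zip(left_peaks, left_peaks[1:], right_peaks)]

def phase_std(phases):
    """Circular standard deviation of phases (in cycles), via the
    mean resultant length of the phases mapped onto the unit circle."""
    x = sum(math.cos(2 * math.pi * p) for p in phases) / len(phases)
    y = sum(math.sin(2 * math.pi * p) for p in phases) / len(phases)
    r = min(1.0, math.hypot(x, y))  # guard against rounding above 1
    return math.sqrt(-2 * math.log(r)) / (2 * math.pi)
```

With this measure, the reported result is that `phase_std` grows as clapping frequency increases, while the mean relative phase stays slightly ahead for the dominant hand.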

Hand shake experiment
- Which movement phases can be identified? How are they coordinated?
- What gaze patterns can be seen?
- What movement characteristics can be identified in the different phases?
  - Timing, duration
  - Form
- How is the above affected by:
  - Refusal or avoidance to shake hands
  - Social relations between the participants

Hand shake experiment: setup
- Motion capture of two subjects (P1, P2) shaking hands
- Annotation of gaze patterns
- Variations:
  - Basic
  - Triggered
  - P2 initiates
  - P2 tries to avoid shaking hands
  - P2 rejects

Modeling coordination in BML
- BML is an XML language defining multimodal synchronization
- BML events can be used to synchronize with other BML scripts/world events
- The BMLT observer is introduced for synchronization with (repeated) outside world events

Coordination with events
- BML is designed to work in event-driven systems
- <event> is used to fire an event message
- <wait> is used to wait for an event
- If the event does not occur after a set time, wait can fire a no-event message
- After the event occurs, or the timeout is exceeded, the script continues
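The fire/wait/timeout semantics can be sketched with standard threading primitives. This is a toy model of the behavior described above, not the actual BML realizer; the class and method names are hypothetical:

```python
import threading

class EventBus:
    """Toy model of BML-style <event>/<wait> coordination."""

    def __init__(self):
        self._events = {}

    def _get(self, event_id):
        return self._events.setdefault(event_id, threading.Event())

    def fire(self, event_id):
        """Analogous to <event>: broadcast that the event occurred."""
        self._get(event_id).set()

    def wait(self, event_id, timeout):
        """Analogous to <wait>: block until the event or the timeout.
        Returns "event" on occurrence, "no-event" after the timeout,
        after which the script would continue either way."""
        occurred = self._get(event_id).wait(timeout)
        return "event" if occurred else "no-event"
```

In the handshake example below, subject 2 would `wait` on subject 1's connect event and fall back to withdrawing when the timeout fires a no-event.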

Coordination with events: handshake

[Figure: parallel timelines for subject 1 and subject 2, each passing through the handshake phases extend, connect, pump and withdraw; one subject waits for the other's event, with a timeout if it does not occur]

Coordination with the observer
- An observer observes a specific part of the world and provides timing information on it
  - Example: beats in music
- Why observers instead of events?
  - Explicit outside world trigger
  - Multiple (repeated) triggers
  - Timing of observer triggers can be predicted for easy planning
  - Synchronization within behavior phases: <wait> does not suffice
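The planning advantage of observers comes from predictability: past triggers let the planner extrapolate future ones. A deliberately simple tempo model as a sketch (mean inter-beat interval; the function name is hypothetical and real beat trackers are more robust):

```python
def predict_next_beats(observed_beats, count=4):
    """Predict upcoming beat times from the mean inter-beat interval
    of the beats observed so far, extrapolating from the last beat."""
    intervals = [b1 - b0 for b0, b1 in zip(observed_beats, observed_beats[1:])]
    period = sum(intervals) / len(intervals)
    last = observed_beats[-1]
    return [last + period * i for i in range(1, count + 1)]
```

Predicted times like these are what allow behavior phases to be planned against future observer triggers, rather than merely reacting to each trigger as <wait> would.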

Coordination with the observer: clapping

[Speaker note (welberge): show 1 or 2 clapping movies here]

Conclusions
- Gesture synchronization mechanisms are also found in rhythmic motion
- Adaptation of timing in a gesture affects several movement characteristics; linear speedup/slowdown does not suffice
- Gesture coordination can be modeled using BMLT

Open issues
- What modalities do we have to stretch/skew/skip?
- Can we generalize our findings from clapping/handshake?
- Do the semantics of a motion change if we change its timing? E.g. emotions, individual features

Questions

Easter eggs

Synchronization with the observer:
- Declaration: <observer id="beatObserver1"/>
- Synchronization to beat 1: <animation stroke="beatObserver1:1">
- Synchronization to the closest beat / all beats: <animation stroke="beatObserver1">
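A sketch of how such observer references might be resolved to concrete stroke times. The parsing of the `observerId:index` form follows the two examples above; everything else (function name, 1-based indexing, the closest-beat tie-break via a preferred behavior time) is an assumption, not taken from the BMLT specification:

```python
def resolve_stroke_time(ref, predicted_beats, behavior_time=None):
    """Resolve an observer reference to a stroke time.
    'beatObserver1:1' -> time of beat 1 (assumed 1-based);
    'beatObserver1'   -> the predicted beat closest to the behavior's
                         preferred stroke time."""
    if ":" in ref:
        _, index = ref.split(":")
        return predicted_beats[int(index) - 1]
    return min(predicted_beats, key=lambda b: abs(b - behavior_time))
```

Resolving against the observer's *predicted* beats, rather than waiting for each beat event, is what lets the animation engine plan the stroke in advance.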