


Journal of Computational Science 4 (2013) 255–261


Modeling and identification of emotional aspects of locomotion

Martin L. Felis a,∗, Katja Mombaur a, Hideki Kadone b, Alain Berthoz c

a Interdisciplinary Center for Scientific Computing (IWR), University of Heidelberg, INF 386, D-69120 Heidelberg, Germany
b Center for Cybernics Research (CCR), University of Tsukuba, 1-1-1, Tennodai, Tsukuba, Japan
c Collège de France, Laboratoire de Physiologie de la Perception et de l’Action (LPPA), 11, Place Marcelin Berthelot, 75231 Paris Cedex 05, France

Article info

Article history: Received 3 August 2011; Received in revised form 2 October 2012; Accepted 4 October 2012; Available online 22 October 2012

Keywords: Optimal control; Emotion; Rigid-body; Forward dynamics

Abstract

The studies of emotional facial expressions and emotional body language are currently receiving a lot of attention in the cognitive sciences. In this project, we study implicit bodily expression of emotions during standard motions, such as walking forwards.

An underlying assumption of our work is that all human motion is optimal in some sense and that different emotions induce different objective functions, which result in different deformations of normal motion.

We created a 3D rigid-body model of a human of which we use the forward dynamics simulation in


an optimal control context. We performed two kinds of optimizations: (i) reconstruction of dynamic quantities, such as joint torques, of pre-recorded data of emotional walking motions and (ii) forward optimization that generates neutral and varied walking motions using different objective functions. Optimizations are performed with the software package MUSCOD-II, which uses a direct multiple-shooting discretization scheme. The results of this work form the foundation for further analysis of emotional motions using inverse optimal control methods.


1. Introduction

We encounter emotions every day. Most of us “speak” and “read” emotional body language whether we want to or not. Our perception is very sensitive to human motion, and small cues already suffice for us to recognize the gender and emotion of a walking person, without even seeing the face. Yet little is known about why we express emotions in these specific ways or how emotions influence our motor system.

Darwin [5] already asked whether facial expressions of emotions are inherent or learned during a lifetime. He also proposed that expressions which can be found in both humans and animals, such as the sneer, are due to a common genetic ancestor. More recently, Ekman and Friesen [6] created the Facial Action Coding System (FACS), which is able to categorize and encode almost every anatomically possible facial expression. It is actively used by animators and psychologists. A similar system for emotional body language does not exist, and creating one would be a tedious task due to the huge number of muscles involved.

But emotions are not only expressed in faces. Even simple visualizations such as point-light displays of a walking person suffice for us to recognize emotions (Atkinson et al. [1]). This is demonstrated by the computer program “BML Walker” described

∗ Corresponding author. E-mail address: [email protected] (M.L. Felis).

1877-7503/$ – see front matter © 2012 Elsevier B.V. All rights reserved. http://dx.doi.org/10.1016/j.jocs.2012.10.001


in Troje [20]. It allows the specification of physical properties such as sex and weight, and also of the emotional aspects nervous–relaxed and happy–sad, via sliders. A point-light display of a motion that fits the chosen parameters is generated on-the-fly. Generating emotional motions therefore seems to be possible. But how can we use this to gain insights into emotions?

Research on emotions is also being done in neuroscience. The somatic marker hypothesis [4] states that our decision making is influenced not only by logical reasoning, but also by our emotions. Furthermore, the perception of emotional body language is handled by neural systems that are responsible for perceiving action and triggering behaviour [8]. This supports the idea that emotional body language is not only static information about the current state of a person; it also conveys cues about his or her actions. From this perspective, it comes naturally to look at emotions in human motions.

A kinematic analysis of emotional walking motions was done in Omlor and Giese [14]. They applied a new non-linear blind-source separation method to motion capture data. It efficiently finds a few source components that approximate high-dimensional emotional walking, which allows one to simulate different emotional styles on a kinematic skeleton. Furthermore, these spatio-temporal primitives are specific to different emotions and indicate emotion-specific primitives in the human motor system.

Only little research on emotional body language has so far been done using (rigid-body) dynamics. In Liu et al. [12] a learning method was used to create stylized motions for physics-based



Table 1. Objective function candidates and their proposed contribution to the emotions of interest.

Objective function        Anger   Fear   Sadness   Joy
Time                      min     min    min       –
Energy                    max     –      min       max
Acceleration              max     min    min       –
Jerk                      max     min    –         –
Ground impacts            max     min    –         –
Angular amplitudes        max     min    min       –
Manipulability measure    –       max    –         –
Muscle co-contraction     max     –      –         min
Muscle activation         –       –      min       –


animation system. It allows one to learn a parameter vector from motion-capture data, which can then be applied to different motions. The vector contains information such as elasticity parameters for shoe contact, muscles, and tendons, as well as preferences for certain muscles. The method was used to learn emotional styles, but the focus of that research is on the generation of stylized animations and not on emotional body language itself.

With our research, we want to explore emotional body language on a kinetic level. One of the assumptions of our work is that human motions are optimal in some sense. Optimization has already been successfully used to create human-like motions such as running [19] or platform diving [9]. We therefore use a model-based approach to combine kinematic motion-capture recordings with a rigid-body model by using optimal control methods. This allows us to look at emotional body language on the level of forces and torques that act in or on the body. One of the key questions we have is whether it is possible to relate certain emotions to specific goals in the form of objective functions. In this paper we present our preliminary results, which will form the base for future work.

2. Emotional human walking

In our research we investigate full-body 3D walking gaits that are varied by emotions. However, to be able to discuss alterations of a walk, we first have to describe a neutral walking gait.

2.1. Description of a neutral gait

The walk cycle can be split up into a single and a double support phase. During the single support phase there is only one leg (the stance leg) in contact with the ground, while the other leg (the swing leg) is moved forward. The double support phase starts when the heel of the swing leg touches the ground and lasts until the former stance leg loses contact after rolling over the toes. During the double support phase the weight is shifted from one leg to the other. One also has to note that the legs, feet, and hip form a kinematic loop during the double support phase, which can require special treatment in the dynamics simulation. There is an anti-cyclic swinging of the arms, which also contributes to the efficiency of human gait [3].

A schematic of a single step of human walking is shown in Fig. 1(a). In this paper, the gaits are assumed to be left–right symmetric. This means that the flexions and extensions are the same for the swinging right leg during the left leg stance phase as for the swinging left leg during the right leg stance phase. A generalization to non-symmetric gaits would be possible, however at a higher computational cost. We therefore focus the analysis and optimization on a single step instead of a full stride.

2.2. Emotional gaits

There is a wide range of models for emotions in psychology. A lot of them try to postulate a small set of basic emotions which are then combined to build the complete emotional space. Famous models are by Ekman and Friesen [7] or Plutchik [17]. In Ortony and Turner [15] various models are reviewed.

Even though these models would allow us to build or explain more complex emotions, they are not directly relevant to this research. We are not trying to validate a given model or certain basic emotions; instead we want to gain insights into the physical phenomena of emotions and how they are embodied in our locomotor system.

Quantifying a certain emotion per se is not possible, therefore we have to settle for qualitative measurements. In our case we look at motions carried out by actors who are experienced in consciously expressing emotional body language. They also have the ability to perform a wide range of emotions ad hoc.


In this research we focus on a subset of emotions, namely anger, joy, sadness, and neutral. The expression of sadness is the most distinct one, as it is slower and the head inclination differs from the others [18]. The emotions anger and joy are harder to discriminate from kinematics alone, as both feature fast motions of the arms and often an upright posture.

2.3. Emotional motions and optimization

In our current research project we want to investigate the role of optimization in the generation of emotional body language during walking. Our long-term goal is to use inverse optimal control techniques as used in Mombaur et al. [13] in order to identify the objective functions underlying different emotions. This approach requires hypotheses for possible base functions of the objective functions. A list of proposed base functions is given in Table 1. The table lists both the criteria and in which way (minimization or maximization) each criterion may contribute to the underlying objective function.

For example, during expressions of both anger and joy we assume that there is a strong consumption (hence a maximization) of energy, whereas for sadness the emotional expression uses little effort (consequently minimization), which results in a sagging motion. Similarly, we assume that during anger there is a strong co-contraction (maximization) of the muscles that leads to stiff joints, whereas during the expression of joy the body exploits inertial and gravitational forces, resulting in a very articulated motion that is not inhibited by co-contraction (therefore minimization).

The table and the present paper present preliminary studies on some of the base functions in order to evaluate their effect on walking motions and to further improve the set of hypotheses for different emotions.
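As an illustration of how such base functions could be evaluated on data, the following sketch computes discrete approximations of a few Table 1 candidates from a sampled joint-angle trajectory. The names and normalizations are our own illustrative choices, not the paper's exact definitions.

```python
import numpy as np

def criteria(q, dt):
    """Discrete approximations of some objective-function candidates
    for a sampled joint-angle trajectory q (shape: time x joints).
    Illustrative definitions only."""
    qd = np.gradient(q, dt, axis=0)      # angular velocities
    qdd = np.gradient(qd, dt, axis=0)    # angular accelerations
    jerk = np.gradient(qdd, dt, axis=0)  # time derivative of acceleration
    return {
        "time": (q.shape[0] - 1) * dt,
        "acceleration": np.mean(qdd ** 2),
        "jerk": np.mean(jerk ** 2),
        "angular_amplitude": np.mean(q.max(axis=0) - q.min(axis=0)),
    }

# Toy trajectory: two sinusoidal joint angles over one second.
t = np.linspace(0.0, 1.0, 101)
q = np.column_stack([np.sin(2 * np.pi * t), 0.5 * np.sin(2 * np.pi * t)])
c = criteria(q, dt=t[1] - t[0])
```

In an inverse optimal control setting, such scalar criteria would be combined linearly and their coefficients identified from recorded motions.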

3. Experiments

In 2010 we performed a case study with 2 lay actors (one male, one female) at the Laboratoire de Physiologie de la Perception et de l’Action (LPPA) in Paris. Recordings of full-body walking motions were made with a VICON motion capture system. We used a standard Plug-in-Gait model for the marker placements. The subjects were asked to walk through the motion capture area, which is about 6 steps in length. There were three different modes of recording: neutral, passive emotion, and acted emotion. In neutral, the subjects walked at their comfortable speeds without expressing any emotions. During the passive emotion recordings, the subjects were asked to put themselves into a given emotional state outside of the capture region and to start walking once they did. By doing so, the captured motion was already at its final speed. For the acted

emotion recordings, the subjects were asked to feel the emotion similarly to the passive emotion mode, but this time to act out the emotion as if on stage.

Both the order of emotional modes and the emotions themselves were randomized to keep the actors concentrated. We recorded for each subject: 3



Fig. 1. Overview of the walk cycle and our human model. (a) Human walk cycle of a single step with its phases. (b) Overview of the rigid-body model used in the dynamics simulation.

neutral motions and for each emotion 6 motions (3 passive emotion and 3 acted emotion). This gives a total of 42 recorded motions.

From this set of motions we chose two that we use as input data for a motion reconstruction using a dynamical model. We used the motions acted sadness and acted anger, as they represent the lower and upper range in terms of expressiveness and speed of articulation.

4. Dynamics of human walking

We use a model-based approach for our research, similar to previous efforts such as Schultz and Mombaur [19] and Koschorreck and Mombaur [9]. For us the term model is primarily associated with a physical rigid-body model of the human locomotor system that is subject to external forces like gravity and ground reaction forces, and internal forces (torques in the joints). The purpose of our model is to enable us to test the various hypotheses on emotions that are listed in Table 1. In this section we describe the rigid-body model that is used for the simulation.

The multi-body model we created consists of 12 bodies, including legs, upper body, arms, head, and a three-segmented trunk. An overview of the model configuration and its resulting ndof = 31 degrees of freedom can be found in Fig. 1(b). We use a right-handed coordinate system in which, as seen from the model, X points forward (sagittal direction), Y up (longitudinal direction), and Z towards the left (transversal direction). Rotations are described by ZYX-Euler angles. The dynamical parameters for the model (segment geometry, masses, inertia) were taken from de Leva [11] to represent an adult human. The posture of the model is described using minimal coordinates q(t) ∈ R^ndof.

The motion of the model is subject to rigid-body dynamics, which allows us to use objective functions that are not limited to kinematic features like angles and postures; instead we can formulate criteria such as minimize energy, maximize ground impact, and the like. Using the dynamics also has the advantage that the motions can exploit gravitational forces similar to real humans.

The dynamics of such a model can be described as:

M(q) q̈ = τ − N(q, q̇)   (1)

The ndof × ndof matrix M(q) is the so-called joint-space inertia matrix, which is symmetric and positive definite. On the right-hand side we have the applied torques τ and the vector N(q, q̇) containing gravitational and non-linear forces (e.g. Coriolis forces). Solving


this system gives us the joint accelerations q̈, which are numerically integrated twice to compute the forward simulation of q(t).
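As a minimal numerical sketch of Eq. (1) (not the authors' implementation; the matrices below are illustrative toy values), the accelerations follow from a single linear solve:

```python
import numpy as np

def forward_dynamics(M, N, tau):
    """Solve M(q) qdd = tau - N(q, qd) of Eq. (1) for the accelerations."""
    return np.linalg.solve(M, tau - N)

# Hypothetical 2-DOF system (illustrative numbers only).
M = np.array([[2.0, 0.5],
              [0.5, 1.0]])    # symmetric positive-definite inertia matrix
N = np.array([0.1, -0.2])     # gravitational + nonlinear (Coriolis) forces
tau = np.array([1.0, 0.0])    # applied joint torques

qdd = forward_dynamics(M, N, tau)
```

Integrating qdd twice with any standard ODE method then yields the simulated trajectory q(t).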

The linear system in Eq. (1) has to be extended to model ground contact, since during walking at least one leg is always in contact with the ground. In the present model version we use point feet, which means there is no distinction between initial heel, flat foot, and toe contact. Instead, for each foot, there is only a single point of contact at the positions RightContact and LeftContact at the distal ends of the lower leg segments (see Fig. 1(b)). We assume non-sliding ground contact and therefore use algebraic equations to fix the translation of the feet when in contact with the ground. This results in a system of differential algebraic equations (DAE) with differential index 3. Since the single- and double-support phases are described by different algebraic equations, we have two distinct DAE model equations. These equations are then reformulated to index-1 systems that can be solved with standard ordinary differential equation (ODE) methods, each time solving the underlying system of algebraic constraints from [16]:

( M(q)   G(q)^T ) (  q̈ )     ( τ − N(q, q̇) )
( G(q)   0      ) ( −λ )  =  ( −γ(q, q̇)    )   (2)

G(q) is the Jacobian of the algebraic constraint and γ(q, q̇) contains the second derivatives of the algebraic constraint. Solving this system provides us with the joint accelerations q̈ and the ground reaction forces λ of the added constraints. Source code for the quantities M(q), G(q), N(q, q̇), and γ(q, q̇) was generated using the HuMAnS Toolbox from Wieber et al. [21].
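A sketch of how the index-1 system of Eq. (2) can be assembled and solved numerically. The values are illustrative toy numbers for a 2-DOF model with one scalar contact constraint, not quantities from the paper's model.

```python
import numpy as np

def contact_dynamics(M, G, N, gamma, tau):
    """Solve the block system of Eq. (2) for the accelerations qdd and
    the contact forces lam (note the unknown vector holds -lam)."""
    nd, nc = M.shape[0], G.shape[0]
    K = np.block([[M, G.T],
                  [G, np.zeros((nc, nc))]])
    rhs = np.concatenate([tau - N, -gamma])
    sol = np.linalg.solve(K, rhs)
    qdd, neg_lam = sol[:nd], sol[nd:]
    return qdd, -neg_lam

# Illustrative numbers only.
M = np.array([[2.0, 0.3], [0.3, 1.5]])
G = np.array([[1.0, 0.0]])    # constraint Jacobian: first DOF is constrained
N = np.array([0.5, -0.1])
gamma = np.array([0.0])
tau = np.array([0.0, 1.0])

qdd, lam = contact_dynamics(M, G, N, gamma, tau)
```

By construction, the solution satisfies the constraint equation G(q) q̈ = −γ(q, q̇) exactly.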

The transition from single-support to double-support occurs when the Y-value of the right foot point is 0 and the foot point moves along the negative Y-axis. The collision of the foot point with the ground is assumed to be instantaneous and fully inelastic. The successive double-support phase ends when the vertical component of the ground-reaction force acting on the left foot vanishes.
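The phase-switch test described above can be sketched as a simple event function. This is a hypothetical helper for illustration, not code from the paper.

```python
def touchdown(y_foot, yd_foot, tol=1e-9):
    """Single- to double-support transition: the swing foot point is at
    ground height (Y = 0) and moving downward (negative Y velocity)."""
    return abs(y_foot) < tol and yd_foot < 0.0
```

In a simulation loop, such an event function would be monitored for a sign change to locate the touch-down time and trigger the inelastic impact treatment.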

We do not include muscles in the model, but instead use torques to power all internal DOF. In addition, we insert linear spring-damper elements to model the compliance and damping properties of muscles, ligaments, and passive tissues. The parameters of these spring-damper elements are also left free for optimization.

Data for both kinematic and dynamic properties are taken from the literature. We used tabular data from de Leva [11], which also allows us to customize the model in height and weight to match


Fig. 2. Visualizations of the reconstructed gaits from kinematic data and optimal control methods. The black markers are the recorded data and the white markers are the virtual markers added to the human model.

that of the recorded subject. All these properties are parameters in our rigid-body model.

5. Using optimal control to approximate measurement data by a dynamical model

The data from our experiments only contain kinematic marker data and therefore do not allow any analysis of the dynamics within the body. Using optimal control methods we can use the data from the motion capture as input and reconstruct dynamic quantities, such as joint torques, for the previously described model. To do so we formulated an optimal control problem which uses

Fig. 3. Visualizations of the gaits generated by forward optimization. (a) Generated gait by minimizing joint torques, head movement and touch-down impact. (b) Generated gait by minimizing time.


a least-squares objective function, which tries to approximate the motion of our model to a recorded motion.

Optimal control methods simultaneously optimize state trajectories x(t) and controls u(t). Here x(t) = [q(t), q̇(t)]^T ∈ R^(2·ndof) with ndof = 31 are the positions and velocities of the generalized state variables. The functions u(t) ∈ R^nu with nu = 25 are the torques applied at the actuated joints.

Additionally, this method allows the optimization of unknown model parameters, such as spring-damper parameters, step length, and velocity, which are all contained in the vector p. As we have two model equations (one for single support and one for double support), we have a two-phase optimal control problem.



Fig. 4. Reconstructed trajectories for generalized states q(t), normalized with respect to time, for the motions acting angry and acting sad. The dashed vertical line marks the touch-down of the swing leg and the change from single- to double-support phase. Plots in lighter colors are from the acted anger motion, the darker trajectories from acted sadness.



The complete optimal control problem can be written for phases i = 1, 2 as:

min_{x(·), u(·), p, t_i}   (1/2) ||M_MoCap(t) − M_Sim(x(t))||_2^2   (3a)

subject to:

ẋ(t) = f_i(t, x(t), u(t), p),   (3b)
x(t_{i+1}^+) = h_i(x(t_{i+1}^−), p),   (3c)
g_i(t, x(t), u(t), p) ≥ 0,   (3d)
r_eq(x(0), …, x(t_f), p) = 0,   (3e)
r_ineq(x(0), …, x(t_f), p) ≥ 0.   (3f)


The function M_MoCap(t) describes the positions of the markers at a given time t. We added virtual markers to our human model at the approximate segment positions, which are described by M_Sim(x(t)). For the reconstruction, markers at the head, shoulders, elbows, hands, and feet were used.
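The least-squares fit term of Eq. (3a) at a single time sample can be sketched as follows. The marker coordinates are illustrative placeholders, not recorded data.

```python
import numpy as np

def marker_residual(m_mocap, m_sim):
    """Least-squares term of Eq. (3a): half the squared distance between
    recorded markers and the model's virtual markers at one time sample.
    Both arrays have shape (n_markers, 3)."""
    return 0.5 * np.sum((m_mocap - m_sim) ** 2)

m_rec = np.array([[0.0, 1.7, 0.0],    # e.g. a head marker (illustrative)
                  [0.2, 1.4, 0.1]])   # e.g. a shoulder marker
m_sim = np.array([[0.0, 1.6, 0.0],
                  [0.2, 1.4, 0.1]])
res = marker_residual(m_rec, m_sim)
```

In the full problem this residual is accumulated along the trajectory, so the optimizer pulls the model's virtual markers towards the recorded ones.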

Single- and double-support phases are described by Eq. (3b) as distinct model equations, and the transition between them is modeled by the phase transition function Eq. (3c). The touch-down is handled instantaneously by calculating and applying an impulse at the contact point, which is then propagated through the multi-body system in function Eq. (3c). The variable t_{i+1}^− is the point in time right before the touch-down and t_{i+1}^+ right after the perfectly inelastic collision of the foot with the ground. General state and control boundaries such as joint and torque limits are described by Eq. (3d). Posture conditions (e.g. foot positions), periodicity at


given time points, and phase switches are modeled by Eqs. (3e) and (3f).

6. Optimizing walking motions using different criteria

Instead of trying to reconstruct pre-recorded motions, we also computed walking motions without any input from motion capture data, similar to the running motions generated in Schultz and Mombaur [19]. The idea is to replace the least-squares functional in Eq. (3a) with an appropriate objective function that generates a gait with a certain emotion. A general optimal control problem is then formulated as:

min_{x(·), u(·), p, t_i}   ∫_0^{t_f} Φ_L(t, x(t), u(t), p) dt + Φ_M(t_f)   (4a)

subject to:

ẋ(t) = f_i(t, x(t), u(t), p),   (4b)
x(t_{i+1}^+) = h_i(x(t_{i+1}^−), p),   (4c)
g_i(t, x(t), u(t), p) ≥ 0,   (4d)
r_eq(x(0), …, x(t_f), p) = 0,   (4e)
r_ineq(x(0), …, x(t_f), p) ≥ 0.   (4f)

The term Φ_L is called the Lagrange term and Φ_M the Mayer term of the objective function Eq. (4a). The two terms are mathematically equivalent, but their explicit use is helpful for the formulation of the objective function. Together they form a Bolza-type optimal control problem [2].

We generated neutral-looking human walking motions by using a Lagrange term of the form:

Φ_L(x(t), u(t), p) = c_u ||W u(t)||_2^2 + c_h ||v_head(t)||_2^2   (5)

which minimizes both the torques u(t) applied to the system and the motion of the head v_head(t). Additionally, a Mayer term Φ_M(t_1) = c_m ||Λ_td(t_1)||_2^2 minimizes the impulse at the touch-down. The weight matrix W as well as the constants c_u, c_h, and c_m are used to scale the objective function components, taking into account their different dimensions and the different strengths of the joint actuators.
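A sketch of the running cost of Eq. (5) evaluated at one time sample. The weights and values below are illustrative choices, not the paper's tuned constants.

```python
import numpy as np

def lagrange_term(u, v_head, W, c_u, c_h):
    """Running cost of Eq. (5): weighted joint torques plus head motion."""
    return c_u * np.sum((W @ u) ** 2) + c_h * np.sum(v_head ** 2)

u = np.array([1.0, -2.0])          # joint torques (toy 2-DOF example)
W = np.diag([0.5, 0.25])           # per-joint scaling (illustrative values)
v_head = np.array([0.1, 0.0, 0.0]) # head velocity
val = lagrange_term(u, v_head, W, c_u=1.0, c_h=10.0)
```

The per-joint weights in W compensate for the different strengths of the joint actuators, so that strong joints are not penalized disproportionately.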

As a first step towards the forward optimization of emotional gaits, we also generated a walking motion that uses the optimization criterion minimum time. This is achieved by using a Mayer-term objective function:

Φ_M(t_f, x(t_f)) = t_f   (6)

No Lagrange term is used for this optimization.

6.1. Results of the optimizations

Image series for the reconstructed motions acted anger and acted sadness can be found in Fig. 2. Plots of the trajectories for the reconstruction of the motions acted anger and acted sadness can be found in Fig. 4. A video that shows both the reconstructed motions and the forward optimizations can be found online.1

Overall, the reconstruction of acted sadness has a better quality compared to acted anger. This is mainly due to the limitations of the modeling of the feet. For the faster acted anger motion there is substantial rolling over the toes that results in a lifting of the whole body, which is not the case for the acted sadness motion. This phenomenon is not captured by our simple point foot model and thus results in a bad matching of the upper body markers.

1 http://bit.ly/SXjyn6.


Visualizations of the gaits generated by forward optimization can be found in Fig. 3. The minimum time optimization resulted in a very hectic motion, not unlike that of an enraged person. Also, the motion looks like that of a female person, due to a strong lateral hip motion.

7. Conclusion and outlook

We have investigated the reconstruction of pre-recorded motion capture data by using optimal control methods. Furthermore, we generated optimal human gaits for different objective functions. The results of the reconstructions showed that a more realistic foot model is required. This is currently in progress and should allow better approximations of the recorded motions. The forward optimization allows us to test different objective functions that result in gaits of different styles. Already a minimum time objective function adds an emotional aspect to the motion.

The results of this paper build the foundation of future work to perform inverse optimal control to identify the specific objective criteria for emotional gaits. As base functions for the objective function we want to use the criteria defined in Table 1. This results in a hierarchical optimization problem, which identifies parameters (coefficients of a linear combination of base objective functions) for an optimal control model (the forward optimization of this paper). We plan to solve this with the bilevel approach described in Mombaur et al. [13].

Acknowledgements

The authors gratefully acknowledge the financial support and the inspiring environment provided by the Heidelberg Graduate School of Mathematical and Computational Methods for the Sciences, funded by the DFG (Deutsche Forschungsgemeinschaft).

Appendix A. Numerical treatment of optimal control problems

The multi-phase optimal control problems (3) and (4) were solved using the software package MUSCOD-II [10], which uses a direct multiple-shooting method. It solves the optimal control problem directly, which means that the controls in the continuous formulation Eqs. (3a)–(3f) are approximated by a finite dimensional discretization. In our case the time horizon [t_0, t_f] was divided into N equidistant intervals with:

t_0 < t_1 < … < t_{N−1} < t_N = t_f.   (A.1)

On each interval I_j = [t_{j−1}, t_j] we define finite dimensional base functions ϕ(t, w_j) with parameters w_j ∈ R^nw such that the controls can be written as u(t) = ϕ(t, w_j) for all t ∈ I_j. Thus, the controls are solely discretized by w_j with j = 0, …, N − 1. The dimension n_w of the control parameters depends on the type of base function, e.g. for piecewise constant functions n_w would be n_u, while for piecewise linear base functions we would have n_w = 2·n_u.

For the state parametrization it uses a multiple shootingmethod. Similar as in the previous paragraph the time horizon getssplit up into M equidistant so-called multiple shooting intervals forwhich the interval boundaries are called multiple shooting nodes. Oneach node k at time tk the value of the state is described by sk = x(tk),k = 0, . . ., M. With this we can define M initial value problems

ẋ(t) = fi(t, x(t), u(t), p), t ∈ [tk, tk+1],
x(tk) = sk,

for k = 0, . . ., M − 1, whose numerical solutions are denoted by x(t; sk). By adding continuity conditions x(tk+1; sk) − sk+1 = 0, the original boundary value problem is then replaced by M initial value

problems together with continuity conditions. This allows us to discretize the trajectories of the states with sk for k = 0, . . ., M.

Having now parameterized both u(t) and x(t) by wj and sk, we can define a vector

y = [w0, . . ., wN−1, s0, s1, . . ., sM, p, ti]T (A.2)

which contains the discretized values together with the other optimization variables of the continuous optimal control problem. Furthermore, by imposing the constraints in Eqs. (3d)–(3f) only on the multiple shooting nodes, we can formulate a nonlinear optimization problem over the variables in the vector y:

min_y F(y), (A.3)

subject to:

g(y) ≥ 0, (A.4)
h(y) = 0, (A.5)

which is the discretized version of our original optimal control problem.

This problem could then be solved by a general-purpose sequential quadratic programming (SQP) solver. However, because the values wj and sk only have local influence, the underlying quadratic problems have specially structured matrices, which are heavily exploited by MUSCOD-II.

In our problem we used N = M = 20 and piecewise linear base functions for the control discretization. The discretized nonlinear optimization problem has 2450 variables, 1842 equality and 4951 inequality constraints. The dynamics of our model is integrated with an integrator tolerance of 10−5.
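The multiple shooting construction can be sketched in a few lines. The toy dynamics and fixed-step Euler integrator below are stand-ins for the rigid-body model and the adaptive integrator actually used; the point is the evaluation of the continuity defects x(tk+1; sk) − sk+1:

```python
import math

def f(t, x):
    return -x  # toy dynamics x' = -x, exact flow x(t) = x0 * exp(-t)

def integrate(x0, t_start, t_end, steps=100):
    # Fixed-step explicit Euler, standing in for an adaptive integrator.
    x, t = x0, t_start
    dt = (t_end - t_start) / steps
    for _ in range(steps):
        x += dt * f(t, x)
        t += dt
    return x

M = 4
nodes = [k * 0.5 for k in range(M + 1)]  # multiple shooting nodes t_k
s = [math.exp(-t) for t in nodes]        # state guess: the exact solution

# Continuity defects x(t_{k+1}; s_k) - s_{k+1}; an NLP solver drives these
# to zero while minimizing the objective.
defects = [integrate(s[k], nodes[k], nodes[k + 1]) - s[k + 1] for k in range(M)]
print(max(abs(d) for d in defects) < 1e-2)  # -> True
```

Because each sk only couples to its own interval, the constraint Jacobian is block-banded, which is exactly the structure an SQP solver such as the one in MUSCOD-II can exploit.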

Appendix B. Supplementary data

Supplementary data associated with this article can be found, in the online version, at http://dx.doi.org/10.1016/j.jocs.2012.10.001.

References

[1] A.P. Atkinson, W.H. Dittrich, A.J. Gemmell, A.W. Young, Emotion perception from dynamic and static body expressions in point-light and full-light displays, Perception 33 (2004) 717–746.

[2] G.A. Bliss, Lectures on the Calculus of Variations, Phoenix Science Series, Univ. of Chicago Press, Chicago, 1976.

[3] S.H. Collins, P.G. Adamczyk, A.D. Kuo, Dynamic arm swinging in human walking, Proceedings of the Royal Society B: Biological Sciences 276 (2009) 3679–3688.

[4] A.R. Damasio, B.J. Everitt, D. Bishop, The somatic marker hypothesis and the possible functions of the prefrontal cortex [and discussion], Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences 351 (1996) 1413–1420.

[5] C. Darwin, The Expression of the Emotions in Man and Animals, John Murray, London, 1872.

[6] P. Ekman, W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, 1978.

[7] P. Ekman, W.V. Friesen, Constants across cultures in the face and emotion, Journal of Personality and Social Psychology 17 (1971) 124–129.

[8] B. de Gelder, Towards the neurobiology of emotional body language, Nature Reviews Neuroscience 7 (2006) 242–249.

[9] J. Koschorreck, K. Mombaur, Optimization of somersaults and twists in platform diving, Computer Methods in Biomechanics and Biomedical Engineering 12 (2009).

[10] D. Leineweber, I. Bauer, A. Schäfer, H. Bock, J. Schlöder, An efficient multiple shooting based reduced SQP strategy for large-scale dynamic process optimization (Parts I and II), Computers and Chemical Engineering 27 (2003) 157–174.

[11] P. de Leva, Adjustments to Zatsiorsky–Seluyanov's segment inertia parameters, Journal of Biomechanics 29 (9) (1996) 1223–1230.

[12] C.K. Liu, A. Hertzmann, Z. Popovic, Learning physics-based motion style with nonlinear inverse optimization, ACM Transactions on Graphics 24 (2005) 1071–1081.

[13] K. Mombaur, A. Truong, J.P. Laumond, From human to humanoid locomotion – an inverse optimal control approach, Autonomous Robots 28 (2010) 369–383, doi:10.1007/s10514-009-9170-7.


[14] L. Omlor, M.A. Giese, Extraction of spatio-temporal primitives of emotional body expressions, Neurocomputing 70 (2007) 1938–1942.

[15] A. Ortony, T.J. Turner, What's basic about basic emotions?, Psychological Review 97 (1990) 315–331.

[16] F. Pfeiffer, C. Glocker, Multibody Dynamics with Unilateral Contacts, Wiley Series in Nonlinear Science, Wiley, New York, 1996.

[17] R. Plutchik, The Emotions: Facts, Theories, and a New Model, Random House, New York, 1962.

[18] C.L. Roether, L. Omlor, A. Christensen, M.A. Giese, Critical features for the perception of emotion from gait, Journal of Vision 9 (2009), http://www.journalofvision.org/content/9/6/15.full.pdf+html

[19] G. Schultz, K. Mombaur, Modeling and optimal control of human-like running, IEEE/ASME Transactions on Mechatronics 15 (2010) 783–792.

[20] N.F. Troje, Decomposing biological motion: a framework for analysis and synthesis of human gait patterns, Journal of Vision 2 (2002), http://www.journalofvision.org/content/2/5/2.full.pdf+html

[21] P.B. Wieber, F. Billet, L. Boissieux, R. Pissard-Gibollet, The HuMAnS toolbox, a homogeneous framework for motion capture, analysis and simulation, Technical Report, INRIA Rhône-Alpes, 2006.

Martin L. Felis received his diploma in mathematics with specialization in optimal control from the University of Heidelberg in 2009. He has since then been a PhD student of the Heidelberg Graduate School of Mathematical and Computational Methods for the Sciences and a member of the research group Optimization in Robotics and Biomechanics. His research interests include multi-body dynamics, screw theory, animation, and optimal-control methods.

Katja Mombaur has been a professor at the Interdisciplinary Center for Scientific Computing (IWR) at the University of Heidelberg since 2010, and leads the research group on Optimization in Robotics & Biomechanics as well as the IWR Robotics Lab. She also holds associate researcher status at LAAS-CNRS Toulouse, where she spent two years as a visiting researcher in 2008–2010. She studied Aerospace Engineering at the University of Stuttgart, Germany, and the ENSAE in Toulouse, France, and got her Diploma in 1995. For the next two years she worked at IBM Germany. She received her PhD degree in Applied Mathematics from the University of Heidelberg, Germany, in 2001. In 2002, she was a postdoctoral researcher at Seoul National University, South Korea. From 2003 to 2008, she worked as a lecturer and researcher at IWR, University of Heidelberg. Her research interests include modeling of complex mechanical systems and cognitive processes in biomechanics and robotics, as well as optimization and control techniques for fast motions.

Hideki Kadone received his diploma in information science and technology with specialization in mechano-informatics from the University of Tokyo in 2008. He was then a postdoctoral researcher at the Laboratory of Physiology of Perception and Action at the Collège de France and is now a researcher at the Center for Cybernics Research at the University of Tsukuba. His research interests include emotion and gaze in locomotion and their application to robotic systems.

Alain Berthoz is currently honorary professor at the Collège de France, member of the French Academy of Sciences and Academy of Technology, the Academia Europaea, the American Academy of Arts and Sciences, and other academies (Belgium, Bulgaria). He was the founder and director of the Laboratory of Physiology of Perception and Action of CNRS/Collège de France. He is an engineer, psychologist and neurophysiologist and has made his career at the CNRS as a scientist in the field of cognitive neuroscience. He is an author of more than 250 papers in international journals and the author of several books, such as The Brain's Sense of Movement (Harvard Univ. Press), Emotion and Reason (Oxford Univ. Press), and Principles of Simplexity (Yale Univ. Press, 2011).