Modeling Expressive Character Motion for
Narrative and Ambient Intelligence
Based on Emotion and Personality by
Wen-Poh Su
MS, MFA, Adv.Grad.Cert.IMD, BA (1st Hons)
A Dissertation
Submitted in Fulfilment of the Requirements for the Degree of
Doctor of Philosophy
School of Software Engineering and Data Communication
Faculty of Information Technology
Queensland University of Technology
Brisbane, Australia
December 2007
Keywords
Animated agent modeling, narrative intelligence, ambient intelligence, fuzzy logic,
human behaviour modeling, personality, emotion, animation, cartoon, character
appearance
Abstract
Animated agent technology has developed rapidly to provide ubiquitous
psychological and functional benefits for fulfilling communicative goals. However,
the character motions of most character-centred models, which rely on pre-stored
movements, finite state machines and scripted conditional logic, are generally restrictive.
The major drawback lies in the immaturity of techniques for integrating
personality, emotion and behaviour. To bridge the gap between cognitive and
behavioural elements, we examine the connections between human personality,
emotion, movement and cartoon modeling for agent design. Human personality
and emotional behaviour are essential to the recognition of a believable synthetic
character. Personality and emotion arise from the storylines and result in character
motions. Cartoon animations successfully engage audiences and create emotional
connections with spectators. However, even an experienced animator faces
a laborious task when simulating an emotion- and personality-rich character.
This thesis focuses on exploring effective techniques to extract personality and
emotion features for high-level control of character movements. A hierarchical
fuzzy rule-based system was constructed, in which personality and emotion were
mapped into the body's movement zones of a character. This enables agent
designers to control the personality and emotion of a dynamic synthetic character.
The system was then applied to a Narrative Intelligent system and extended to an
Ambient Intelligent environment. An innovative storyboard-structured storytelling
method was devised, using story scripts and action descriptions in a form similar to
the content description of storyboards to predict specific personality and emotion. As
software or device agents evolve into Ambient Intelligence, new concepts for
effective agent presentation and delegated control are necessary to minimise
human tasks and interventions in complex and dynamic environments. A novel
customisable personalised agent framework was developed by utilising the spirit of
cartoon animation to match each user's profile in the form of a cartoon reciprocal
agent. As a result, users can explicitly modify personality and emotion values to
change the psychological traits of the agent, which affects its appearance and
behaviour through body posture expression.
An evaluation of the system was conducted to verify its effectiveness and
applicability in both Narrative and Ambient Intelligent agent frameworks. The
significance of this research is that applying higher cognitive factors to animated
characters can lead to better animation design tools and reduce strenuous animation
production efforts in agent design. It will also enable animated characters to
embody more adaptive, flexible and stylised performances.
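To give a concrete flavour of the kind of mapping the abstract describes, the sketch below shows a two-input, one-output fuzzy rule evaluation in Python. It is purely illustrative and not the thesis's actual rule base: the triangular membership shapes, the rule weights (0.9 for "wide", 0.2 for "narrow") and the function names are assumptions chosen for the example. It does use centre-of-average (COA) defuzzification, one of the methods listed in the thesis's abbreviations.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def kinesphere_expansion(extraversion, happiness):
    """Map two inputs in [0, 1] to a kinesphere-expansion value in [0, 1]
    via two hypothetical fuzzy rules, defuzzified by centre of average (COA)."""
    # Membership degrees for 'high' and 'low' on each input
    # (a half-triangle ramp rising linearly from 0 at x=0 to 1 at x=1).
    e_high = tri(extraversion, 0.0, 1.0, 2.0)
    h_high = tri(happiness, 0.0, 1.0, 2.0)
    e_low, h_low = 1.0 - e_high, 1.0 - h_high

    # Rule 1: IF extraversion is high AND happiness is high THEN expansion is wide (0.9)
    # Rule 2: IF extraversion is low  OR  happiness is low  THEN expansion is narrow (0.2)
    w1 = min(e_high, h_high)   # AND -> min
    w2 = max(e_low, h_low)     # OR  -> max

    if w1 + w2 == 0.0:
        return 0.5  # no rule fires; fall back to a neutral posture
    # Centre of average: weighted mean of the rule output centres.
    return (w1 * 0.9 + w2 * 0.2) / (w1 + w2)
```

A hierarchical controller like the one in the thesis would chain several such stages (personality layer feeding a behaviour layer feeding motion parameters); this sketch shows only a single stage.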
Associated Publications
Journal paper
Su, W., B. Pham, and A. Wardhani, "Personality and Emotion-based High-
level Control of Affective Story Characters," IEEE Transactions on Visualization
and Computer Graphics (IEEE TVCG), March/April, vol. 13 (2), pp. 281-293,
2007.
Two Accepted Conference Papers
Su, W., B. Pham, and A. Wardhani, "Generating Believable Personality-Rich
Story Characters Using Body Language," In Proceedings of the Fourth
International Conference on Active Media Technology (AMT06), 7-9 June 2006,
Y. Li, M. Looi and N. Zhong, Eds. Brisbane, AU: IOS Press, Advances in
Intelligent IT: Active Media Technology 2006, pp. 132-137.
Su, W., B. Pham, and A. Wardhani, "High-level Control Posture of Story
Characters Based on Personality and Emotion," In Proceedings of the Second
Australian Interactive Entertainment Conference (IE05), 23-25 Nov. 2005, Y.
Pisan, Ed. Sydney: ACM Digital Library, pp. 179-186.
Table of Contents
CHAPTER 1 INTRODUCTION..................................................................... 1
1.1 Background and Motivation ...................................................................................................2
1.2 Aims of Research ...................................................................................................................4
1.3 Research Questions ................................................................................................................5
1.4 Significance and Potential Applications.................................................................................7
1.5 Thesis Outline.........................................................................................................................9
CHAPTER 2 ANIMATED AGENTS........................................................... 11
2.1 Animated Agent Representation...........................................................................11
2.1.1 Agent Functionality and Capability ............................................................12
2.1.2 Agent Modality ............................................................................................12
2.2 Agent Design in Narrative Intelligence ................................................................16
2.2.1 Narrative Intelligence...................................................................................16
2.2.2 Narrative Agent Design ................................................................................19
2.3 Agent Design in Ambient Intelligence .................................................................24
2.3.1 Ambient Intelligence .....................................................................................24
2.3.2 Ambient Intelligent Agent Design .................................................................27
2.4 Agent Presentation Methods.................................................................................................29
2.5 Techniques for Motion Control ............................................................................30
2.5.1 Principles of Animation ................................................................................31
2.5.2 Character Motion Techniques ......................................................................35
2.6 Summary ..............................................................................................................................40
CHAPTER 3 EMOTIONAL BEHAVIOUR AND SOCIAL
INTERACTION .................................................................................................. 44
3.1 Personality ............................................................................................................45
3.1.1 Psychological Research of Personality ........................................................45
3.1.2 Personality Models........................................................................................51
3.2 Emotion ................................................................................................................58
3.2.1 Psychological Models of Emotion ................................................................59
3.2.2 Emotion Models............................................................................................63
3.3 Nonverbal Behaviour............................................................................................66
3.3.1 Facial Expression and Gesture ....................................................................67
3.3.2 Posture .........................................................................................................70
3.3.3 Visual Appearance .......................................................................................74
3.4 Summary ..............................................................................................................................81
CHAPTER 4 FRAMEWORK OF ANIMATED AGENT.......................... 84
4.1 System Framework...............................................................................................................84
4.2 Parameters of the P & E Engine ...........................................................................87
4.2.1 Personality Parameters .................................................................................87
4.2.2 Personality Description Method....................................................................90
4.2.3 Behaviour Parameters .................................................................................100
4.2.4 Emotion Parameters ....................................................................................101
4.3 Summary ............................................................................................................................102
CHAPTER 5 CHARACTER PERSONALITY AND EMOTION
ENGINE .............................................................................................................. 103
5.1 Personality and Emotion Engine ........................................................................103
5.1.1 Fuzzy Logic Controller for P & E Engine ..................................................104
5.1.2 Mapping Postural Zones to Body Movement..............................................117
5.1.3 Mapping the P & E Engine to the Animation and Graphic Engine............118
5.2 P & E Engine for Walk Cycle Control ...............................................................123
5.2.1 Example 1 ...................................................................................................123
5.2.2 Example 2 ...................................................................................................125
5.3 Summary ............................................................................................................................129
CHAPTER 6 THE P & E ENGINE FOR STORY CHARACTER ......... 131
6.1 Requirements for Acting Believability of Story Characters ...............................133
6.1.1 Story Character Roles.................................................................................133
6.1.2 Semantic Action Plan..................................................................................134
6.1.3 The Driver of Story Characters' Behaviours: Personality and Emotion.....135
6.1.4 Nonverbal Behaviour: Body Language ......................................................136
6.2 P & E Engine for Story Character Control .........................................................136
6.2.1 Mapping Personality to Story Role.............................................................137
6.2.2 Story Input Module .....................................................................................140
6.2.3 The Utilisation of the P & E Engine ...........................................................148
6.2.4 Mapping Body Language to the Animation and Graphic Engine ..............151
6.3 Analysis of Results .............................................................................................................152
6.4 Summary ............................................................................................................................154
CHAPTER 7 THE P & E ENGINE FOR PERSONALISED AMBIENT
INTELLIGENT AGENTS ............................................................................... 156
7.1 Modeling Personalised Agents ...........................................................................................157
7.2 Preliminary Survey.............................................................................................................161
7.3 The P & E Engine Used in the Personalised Agent System ...............................162
7.3.1 User Profile Module ....................................................................................163
7.3.2 Personalised Agent Module ........................................................................171
7.3.3 Customisation Function...............................................................................179
7.3.4 Mapping Personalised Agent Results to the Animation and Graphics Engine...........181
7.4 Analysis of Results .............................................................................................................184
7.5 Summary ............................................................................................................................185
CHAPTER 8 EVALUATION OF MODELING AUTONOMOUS
ANIMATED AGENTS..................................................................................... 188
8.1 Objectives of Evaluation ....................................................................................................188
8.2 Subjects, Apparatus and Procedures...................................................................................189
8.3 Evaluation of Autonomous Animated Characters ..............................................194
8.3.1 Emotion.......................................................................................................195
8.3.2 Personality..................................................................................................197
8.4 Evaluation of the Usefulness of the P & E Engine .............................................................208
8.5 Evaluation of the Personalised AmI Agent ........................................................211
8.5.1 User Preference ..........................................................................................211
8.5.2 Representation of the Personalised AmI Agent Interface...........................213
8.5.3 Customisation Function of the Personalised Agent Interface.....................218
8.6 Preliminary Evaluation for Character Development...........................................................220
CHAPTER 9 CONCLUSION AND FUTURE WORK ............................ 222
9.1 Summary of Achievements ................................................................................................222
9.2 Limitations..........................................................................................................................227
9.3 Future Work .......................................................................................................229
9.3.1 Improvements in Current Modules .............................................................229
9.3.2 Utilising the P & E Engine in Potential Applications.................................230
APPENDICES ................................................................................................ 236
APPENDIX A. FUZZY LOGIC CONTROL SYSTEM..........................................................236
APPENDIX B. AB5C MODEL ...............................................................................................265
APPENDIX C. SIX DEVISED SCENARIOS .........................................................................266
APPENDIX D. THIRTY-TWO PERSONALITY TYPES ......................................................272
APPENDIX E. USER STUDIES .............................................................................................289
REFERENCES ................................................................................................ 313
List of Figures
Figure 2.1: Body and Mind of Character Capability ............................................ 13
Figure 2.2: Neuro Baby......................................................................................... 14
Figure 2.3: REA Project........................................................................................ 15
Figure 2.4: Virtual Theatre.................................................................................... 21
Figure 2.5: Virtual Babyz and Pets ....................................................................... 22
Figure 2.6: Body Chat Project............................................................................... 23
Figure 2.7: SAM Project ....................................................................................... 24
Figure 2.8: Squash and Stretch (Blair 1995)......................................................... 33
Figure 2.9: Follow-Through and Overlapping Action (Blair 1995) ..................... 34
Figure 2.10: The Process of Motion Capture ........................................................ 41
Figure 3.1: BEAT Project ..................................................................................... 69
Figure 3.2: Three Main Zones of Shaping (Lamb & Watson 1979)..................... 73
Figure 3.3: An Example of a Person's Inclination................................................ 74
Figure 3.4: Sheldon's Somatotypes ...................................................................... 75
Figure 3.5: Cartoon Character Types (Blair 1995) ............................................... 79
Figure 3.6: Pugnacious and Cute Characters (Blair 1994).................................... 80
Figure 3.7: Screwball and Goofy Characters (Blair 1994) ................................... 81
Figure 4.1: A Schematic Diagram of Our Conceptual Model .............................. 85
Figure 4.2: The Utilisations of the P & E Engine in the Realms of Narrative
Intelligence and Ambient Intelligence .................................................................. 87
Figure 5.1: Overview of System Architecture .................................................... 104
Figure 5.2: Two Layered Fuzzy Controller of the P & E Engine ...................... 106
Figure 5.3: Personality Fuzzy Controller of the P & E Engine (Shaded Area) .. 107
Figure 5.4: OCEAN Membership Function........................................................ 108
Figure 5.5: Triangular versus Trapezoid Membership Function with Equal Base-
Widths ................................................................................................................. 108
Figure 5.6: Membership Functions for the Output BehaviourType .................... 108
Figure 5.7: Emotion Fuzzy Controller of the P & E Engine (Shaded Area)....... 113
Figure 5.8: Membership Function of Happy....................................................... 113
Figure 5.9: Curved and Triangular Membership Functions with Equal Base-
Widths ................................................................................................................. 114
Figure 5.10: Membership Functions for the Horizontal Output ......................... 115
Figure 5.11: Basic Configuration of the Hierarchical Fuzzy Logical Control.... 117
Figure 5.12: The Control Panel of the P & E Engine ......................................... 119
Figure 5.13: The Devised Model in Maya Interface ........................................... 120
Figure 5.14: A Snapshot of MEL Expression Editor .......................................... 120
Figure 5.15: Hierarchical Joint Skeletons ........................................................... 121
Figure 5.16: A Snapshot of Motion Parameters in Properties Editor ................. 121
Figure 5.17: Happy Walking of Two Different Personality Characters ............. 124
Figure 5.18: Walking of Different Emotional States .......................................... 124
Figure 5.19: Personality Description of Type8: O+, C-, E+, A+, N-.................. 125
Figure 5.20: Example 2: The Comparison of Different Personality and Emotion
............................................................................................................................. 127
Figure 5.21: Personality Description of Type30: O+C-E+A-N+........................ 128
Figure 5.22: (a) Neutral walking (b) Angry walking (c) Sad walking (d)
Fearful walking ................................................................................................... 128
Figure 6.1: Overview of System Architecture .................................................... 137
Figure 6.2: Layout of the Story Module ............................................................. 143
Figure 6.3: A Snapshot of the Story Interface..................................................... 143
Figure 6.4: Hierarchical Fuzzy Controller of P & E Engine............................... 149
Figure 6.5: Membership Functions for the Output Character Type.................... 150
Figure 6.6: Animation and Graphics Engine Architecture.................................. 152
Figure 6.7: The Comparison of Two Different Scenarios .................................. 154
Figure 7.1: Overview of the Personalised Agent System ................................... 163
Figure 7.2: Interface of User Profile ................................................................... 165
Figure 7.3: Personalised Agent Module.............................................................. 172
Figure 7.4: The Interface of Personalised Agent ................................................ 174
Figure 7.5: Membership Function of Cute Type................................................. 177
Figure 7.6: Membership Function of Head Shape .............................................. 178
Figure 7.7: Membership Function of Head Size ................................................. 179
Figure 7.8: Reciprocal Connection between Human and Computer................... 180
Figure 7.9: The Animation and Graphics Engine ............................................... 182
Figure 7.10: Original Models.............................................................................. 182
Figure 7.11: Some Examples of Agent Appearances ......................................... 182
Figure 7.12: Agent's Appearance Control ........................................................... 183
Figure 7.13: The Comparison of Two Personalised Agents with Two Different
Personalities: (a) (O+C-E+A+N-) and (b) (O-C+E-A+N+)................................ 186
Figure 8.1: User Interface Layout of the P & E Engine ...................................... 190
Figure 8.2: Screenshot of User Interface during the Effectiveness Interaction .. 191
Figure 8.3: Sample Screenshot of User Interface during the Believability
Interaction of Question 6 to Question 14 ............................................................ 191
Figure 8.4: The Visualisation Platform during the Demonstration..................... 192
Figure 8.5: The Age of Participants .................................................................... 192
Figure 8.6: Academic Levels in Animation Degree of Participants ................... 193
Figure 8.7: The Most Difficult Process in Animation Production ...................... 193
Figure 8.8: Happy and Sad.................................................................................. 196
Figure 8.9: Angry and Fearful............................................................................. 196
Figure 8.10: Sad and Disgusted .......................................................................... 196
Figure 8.11: A Snapshot of the Sad Variance..................................................... 199
Figure 8.12: A Snapshot of the Surprised Variance............................................ 199
Figure 8.13: A Snapshot of the Disgusted Variance ........................................... 200
Figure 8.14: A Snapshot of the Angry Variance................................................. 201
Figure 8.15: A Snapshot of the Happy Variance ................................................ 201
Figure 8.16: A Snapshot of the Fearful Variance ............................................... 202
Figure 8.17: Story G............................................................................................ 207
Figure 8.18: Story H............................................................................................ 207
Figure 8.19: Clip G and Clip H........................................................................... 208
Figure 8.20: A Screenshot of the Personality Test Interface .............................. 214
Figure 8.21: A Screenshot of Personalised Agent Interface ............................... 214
Figure 8.22: Type 3: O+C+E+A+N-; Cartoon type: Cute-Normal..................... 216
Figure 8.23: Type 2: O+C+E+A+N+; Cartoon type: Cute ................................. 216
Figure 8.24: Type 22: O+C-E+A+N+; Cartoon type: Screwball........................ 217
Figure 8.25: Type 28: O-C-E+A-N-; Cartoon type: Pugnacious ........................ 217
Figure 8.26: Type 10 O-C+E-A+N+; Cartoon type: Goofy................................ 217
Lists of Tables
Table 2.1: New Principles of Animation (Kerlow 2004)...................................... 32
Table 2.2: Principles of Animation and Related Physical Parameters.................. 36
Table 3.1: NEO-PI Facets (John & Srivastava 1999) ........................................... 48
Table 3.2: Basic Characteristics of the FFM Model (McCrae & Costa 1987)...... 49
Table 3.3: A Part of The Abridged Big Five Circumplex Model (De Raad 2000)51
Table 3.4: The Comparison of Trait Theories....................................................... 52
Table 3.5: The Level of Personality Descriptors and Corresponding Personality
Types (Howard and Howard, 1995)...................................................................... 55
Table 3.6: Dimensions of Personality (Rousseau 1996) ....................................... 57
Table 3.7: Roseman's Model ................................................................................ 61
Table 3.8: Chosen Action in Guye-Vuilleme's Multi-User Virtual Environment
Research, Classified by Posture/Gesture and Part of the Body (Guye-Vuilleme et
al. 1998) ................................................................................................................ 70
Table 3.9: Sheldon's Three Main Somatotypes .................................................... 76
Table 3.10: Self-Description Test (Knapp 1980).................................................. 78
Table 3.11: Adjectives for Self-Description Test (Knapp 1980) .......................... 79
Table 3.12: Comparison of Sheldon's Somatotypes and Cartoon Types ............. 82
Table 4.1: Parts of the AB5C Model (De Raad 2000) .......................................... 89
Table 4.2: Our Evaluated Personality Combinations of the Perceiving Process .. 92
Table 4.3: Our Evaluated Personality Combinations of the Reasoning Process... 93
Table 4.4: Our Evaluated Personality Combinations of the Learning Process ..... 94
Table 4.5: Our Evaluated Personality Combinations of the Acting Process......... 95
Table 4.6: Our Evaluated Personality Combinations of the Deciding Process..... 96
Table 4.7: Our Evaluated Personality Combinations of the Interacting Process .. 97
Table 4.8: Our Evaluated Personality Combinations of the Revealing Process ... 98
Table 4.9: Our Evaluated Personality Combinations of the Feeling Emotions
Process .................................................................................................................. 99
Table 4.10: Our Behaviour Parameters............................................................... 101
Table 5.1: The Result of the 32 Personality Types ............................................. 112
Table 5.2: Output Variables of Emotion FLC..................................................... 113
Table 5.3: The Relationship of Behaviour and Kinesphere Zones ..................... 115
Table 5.4: Emotions and Four Main Areas of Body Movement......................... 118
Table 5.5: Emotions and Physical Parameters .................................................... 118
Table 5.6: Motion Parameters Related to Three Kinesphere Zones and Timing 123
Table 6.1: Basic Scenario Structure .................................................................... 141
Table 6.2: Proposed Scenario 4........................................................................... 142
Table 6.3: Options of Scene 1 and 3 for Users ................................................... 145
Table 6.4: Postures, Actions and Movement of Body Parts related to the Meaning
of Body Language............................................................................................... 147
Table 6.5: Examples of Gestures ........................................................................ 148
Table 6.6: Some Examples of Body Language and Four Main Areas of Body
Movement ........................................................................................................... 151
Table 6.7: The Comparison of Scene 3............................................................... 153
Table 7.1: Personality Test (Pervin, Cervone & John 2005) .............................. 164
Table 7.2: Characteristic Personality Pattern (Howard & Howard 1995) .......... 166
Table 7.3: The Five Behavioural Styles.............................................................. 167
Table 7.4: A Descriptive Comparison of Ectomorphic Type (Sheldon 1940) and
Goofy Type (Blair 1994)..................................................................................... 175
Table 7.5: Cartoon Appearance Rules ................................................................ 178
Table 7.6: The Result of Cartoon Appearance.................................................... 184
Table 8.1: The Emotional Evaluation ................................................................. 195
Table 8.2: The Emotional Evaluation ................................................................. 198
Table 8.3: Story G............................................................................................... 204
Table 8.4: Story H............................................................................................... 205
Table 8.5: The Personality Evaluation ................................................................ 205
Table 8.6: The Suitability Evaluation ................................................................. 206
Table 8.7: Z-score Distribution of Q16 and Q17 ................................................ 206
Table 8.8: The Usefulness of the P & E Engine ................................................. 209
Table 8.9: Z-score Distribution of Q18 to Q25.................................................... 210
Table 8.10: The User Preference Evaluation ...................................................... 212
Table 8.11: Z-score Distribution of Q26............................................................. 212
Table 8.12: The User Preference Evaluation ...................................................... 212
Table 8.13: The Result of Participants' Personality Test..................................... 215
Table 8.14: The Results of Subjects' Personalised Agent ................................... 216
Table 8.15: The Effectiveness of the Personalised Agent Framework ............... 218
Table 8.16: Z-score Distribution of Q31 to Q33................................................. 218
Table 8.17: The Effectiveness of the Customisation Function ........................... 219
Table 8.18: Z-score Distribution of Q34 to Q36................................................. 219
Table 8.19: The Preliminary Evaluation for Future Development ..................... 221
Table 8.20: Z-score Distribution of Q37 to Q41................................................. 221
Abbreviations
2D Two-dimensional
3D Three-dimensional
AB5C Abridged Big Five Circumplex
AI Artificial Intelligence
AmI Ambient Intelligence
COG Centre of Gravity
COA Centre of Average
COS Centre of Sums
ECA Embodied Conversational Agent
FFM Five-Factor Model
HCI Human Computer Interaction
LOM Largest of Maximum
LMA Laban Movement Analysis
Mocap Motion Capture
MOM Mean of Maximum
NEO-PI NEO Personality Inventory
NI Narrative Intelligence
OCEAN Openness, Conscientiousness, Extraversion, Agreeableness and
Neuroticism
OCC Ortony, Clore, and Collins
SRCT Social Reactions to Communication Technology
SOM Smallest of Maximum
Statement of Original Authorship
The work contained in this thesis has not been previously submitted for a degree
or diploma at this or any other higher education institution. To the best of my
knowledge and belief, the thesis contains no material previously published or
written by another person except where due reference is made.
Signed: Date:
Acknowledgements
Many wonderful people helped me with the research for this thesis and deserve
my sincerest appreciation. First thanks go to my principal supervisor, Professor
Binh Pham, for her enormous insight, extra patience and steady guidance, which
have ensured the completion of this thesis. I would also like to thank my
associate supervisors: Dr. Aster Wardhani, for her guidance during the first two
years, and Dr. Dian Tjondronegoro, for his support and constructive advice in the
last year of my PhD journey.
Thanks to Queensland University of Technology, which has provided me with a
pleasant working environment along with all the facilities, resources, travel
allowances and services. Thanks to all the staff who have helped me with many things.
Thanks to Professor Tao-I Hsu for his support of my system evaluation. Thanks
to the students and colleagues of the Department of Digital Multimedia Arts at
Shih Hsin University who participated in the evaluation.
Thanks to Dr Rose Brown for her constructive criticism as a panel member, and
to my friends Mimi, Darren, Miriam, Sam and Charles for their encouragement,
advice and for sharing their PhD experiences. Thanks to my old colleagues,
Ming, Nicole and Jessica, for their belief in me.
My foremost gratitude goes to my dear husband, Kuang-Yuan, who always
believes in me and who supported us financially, allowing me to concentrate on
my study. I would also like to thank my loving sister and my lovable son and
daughter for their love, enthusiasm and inspiration, which give meaning to this
PhD journey. Finally, I would like to express my deepest gratitude to my mum
and dad for their endless love and encouragement.
CHAPTER 1 INTRODUCTION
Animated agent technology has been rapidly developed, and subsequently a trend
in industry and academic sectors has emerged whereby animated agent technology
is applied to support users in learning, entertainment and work. As computing
infrastructure becomes faster, more compact and more affordable everywhere, it is
desirable to deploy animated agents on handheld devices via mobile, PDA,
internet and other omnipresent technologies to ubiquitously provide psychological and
functional benefits to fulfill communicative goals. Artificial Intelligence (AI) has
focused on multiagent interactions for problem solving to support human tasks.
Some researchers have devised intelligent text-based agents to improve the
human-computer relationship. Many AI tools are designed to create intelligent
behaviour or model human cognition. However, AI has not been traditionally
concerned with modeling expressive agent motions that can embody distinctive
personalities and engage in nonverbal communication. Some agent-related
research focuses on creating a believable agent. However, creating character
believability involves not only creating intelligence or realism but also providing
an affective connection between humans and agents. This research is inspired by the
artistic nature of modeling animated characters. Believable agents that can
display personality and emotion-rich nonverbal behaviours and engage in social
interactions are critical to a sophisticated agent design. In this research, the aim is
to create a new tool that provides agent designers with some artistic freedom.
In the past decade, the task of creating lively animated characters was labour
intensive in agent-related research, film and game industries. The design of a
character involves the visual interpretation of a story and the type of emotion it
contains. Animators translate the personality of characters into facial animations,
gestures and motions. To date, even a highly trained animator often faces
difficulties in simulating an emotion- and personality-rich character. Therefore, the
tools that have been designed for film or game purposes are not sufficient for the
agent design. Hence, new tools and methodologies for modeling autonomous
animated agents based on personality and emotion control are needed to support
agent designers in emulating animated agents and to enhance the believability of
agent presentation.
In this opening chapter, the overall background, motivation, major research
questions, and aims of the research will be introduced. The significance of the
chosen approaches and possible applications is identified. The thesis outline is
summarised at the end of the chapter.
1.1 Background and Motivation
The research motivation regarding modeling expressive agent motions derives
from four domains of agent research analysis: Narrative Intelligence (NI), Human
Motion, Animation, and Ambient Intelligence (AmI) research.
Narrative Intelligence Research
Narrative Intelligence encompasses areas such as virtual theatre, interactive
drama and immersive storytelling, which provide a variety of nonlinear storytelling
techniques. These interactive storytelling systems allow users to make different
requests and follow a variety of possible sequences that have multiple beginnings
and endings. Character expression is the essence of believability in a character-
centric storytelling system. Personality and emotion trigger the expressiveness
and behaviours of a character being conveyed on the stage. The temperament of a
character represents the mind and cognition of the synthetic entity. Researchers
have endeavoured to enhance the believability of autonomous agents. Some
emotion-related motion models have been constructed. However, the character
motions of most character-centred story models based on pre-stored movement,
finite state machine and scripted conditional logic are generally restrictive. The
major drawback lies in the lack of maturity in the integration of reasonable
emotional behaviour and the elements of human personality.
Human Motion and Psychological Research
Human personality, emotional behaviour and body language are essential to
the recognition of a believable synthetic agent. An intelligent agent shall
possess unique characteristics and, furthermore, adapt to users' characteristics to
perform various tasks. With the intent of modeling autonomous animated
characters, new techniques for supporting agent design including the analysis
method of psychological factors, character movements, appearances and
behaviours will be investigated for a suitable conceptual support.
Animation Research
Animations tell stories and communicate emotions in screenplays, storyboards,
and games. Animated characters work well by dramatising and caricaturing
realistic movements and appearances. Popular animated characters successfully
engage the audience and create an emotional connection with the spectators.
However, traditional computer animation is typically not interactive. Additionally,
to create an animated agent, a sophisticated animator needs to understand the
story and how a character's personality fits into the story. An animator must have
knowledge of the elements of a screenplay. These elements determine the
personalities, expressions and actions of the characters. It is very difficult for
people with no technical animation knowledge to produce engaging character
models or animations. Although research activities in interactive agent design
have progressed in recent years, limitations remain. Procedural animation uses a
set of procedures and rules to control motion and can create motion quickly, but it
is poor at stylised movement and does not address appearance automation. Motion
capture techniques capture motions live, and the characters are animated in real
time. The resulting motions are realistic; however, in many instances motion
capture makes a non-realistic character look too natural, insufficiently exaggerated
and spiritless. It is also expensive and difficult to obtain a specific motion, and
some captured motions are hard to reuse. In particular, motion capture is difficult to combine with
cartoon models. The utilisation of cartoon modeling for animated agents and an
affective computational method for cartoon character construction has yet to
emerge.
Ambient Intelligence Research
Ambient Intelligence (AmI) denotes a paradigm that focuses on improving human
quality of life by integrating intelligent electronic devices transparently to the
presence of people. The vision of AmI is to situate human needs central to
technology development. As software or device agents evolve into the AmI, new
concepts for agent presentation and delegated control are necessary to minimise
human tasks and interventions in complex and dynamic environments.
Most of the current AmI research, however, focuses on how to embed intelligence
and functionality into the human environment without considering individual
information needs and diverse preferences. It is important to reduce the repetitive
and meaningless uniform actions of an agent. A personalised agent design based
on individual preference that can adapt itself to users characteristics and can be
customised to perform differently for various goals is required to increase the
affective interpersonal connection as an integral part of the Ambient Intelligent
environment.
1.2 Aims of Research
This research focuses on exploring effective techniques for extracting personality
and emotion features for a high-level control of character movements. To
examine the applicability of the developed high-level controlling mechanism, new
techniques for the analysis and support of the conceptual design of controlling
animated agents that can be applied for the realms of Narrative Intelligence and
Ambient Intelligence will be investigated.
The primary aims are to develop and implement:
(1) A new postural motion control method for animated agents
A personality model that integrates psychology and human behaviour theories
to support agent design
A Personality and Emotion (P & E) Engine which utilises a high-level
control mechanism, allowing designers to express their design intent through
personality and emotion specification for controlling character motions
(2) An innovative storyboard-structured storytelling method for controlling animated story agents
A story agent that fully utilises the proposed P & E Engine
A story analysis method that provides insights into the characteristics of story
characters by analysing story scripts and body language descriptions
Techniques for analysing the compositive factors of personality and emotion
to provide a knowledge base for the motions of story characters
(3) A novel personalised agent framework for the Ambient Intelligent environment
A Personalised Ambient Intelligent Agent that fully utilises the proposed P &
E Engine
An affective computational method for cartoon character construction that
autonomously controls the cartoon appearance genres of animated agents
Methods for categorising meaningful idle activities and utilising customisation
functions that allow users to configure the personality of their agent
1.3 Research Questions
The following research questions will be investigated:
(1) Animated Agents in Narrative Intelligence
To examine the applicability of the developed high-level controlling mechanism, a
story agent framework for Narrative Intelligent environments is investigated. To
develop a better controlling mechanism for animated agents in Narrative
Intelligent environments, we have foremost considered the question: what makes
a great character? A great story character must have self-contradictory
personalities (Glassner 2004), and one or more admirable traits or emotional
reactions that are more extraordinary or exaggerated than those of other personae. Therefore, the
following questions are derived:
Is it possible to find relevant parameters to express personality and emotion?
Can they be categorised?
How do personality and emotion parameters affect human motion
categorisation?
Secondly, personality and emotion may vary as the story plot unfolds. The
transition of a persona must be considered while a synthetic character model is
designed in this study. Addressing the following questions will ensure that the
extensibility of a story character model is comprehensively considered:
How can a story character engine provide unpredictable possibilities?
How can a scalable model of a story characters motion be constructed?
In addition, a good story character must be attractive. In Disney animation,
animated characters are generally appealing. These characters are simply more
interesting due to their exaggerated motions, which heighten the strength of
emotion. In this regard, the following question provides guidance:
How can these animated characteristics be modelled to represent an animated
agent's motion?
(2) Animated Agents in Ambient Intelligence
To examine the applicability of the developed high-level controlling mechanism, a
personalised agent design framework for Ambient Intelligent environments is
investigated. To extend the functionalities of the research, we investigate issues
arising from the functional and social implications of interacting with an agent by
integrating Narrative Intelligence, psychology and cartoon construction into
Ambient Intelligent technology. The following questions are deliberated:
What kinds of agents have a better human-agent affective relationship?
How can the agent behaviour be devised based on psychological factors?
How can the main motion features of idle activity be categorised for
representing an agent's individuality?
Furthermore, the shapes of characters can be used to accentuate an aspect of the
character's personality. To explore the element of shape, the following questions
will be considered:
What elements does the appearance consist of?
Is it possible to integrate the cartoon rules for the appearance of animated
agents?
1.4 Significance and Potential Applications
This research examines the connections between human personality, emotion,
character modeling and movement used in a Narrative Intelligent system and an
Ambient Intelligent environment. The chosen approaches are significant steps
towards providing high-level motion control with insights into agents' cognitive
states. The approaches are outlined in this section.
(A) Analysing the compositive factors of personality and emotion to provide a knowledge base for character motions
By studying the parameters of psychological factors that characterise the body
language and posture of a real human, the P & E Engine, a hierarchical fuzzy
inference system for character postures, will be implemented. A number of
approaches that underlie the techniques, including fuzzy logic and Narrative
Intelligence, are examined. With psychology-based fuzzy rules, personalities and
emotions are mapped into the main body zones of an animated character that give
storytelling players or agent designers the chance to control synthetic characters
through high-level personality and emotion controlling mechanisms. The
variations resulting from the differences in the intensity of emotions are also
successfully displayed.
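As a concrete illustration of the fuzzy mapping described in (A), the following minimal Python sketch shows how a single personality trait and an emotion intensity could be combined by Mamdani-style rules into one posture parameter. The membership functions, rule outputs and the "chest openness" parameter are illustrative assumptions, not the actual rules of the P & E Engine.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def posture_openness(extraversion, joy):
    """Infer a 'chest openness' value in [0, 1] from extraversion and joy
    (both in [0, 1]) using two Mamdani-style rules and a weighted
    (centre-of-average) defuzzification."""
    # Rule 1: IF extraversion is high AND joy is high THEN posture is open (0.9)
    w1 = min(tri(extraversion, 0.4, 1.0, 1.6), tri(joy, 0.4, 1.0, 1.6))
    # Rule 2: IF extraversion is low OR joy is low THEN posture is closed (0.2)
    w2 = max(tri(extraversion, -0.6, 0.0, 0.6), tri(joy, -0.6, 0.0, 0.6))
    if w1 + w2 == 0:
        return 0.5  # no rule fires: neutral posture
    return (w1 * 0.9 + w2 * 0.2) / (w1 + w2)
```

In a hierarchical system such as the one described, the output of a rule layer like this would feed lower layers that drive individual body zones; varying the joy input then yields the graded posture changes the paragraph mentions.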
(B) Developing a storyboard-based storytelling method
We construct a story module to facilitate the body language control of a dynamic
story character by using story scripts and action descriptions in a form similar to
the content description of storyboards to predict specific personalities and
emotions of story characters. The story character can consistently perform
specific postures and gestures based on its personality type. Story designers can
devise a story context in our story interface that predictably motivates personality
and emotion values to drive the appropriate movements of the story characters.
(C) Facilitating the process of designing animated agents
In order to reduce the repetitive and meaningless actions of idle activities, we
devise an integrated agent summarisation scheme that classifies a character's body
language and behavioural inclinations. This classification provides comprehensive
behavioural descriptions and a detailed analysis of an animated agent. This study
produces scalable controlling schemes and provides non-repetitive motion
possibilities for animated agents. This can enhance the presentation of crowd
simulation by assigning a different personality to each character.
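The crowd-simulation idea in (C) can be sketched as follows: each crowd member receives its own OCEAN personality vector and selects idle behaviours biased by its dominant traits, so identical idle patterns are avoided. The trait threshold and the behaviour categories below are invented for illustration and are not the thesis's actual classification scheme.

```python
import random

# Hypothetical idle-behaviour categories keyed by the trait they express.
IDLE_BEHAVIOURS = {
    "extraversion": ["wave to passers-by", "look around eagerly"],
    "neuroticism": ["fidget with hands", "shift weight nervously"],
    "conscientiousness": ["check watch", "straighten posture"],
}

def make_agent(rng):
    """Assign a random OCEAN personality, each trait in [0, 1]."""
    return {t: rng.random() for t in
            ("openness", "conscientiousness", "extraversion",
             "agreeableness", "neuroticism")}

def pick_idle(agent, rng):
    """Choose an idle behaviour biased by the agent's dominant trait."""
    candidates = [t for t in IDLE_BEHAVIOURS if agent[t] > 0.5]
    if candidates:
        trait = max(candidates, key=lambda t: agent[t])
    else:
        trait = rng.choice(sorted(IDLE_BEHAVIOURS))
    return rng.choice(IDLE_BEHAVIOURS[trait])
```

Because each agent draws its own trait vector, two crowd members rarely share the same behaviour bias, which is the non-repetition property the paragraph describes.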
(D) Supporting the animation design process
The P & E Engine can represent the design intents of animators using personality
descriptive terms as a design guideline for a desirable animated character. This
can reduce strenuous animation production efforts in emulating animated agents
on the agents motion database of immersive storytelling, virtual theatre,
interactive computer games or virtual assistants displayed on a range of different
devices, such as PDAs and smart phones.
(E) Integrating cartoon construction for character appearance
To control different appearance genres that represent the characteristics of
character personalities, a personalised AmI agent that matches the user profile in
the form of a cartoon reciprocal agent is implemented. The appearance
parameters of a cartoon reciprocal agent are characterised based on the design
principles of traditional cartoon animation. The cartoon construction method can
be used to apply exaggerated characteristics to the character visual presentation
and provide more dramatic visual effects for agent design. It is expected that this
method can be utilised to create a character database for web agents or game
agents which allow users to choose a like-minded agent.
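To make the appearance idea in (E) concrete, the sketch below maps two personality traits to exaggerated cartoon body proportions, echoing the classic cartoon convention that round builds read as jolly and lanky builds as nervous. The trait-to-shape mapping and the exaggeration factor are assumptions for illustration, not the cartoon appearance rules developed in this thesis.

```python
def cartoon_proportions(extraversion, neuroticism, exaggeration=1.5):
    """Return head/torso/limb scale factors for a cartoon character.

    Extraverted characters get a rounder, larger build; neurotic
    characters a thinner, lankier one. Both traits are in [0, 1],
    with 0.5 producing neutral (1.0) proportions."""
    base = {"head": 1.0, "torso_width": 1.0, "limb_length": 1.0}
    base["torso_width"] += exaggeration * 0.3 * (extraversion - 0.5)
    base["head"] += exaggeration * 0.2 * (extraversion - 0.5)
    # Lanky convention: low neuroticism shortens limbs, high lengthens them.
    base["limb_length"] += exaggeration * 0.4 * (neuroticism - 0.5)
    return base
```

Raising the exaggeration factor pushes every proportion further from 1.0, which is one simple way to realise the "more dramatic visual effects" the paragraph mentions.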
(F) Creating a tool for training novice animators
The knowledge of the P & E Engine can be used to support conceptual animated
character design and to train novice animators in the pre-production process of
animation production as a character development tool.
1.5 Thesis Outline
Having introduced the research motivation and aims, the remainder of this thesis
is organised as follows.
Chapter 2 reviews related studies on agent representation, presentation methods
and character motion techniques. The discussion focuses on comparing agent
design approaches that are utilised in Narrative Intelligence and Ambient
Intelligence. This discussion provides a foundation for a promising approach
towards a high-level control of agent motions. The last part of the chapter
discusses approaches that aim to bridge the gaps between high-level and low-level
features of animated agents.
Chapter 3 summarises the characteristics of personality and emotion
psychological models, followed by some approaches that analyse human
movement and psychological methods for non-verbal behaviour analysis. This
will be developed to acquire the theoretical models applied in the study. As the
basis for generating personality and emotional behaviour, the challenges in
modeling the personality schemes are explained.
Chapter 4 provides an overview of the framework for high-level postural control
of animated agents. The discussion aims to guide readers into the
subsequent chapters in the thesis. In particular, the system architecture will be
presented to describe the utilisation of the P & E Engine in Narrative Intelligence
and Ambient Intelligence domains as outlined in Chapters 5, 6 and 7.
Chapter 5 discusses the design and implementation of the P & E Engine, which is
a hierarchical fuzzy logic system devised for an animated character. The
mappings from high-level psychological factors to the P & E Engine and from this
engine to the Animation and Graphics Engine are elaborated. An overview of
fuzzy logic is provided; a fuzzy system design and the personality and emotion
applications related to fuzzy knowledge are also discussed.
Chapter 6 describes the applicability of the P & E Engine for story agents in a
Narrative Intelligent system. The requirements of designing a better controlling
mechanism for story characters are analysed. A storyboard-structured method of
storytelling is devised with feature extraction of narratives and semantic
annotation of characters' body language. An integrated summarisation scheme to
form customised semantics for body language is discussed in detail.
Chapter 7 introduces a personalised agent scheme which is an extension of the
functionality of the P & E Engine for an Ambient Intelligent environment. The
scheme aims to demonstrate the benefits of using personality and emotion-based
modeling techniques to support the appearance and activity of an agent. An
integrated agent summarisation scheme can facilitate additional features and
semantics to enhance the agent performance and acceptability in future work.
Chapter 8 reports an evaluation of the P & E Engine and its applicability in both
the Narrative Intelligent and Personalised Ambient Intelligent Agent frameworks to
verify the effectiveness of the developed approaches.
Chapter 9 summarises the major achievements of this thesis and analyses the
limitations of these approaches, followed by some directions for future work.
CHAPTER 2 ANIMATED AGENTS
Various agent concepts have been proposed over the years. However, what do
we need agents for? How do we reduce the intrusiveness of agents and make users
feel connected? It is anticipated that in the future an agent will be able to recognise
people, adapt itself to them by learning from their behaviours and possibly
predict their needs. The repetitive tasks and learning processes of an agent
should remain behind the scenes to minimise interference. Therefore, in order to
design a useful agent, we have to comprehend what types of agents are attractive to
users; further, an agent needs to recognise what kind of person it is interacting
with.
This chapter reviews previous agent studies and analyses of peoples relationships
with animated agents. Section 2.1 discusses agent representations as well as
Artificial Intelligence theories of agent collaboration. Section 2.2 deals with the
variety of Narrative Intelligent theories and the agent design in Narrative
Intelligence. Section 2.3 covers both the theoretical and practical aspects of
Ambient Intelligence, the development phases of AmI systems and the issues
of agent design in Ambient Intelligence. Section 2.4 discusses the presentation
methods of an agent. Section 2.5 analyses the techniques of motion control by
articulating classical animation and character motion techniques. Section 2.6
concludes the chapter by summarising the motivation of this research.
2.1 Animated Agent Representation
Numerous workshops and conferences over the last decade have addressed the
topics of emotional agent, social agent, anthropomorphic agent and embodied
conversational agent. In the first part of this research, we devise the Personality
and Emotion Engine to facilitate a high-level control of agent movements. In
order to have an insight into the requirements of agent design, we examine the
related research in terms of agent functionality, capability and modality in the
social relation aspects.
2.1.1 Agent Functionality and Capability
Many agent concepts have already been tested as practical applications in diverse
areas to facilitate human tasks. Device agents inhabit PDAs, smart phones and
meeting equipment to increase work efficiency, such as mobile agents (Will et al.
2004) and web agents for content searches in novels, news, receipts and
information (Kuno & Sahai 2002; Billsus & Pazzani 1999). An agent can
function as a learning companion (Chan 1996) or a tele-home health carer (Lisetti
et al. 2003) that fulfills various roles within an electronically mediated learning
and training environment for young children or elders.
Aarts et al. (2003) suggested that agents shall embody personalised, adaptive and
anticipative capabilities with which they can be tailored towards user needs,
change in response to the user, and anticipate the users desires with conscious
mediation. Therefore, agents shall be designed with different characteristics
including special identification, gender, role, cast and cognition. In Figure 2.1, the
body capability of an animated agent is summarised in terms of body and mind.
The agent's mind shall embody memory, knowledge, a specific personality and
language capability. The agent's facial expressions shall embody the capability of
nonverbal communication of its mental state, perception and sensing.
The body of the agent shall display physical traits, embody physiological
needs, and perform locomotion and nonverbal communication.
2.1.2 Agent Modality
The research on animated humanoid agents embraces various fields: psychology,
social psychology, Artificial Intelligence, animation principles, entertainment
industry and Human Computer Interaction (HCI). The nature of agents can be
classified into three major modalities: social agent, affective agent, and story
agent, which are largely overlapping in many aspects.
Figure 2.1: Body and Mind of Character Capability
(1) Social Agents
Social agents were designed to display social cues or to recognise affective cues
and to engage peoples social cognition. Researchers in social cognition and
social computing demonstrated an interest in not only studying social responses in
human-computer interaction but in building artifacts that provoke these responses.
In Social Reactions to Communication Technology (SRCT) research, Reeves and
Nass (1996) demonstrated in a seminal work for social computing that people
responded in social ways to computers (Reeves & Nass 1996; Nass et al. 1995).
Their studies investigated politeness behaviour, proximity effects, and gender
effects. When human characteristics are assigned to computers, human-computer
and human-human interactions are found to be similar: people respond to
computer personalities in the same way they would respond to similar
human personalities (Nass et al. 1995). The Media Equation perspective
summed up that computers are social actors (Reeves & Nass 1996). This work
inspired the Microsoft Office assistants and Bickmore's relational agents
(Bickmore 2003). Other examples of work in this area include the conveyance of
personality and impression management through gaze behaviour (Garau et al.
2003), gesture (Zhao 2001) and the use of social protocols for meeting
management (Yan & Selker 2000).
[Figure 2.1 diagram: the Mind panel covers intelligence and cognitive modeling (knowledge, goals and desires, memory, learning, reasoning and decision making), emotions, mood and personality, attitudes, affiliation and esteem needs, and natural language processing with speech recognition and synthesis; the Body panel covers physiological needs (hunger, thirst, fatigue, safety), physical traits, figure animation, nonverbal communication (conscious and subconscious body language, posture, gestures, facial expression and gaze), social behaviour, the senses (touch, smell, hearing, taste, perception) and locomotion with path finding.]
(2) Affective Agents
Affective agents encompass Emotion Agents and Embodied Conversational
Agents (ECAs), which were designed to display and recognise affects in users or
to manipulate the users affective state. As artifacts will become ubiquitous, the
capability for them to establish social affiliation with humans will become
increasingly important (Bickmore & Cassell 2001). Several researchers
developed technologies for sensing user affects through a variety of physiological,
nonverbal and verbal channels, including facial expression, postures, galvanic
skin response, muscle contraction and speech (Cassell et al. 1999). Other
researchers also developed systems for displaying affective signals using a variety
of modalities, including speech, facial expression, motion dynamics and natural
language text (Hovy 1990). For instance, Tosa's Neuro Baby, the simulation of a
baby, is an automatic facial expression synthesiser that responds to the
expressions of feeling in the human voice through recognising emotions and
feelings (Tosa 1994) (Figure 2.2). Tosa used Neural Networks to model an
artificial baby that reacted in emotional ways to the sounds made by a user
looking into its crib.
(Image adapted from www.tosa.media.kyoto-u.ac.jp)
Figure 2.2: Neuro Baby
The lack of affective connection between agents and humans can result in poor
usage of agent technology. Some research improves this type of problem by
adding emotion expressions. Several conversational systems were developed that
attempted to convey emotion. Embodied Conversational Agent (ECA) and Social
Intelligent Agent communities offered a good overview of issues relating to
emotional agents in social affiliations. ECA research analysed human discourses
into task, communication and relationship categories for developing a better
interpersonal relationship between humans and agents (Cassell 2001). Examples
were the pedagogical agent (Lester et al. 2000), robot soccer commentators
(Andre et al. 1997) and a sports broadcaster (Bui 2004). ECAs used speech, gesture,
intonation and other nonverbal modalities to emulate human conversation with
their users. The MIT Media Lab's autonomous conversational kiosk, MACK,
was a virtual receptionist that could give directions and talk about the differences
among various research groups (Cassell et al. 2002). Bickmore's relational
agent, REA, which played the role of a real estate agent, emulated the
experience of face-to-face conversation (Bickmore 2003) (Figure 2.3).
(Image adapted from
http://www.media.mit.edu/gnl/projects/humanoid/siggraph99/images/interaction_bedroom1.jpg)
Figure 2.3: REA Project
Other researchers proposed methods that related personality and emotion to the
modeling of agents' behaviours. Allbeck and Badler (2002) presented work
toward representing agent behaviours modified by personality and emotion. Rosis
et al. (2003) integrated personality into an agent to modify the way it feels and
shows emotions. Ball and Breese (2000) described a sophisticated system for
recognising users' affects and personalities. Using a Bayesian belief network,
they generated affect and personality (dominance/submissiveness) using a variety of
behavioural cues including vocal cues (average pitch, pitch range, speech speed,
speech energy), verbal cues (active, positive, strong, terse or formal aspects of
lexical choice), facial expression, gesture (speed and size) and postural
information.
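The Bayesian-network approach attributed to Ball and Breese can be illustrated with a deliberately tiny example: inferring a binary dominance variable from two observed behavioural cues by Bayes' rule. The priors and conditional probabilities below are invented for illustration and do not come from their model.

```python
def posterior_dominant(loud_voice, large_gestures):
    """Posterior probability that a user is dominant, given two boolean
    cues observed from behaviour. All probabilities are toy values."""
    prior = {"dominant": 0.5, "submissive": 0.5}
    # Assumed P(cue is present | personality state).
    p_loud = {"dominant": 0.8, "submissive": 0.3}
    p_large = {"dominant": 0.7, "submissive": 0.2}
    post = {}
    for s in prior:
        # Naive-Bayes style: cues treated as conditionally independent.
        likelihood = p_loud[s] if loud_voice else 1 - p_loud[s]
        likelihood *= p_large[s] if large_gestures else 1 - p_large[s]
        post[s] = prior[s] * likelihood
    total = post["dominant"] + post["submissive"]
    return post["dominant"] / total
```

A full belief network would add more cue nodes (pitch range, speech speed, lexical choice, posture) and dependencies between them, but the update step is the same normalised prior-times-likelihood computation shown here.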
(3) Narrative Agents / Story Agents
Narrative agents involve a range of research fields including natural language
processing, narrative, cognitive science, and computer games research. Story
agents as synthetic actors that emerge mostly in character-centric models are
designed to support story structure, to improvise in an interactive multimedia
environment or to emulate human storytelling. These applications will be
elaborated upon in more detail in the following section.
2.2 Agent Design in Narrative Intelligence
In the second part of this research, we apply the Personality and Emotion Engine
into the realm of Narrative Intelligence for controlling the body languages of a
story agent. In order to understand the essentials of story agent design, we
examine the variety of Narrative Intelligent theories and the agent design in
Narrative Intelligence.
2.2.1 Narrative Intelligence

Story communicates facts, provides answers to questions, and makes its audience
feel various emotions. Story and narrative have long been of great interest to AI
researchers. Seminal research at Yale explored the knowledge
structures and processes that a human uses to understand the meaning of natural
language (Schank & Riesbeck 1981; Schank 1990). In a series of programs,
Schank and colleagues developed a theory of the knowledge structures used to
understand textual narratives. The story understanding system known as SAM
used scripts to capture the notion of stereotyped situations or contexts
(Cullingford 1981); another story understanding system, known as PAM,
incorporated a notion of the goals held by characters in a narrative and the means
they had to accomplish these goals (Wilensky 1981). Since then, narrative has
stimulated a general move towards an interdisciplinary engagement with the
humanities. For example, in human-computer interface design, the research focus
moved from hardware design to viewing an interface as a computer-human
dialogue (Mateas & Sengers 2003). A number of AI researchers believed that
studying the narratives of AI can lead to a better self-understanding for AI, and, in
turn, yield better AI research (Agre 1997; Sengers 1998). Blair and Meyer (1997)
then called the coalescence of AI research and narrative "Narrative Intelligence".
Narrative Intelligence encompasses diverse research, such as interactive drama,
virtual drama, interactive cinema, virtual theatre, immersive storytelling, and
emergent storytelling. Generally, such nonlinear storytelling research can be
divided into three major groups (Glassner 2004; Bailey 1999): authoring, story,
and character-based models. These NI researchers aim to address the problem of
generating interactive narratives and to explore different narrative design
approaches for the user experience.
(1) Authoring Model
An authoring model describes the process of creating a story from the perspective
of an author (Crawford 2001), such as story understanding systems which model
the processes by which a human understands a story. This is also called
interactive fiction, and involves a text-based system. Characters are very simple
or non-existent. In the interface of an authoring model, the user inputs commands
and is given text descriptions of the world, making connections between
the stories, background knowledge and, possibly, models of story event
importance. For example, Grasbon and Braun (2001) allowed for authorial
control at all levels while generating a large variety of plots. Using a
morphological approach to interactive storytelling, they made higher-level
guidance the primary concern of their design.
(2) Story Model
A story model structures the grammars of stories and tells stories by manipulating
the structure of plots and grammars. A good story design is based on what kind of
story the author is trying to tell. Plots appear as puzzle simulations. Solving one
puzzle allows the user to access the next puzzle. A story model includes story
database systems and interactive storytelling systems. Story database systems
allow users to access databases of stories. For example, Schank (1997) developed
a model of the interrelationship between stories and memory, to describe how
stories were understood and how they were recreated from the substance of stories.
He built a training system that contained a database of stories describing how
people handled commonly occurring problems. The stories were triggered by the
system when the trainee faced a similar situation. Rumelhart (1975) used a story-
centric approach by capturing the notion of story as a story grammar. Moreover,
Hall (2002) built a story model based on a story grammar that represents conflict
plots.
Interactive storytelling systems model the structural properties of stories to tell a
story, explore the nature of interactivity and the structural possibilities of
interactive narrative (Murray 1998). Creators of an interactive story model lay out
all the branching options in flowcharts. The interactivity of a computer system is
based on the dialogue established between the system and the users. The system
allows the audience to pick one of the predetermined structures in which the
characters are few and simple and may not directly interact with the audience
(Weyhrauch 1996, 1997). The system collects and evaluates information and then
directs the program to trigger appropriate events, to display an image or to play a
sequence of images or sounds. For example, Szilas (2002, 2005) presented a
narrative model for interactive drama to simulate the narrative on a deep level and
allow the user to interact with it. Sgouros and colleagues (1996) and Sgouros
(1999) developed a framework for plot control in interactive story systems. Braun
and Schneider (2001) and Braun (2003) presented the story-engine that focused
on storytelling in a collaborative augmented reality environment. Young (2000,
2001) and colleagues (2004) created interactive narrative structures by integrating
plan-based behaviour generation with interactive game environments.
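The branching structure these systems lay out in flowcharts can be sketched as a small graph of predetermined story nodes, where each user choice selects the next node. The node names and text below are hypothetical, purely to illustrate the structure:

```python
# Minimal sketch of a branching, author-predetermined story structure.
# All node names and story text are invented for illustration.

STORY = {
    "start": {"text": "A knock at the door.",
              "choices": {"open": "meet", "hide": "alone"}},
    "meet":  {"text": "A stranger hands you a letter.", "choices": {}},
    "alone": {"text": "The knocking fades away.", "choices": {}},
}

def advance(node, choice):
    """Follow one user choice through the predetermined structure;
    an invalid choice leaves the story where it is."""
    return STORY[node]["choices"].get(choice, node)

# One playthrough: the audience picks "open" at the first branch.
path = ["start"]
path.append(advance(path[-1], "open"))
```

The key property, and the key limitation, is that every reachable path already exists in the author's graph; the systems surveyed above differ mainly in how they select among such predetermined branches at run time.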
Some work has focused on systems that provide high-level plot guidance to
believable agents. For example, Loyall and Bates (1997) built the agent
architecture in the Oz Project, and Weyhrauch (1997) built a dramatic guidance
system that issued high-level commands to Oz believable agents. Galyean (1995)
and Galyean and Blumberg (1995) examined how the plot affected the cameras and
transitions to express how the storyline changed the presentation of the scene.
This research suggested the concepts of plot level and presentation level by
exploring the area of interactivity as it was influenced by a story. Exploring an
alternative approach, Elliott (1992) and Elliott and colleagues (1998) used a fixed
script and told different stories by narrating them with different emotional
emphases. Elliott's
Affective Reasoner, a cognitive appraisal model of emotion, generated the
emotional behaviour of the narrative agent. The work demonstrated that a
storytelling system could embody the interpretive capabilities of a human
observer by understanding motivations and emotions.
(3) Character-centred Model
A character-centred model devises the goals and plans of a character; this
approach is also called interactive cinema or interactive drama and overlaps the
scope of interactive storytelling systems in the story model. The difference is
that the plot generation is based on the behaviour of autonomous actors built into
a graphical world
(Crawford 2001; Cavazza 2002; Mateas & Stern 2000). The character-centred
model allows the audience to experience a story as an interactive participant
through standard input peripherals. These peripherals determine the position,
orientation, and physical gestures of a person via a mouse, joystick and
keyboard, or specific gloves and bodysuits with ultrasonic and light sensors. As
the character-centred model devises the aims and intentions of characters,
storylines result in characters' actions and interactions along with the unfolding
story. For example, Reilly and Bates (1992) and Reilly (1996) built believable
autonomous agents that exhibited rich personalities and expressed their emotional
behaviours in interactive dramas. Blumberg and Galyean (1995) and Galyean
(1995) explored the concept of a director giving commands to autonomous
characters at multiple levels of abstraction. They built a system that used
cinematic techniques focusing on tracking the user's progress through a fixed plot,
using user actions to trigger the next part of the story. Machado (2000) and
colleagues (2001) presented interactive story-creation activities by using real
characters in virtual stories. Crawford's Erasmatron system (2001) was based on
a sophisticated world model. It sought to balance character-based and plot-based
approaches by using verbs as the basic components of action. The author created
a set of verbs that the engine could work with.
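A verb-centred action model of this kind can be sketched as follows. This is a hedged illustration in the spirit of the Erasmatron approach, not Crawford's actual engine; the verb names and world-state keys are invented:

```python
# Hedged sketch of a verb-centred action model: each verb carries a
# precondition on the world state and an effect that changes it. The verbs
# and state keys below are invented for illustration.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Verb:
    name: str
    precondition: Callable[[Dict], bool]   # may this verb fire now?
    effect: Callable[[Dict], None]         # how it changes the world state

VERBS = [
    Verb("greet",  lambda s: not s["met"], lambda s: s.update(met=True)),
    Verb("insult", lambda s: s["met"],     lambda s: s.update(anger=s["anger"] + 1)),
]

def available(state):
    """Names of the verbs whose preconditions hold in the current state."""
    return [v.name for v in VERBS if v.precondition(state)]

def perform(name, state):
    """Apply the named verb's effect if its precondition holds."""
    for v in VERBS:
        if v.name == name and v.precondition(state):
            v.effect(state)
            return True
    return False

world = {"met": False, "anger": 0}
```

Because the set of applicable verbs is recomputed from the world state after every action, the plot emerges from character actions rather than from a fixed branch list, which is what lets such a system balance character-based and plot-based approaches.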
2.2.2 Narrative Agent Design

Several researchers have argued that AI systems will be more understandable with
narrative presentation extending to systems involving intelligent agents (Sengers
1999). The agents will be more comprehensible if their visible behaviour is
structured into narrative (Lester & Stone 1997). Therefore, most of the work in
interactive drama has been improved from an autonomous agent perspective. The
focus has been on building believable agents that can play roles in stories.
Narrative agent design can be categorised into three aspects: agents using
narrative structure, the appearance of narrative agents, and support for human
storytelling.
(1) Agent using narrative structure
Synthetic actors embody internal capabilities with which they can use narrative
structures to improvise as one of the creative roles in an interactive multimedia
environment. For example, Hayes-Roth and colleagues' Virtual Theatre project
(1995, 1996) allowed users to improvise in games played by the agents. The
project aimed to provide a multimedia environment in which users produced and
performed stories in an improvisational theatre company. Agents improvised
activities around a fixed script and collaborated on the creative process. Each
time the actors performed a given script or followed a given direction, they might
improvise differently. Thus, users enjoyed the combined pleasures of seeing their
own work performed and being surprised by the improvisational performances of
their actors. The roles included producer, playwright, casting director, set
designer, music director, real-time director, and actor.
In their master/servant scenarios, they studied how two autonomous agents
interacted with one another without human intervention (Hayes-Roth et al. 1995).
The master and the servant each had knowledge about the environment and their
status within it. These scenarios tested their behaviours under computer-
controlled stimuli and emotional variations. Figure 2.4 shows the master, Otto,
with his servant, Gregor. Additionally, Cavazza (2001, 2002) enabled users to
intervene with onscreen virtual characters in interactive storytelling. Grounded
in the underlying theory of ethology, Blumberg's virtual dog Silas (1997) was a
full-body interaction system that allowed the user to interact with the dog as it
performed tricks. In the World of Oz system, three characters used narrative
structure to perform their plays. A fourth character could be controlled by the
user to join the play. The story was then generated from the interaction of the
characters (Loyall & Bates 1991; Loyall 1997).
(Image adapted from http://ksl-web.stanford.edu/projects/cait/demos/status.html)
Figure 2.4: Virtual Theatre
(2) Appearances of Narrative Agent
A good narrative agent design mirrors the process of identification and empathy
involved in the viewing of the story. A good character is generally appealing but
may not need to be realistic. If it has contrasting shapes and is fun to look at, it
will hold an audience's interest. The attractiveness of a character is determined by
the dynamics of anticipations and outcomes. Narrative agents or animated
humanoid software agents are usually designed with three kinds of appearance:
photorealistic virtual humans, anthropomorphic agents and non-physical animated
agents. With the rapid development of computer graphics technology, researchers
are increasingly paying attention to making the interaction more adaptive, flexible,
and human-oriented. The photorealistic virtual human research supervised by
Magnenat-Thalmann and Thalmann (1998) pursues photorealistic appearance.
Badler and Webber (1993, 1997) also directed their attention to realism, such as
in the overall design of their animated agent Jack.
Anthropomorphic agents are designed to have a humanoid physical form which
can be a caricatured animal or object, such as the virtual dog created by Blumberg
and Galyean (1995). Blumberg (1997) built a virtual dog and also focused on
building architectures to support the construction of characters. The World of Oz
system was called the "Edge of Intention" and contained three ellipsoidal
creatures called "Woggles" (Bates et al. 1992; Loyall & Bates 1997). Mateas and
Stern's (2002) and Stern and Frank's (1998) Virtual Babyz and Virtual Petz
projects described agents that were designed to allow a narrative structure to
emerge from their behaviour as they acted over time. A behaviour language was
also devised to support these story-based believable agents (Figure 2.5).
(Image adapted from http://www.interactivestory.net/papers/PetzAndBabyz.html)
Figure 2.5: Virtual Babyz and Pets
It is worth noting that several studies have been undertaken to determine if the
presence of a face or body in the interface has a significant impact on user
attitudes or behaviour. Koda and Maes (1996) and Takeuchi and Naito (1995)
studied interfaces with static or animated faces and found that users rated them to
be more engaging and entertaining than functionally equivalent interfaces
without a face. Kiesler and Sproull (1997) found that users were more likely to be
cooperative with an interface agent when it had a human face (versus a dog or
cartoon dog).
However, many agent studies related to human and computer interaction were
done via text, mouse, and keyboard. These non-physical animated agents, such as
a kiosk agent and interface agent, varied greatly in their linguistic capabilities and
mouse/text/speech input modalities. Liu et al.'s interface agent (2003) analysed
affective tones from natural language text. Lester and colleagues' interface agent
Cosmo (1997) acted as a pedagogical agent that was perceived as helpful,
believable, and concerned.
(3) Support Human Storytelling
A key aspect of animated agents is that they are artifacts designed to support and
help people to finish their tasks. Since stories are an important part of human life,
several researchers, most notably in Cassell's Gesture and Narrative Language
Group at the MIT Media Lab, built systems that supported people in telling stories
to one another. Cassell and Bickmore's Small Talk (1999) and Cassell and
Vilhjalmsson's BodyChat (1998) developed rule-based generation of
synchronised speech, intonation, facial expressions, eye gaze and hand gestures
for multiple conversational agents (Figure 2.6). This research focused specifically
on hand gestures (e.g. hand shape, wrist control, and arm positions) that concurred
with spoken language (Cassell & Bickmore 1999).
Moreover, Cassell et al.'s SAM (2000) was a peer embodied conversational
storyteller who shared a real castle play space and a set of story-evoking toys with
children (Figure 2.7). Ryokai and Cassell's Storymat (1999) recorded and
played back stories that people had told. Others, like Umaschi (1997), utilised an
intelligent embodied soft toy to build a personal storytelling environment. Bers
and Cassell's SAGE Storytellers (1998) allowed children to use technologies to
explore language and to create their own interactive storytellers.
(Image adapted from http://web.media.mit.edu/~justine/research.html)
Figure 2.6: Body Chat Project
(Image adapted from http://www.media.mit.edu/gnl/projects/castlemate)
Figure 2.7: SAM Project
This research does not attempt to provide a model of generating story structure in
depth. Instead, we investigate a character-based interactive narrative approach,
focusing on providing a high-level control mechanism to direct the movements of
an affective story character. We aim to automate character motion, focusing on
body posture by controlling high-level psychology factors.
2.3 Agent Design in Ambient Intelligence

In the third part of this research, we apply the Personality and Emotion Engine in
the realm of Ambient Intelligence for controlling the motion and appearance of a
personalised agent. A personalised AmI agent is devised to match the user profile
in the form of a cartoon reciprocal agent. Therefore, we review the theoretical and
practical aspects of Ambient Intelligence and the issues for agent design in
Ambient Intelligence as follows.
2.3.1 Ambient Intelligence

Ambient Intelligence emerges from diverse areas such as ubiquitous (Weiser
1993), mobile (Satoh 2004), pervasive (Weiser 1991), wearable (Trivedi et al.
2000) and human-centred computing for a seamless communication environment.
Current AmI research encompasses interdisciplinary research areas such as the
technological, scientific and artistic fields to create embedded and distributed
support (Norman 1999; Aarts et al. 2002; Shadbolt 2003). Various disciplines can
be grouped under the umbrella of AmI, including distributed intelligence,
hardware design, information understanding, social learning and ethical
implications. These research areas, which aim to propose frameworks and sensing
strategies for dynamic environments and complex scenarios, are explained as
follows.
(1) Distributed Intelligence
Distributed intelligence is possible only if a seamless communication and sensor
infrastructure underlies it. Distributed solutions have been implemented to
maintain updateable models of the scenes and to generate patterns of intelligence
that migrate over the hardware layer (Marcenaro et al. 2003; Nijholt 2