Computational Support for the Evaluation of
Facial Expressions in Photographs
by
Rachel Klingberg
Submitted in partial fulfillment of the requirements for the
Master’s degree in Computer Science
at
The Seidenberg School of Computer Science and Information Systems
Pace University
May 2013
Abstract
Computational Support for the Evaluation of Facial Expressions in Photographs
by
Rachel Klingberg
Submitted in partial fulfillment of the requirements for
the Master’s degree in Computer Science
May 2013
Charles Darwin first proposed that certain facial expressions of spontaneous emotion are
genetically determined and culturally universal in his 1872 monograph The Expression of the
Emotions in Man and Animals. His theory of evolution applied not only to physical qualities, but
also to emotional expression, or behavioral attributes. The latter idea was not widely accepted
until the 1960s, when Paul Ekman’s research in neuroscience, psychology, and anthropology
validated Darwin’s theory of evolutionary behavior. Drawing on Ekman’s pioneering research
and his comprehensive taxonomy of muscular actions of the face, this thesis describes research
into the development of computer software aiding the evaluation of facial expressions in
photographs. The software program described here reports the presence and intensity of seven
universal emotions: anger, contempt, disgust, fear, happiness, sadness, and surprise. A Web form
prompts the user to match the photo’s appearance with a simple visual lexicon of the various
muscular actions of the brow, eyes, nose and mouth. This input is compared to a relational data
set of all possible actions of the facial muscles for all seven universal emotions. Using a
weighted system of scoring, output reveals the presence and intensity of each of the emotions
expressed by the photographic subject.
Acknowledgements
I express my sincere gratitude to Dr. Richard Nemes for his guidance with my thesis and with
my prior coursework for the MS in Computer Science. I also wish to thank Dr. Catherine Dwyer
and Dr. Howard Blum for serving on the thesis committee, as well as Pace University for
sponsoring my education with the Staff Scholarship program.
Table of Contents
Abstract ………………………………………………………………………………………... i
List of Figures…………………………………………………………………………………... iv
I. Introduction to Pathognomy …………………………………………………………...… 1
1.1 A history of the field …………………………………………………………………… 1
1.2 Ekman et al.: FACS and FACE ………………………………………………………... 7
1.3 Affective computing ……………………………………………………………………. 9
1.4 Resolution of the universality debate …………………………………………………... 11
II. General Approach to the Problem of Evaluation ……………………………………… 13
2.1 FACS, Artanatomy, and commercial software …………………………………………. 13
2.2 Expressions as sets …………………………………………………............................... 15
2.3 A lexicon of expressions of basic emotions ……………………………………………. 16
III. SETL, a Procedural Approach ………………………………………………................ 19
3.1 SETL analysis with set operations …………………………………………………........ 19
3.2 Testing input and scoring results ………………………………………………….......... 20
3.3 User considerations …………………………………………………............................... 21
IV. SQL, a Relational Approach ………………………………………………................. 22
4.1 The table …………………………………………………............................................... 22
4.2 The application …………………………………………………..................................... 24
4.3 Testing with naïve users …………………………………………………....................... 27
4.4 Scoring …………………………………………………................................................. 27
4.5 Evaluation of results …………………………………………………............................. 35
V. Conclusion ……………………………….….….….….….……...………………………... 37
References …………………………….….….….….….……...……………………………… 69
Appendix A: Lexicon for SETL Analyzer ……………………………….….….….….……… 38
Appendix B: SETL Program to Analyze Facial Expressions…………………………….....… 45
Appendix C: Form to Accept Input for Analysis of Facial Expressions………………....…… 54
Appendix D: SQL/ColdFusion Program to Analyze Input and Report on
Presence of Facial Expressions…………………………………………….....… 60
List of Figures
Fig. 1 Duchenne using electrical stimulation to provoke a smile ……………………….. . 2
Fig. 2 Paul Ekman demonstrating the Duchenne and non-Duchenne smiles …………… . 3
Fig. 3 "The Cartesian Theater" ……………...………………...………………...………. . 6
Fig. 4 The seven universal emotions ……...………………...……………...…………… . 8
Fig. 5 The Artanatomy depiction of anger ……...………………...………………...…… 16
Fig. 6 Web form to accept input for the analysis of facial expressions ……...………...... 24
Fig. 7 Expression of anger ……...………………...………………...…………………… 28
Fig. 8 Expression of anger/contempt ……...………………...………………...………… 28
Fig. 9 Expression of contempt ……...………………...………………...……………….. 30
Fig. 10 Expression of disgust ……...………………...………………...…………………. 30
Fig. 11 Expression of fear ……...………………...………………...…………………...... 32
Fig. 12 Expression of happiness ……...………………...………………...……………..... 33
Fig. 13 Expression of sadness ……...………………...………………...…………………. 33
Fig. 14 Expression of surprise ……...………………...………………...………………… 34
I. Introduction to Pathognomy
1.1 A history of the field
What causes us to feel emotions? Why do we feel certain emotions so strongly? And how do we
cope with the intensity of our feelings? Throughout history, philosophers and physicians have
debated these questions. Since the 19th century, the entire discipline of psychology has been
devoted to them. Because of its association with our capricious feelings, the very word
“emotion” implies the opposite of scientific rigor, yet facial expressions of emotion are as
universal to humanity as any other aspect of muscular physiology. The study of facial
expressions of emotion has been a well-established discipline since the 1960s, but it has no
formal name in modern vernacular. The 18th-century term is pathognomy:
Pathognomy (archaic): Expression of the passions; the science of the signs by which
human passions are indicated. 1793 HOLCROFT Lavater’s Physiog. ii. 24: Pathognomy is
the knowledge of the signs of the passions. [1]
Pathognomy falls under the umbrella of non-verbal communication, a field of biology,
psychology, and cognitive neuroscience that also includes gestures, tone of voice, posture,
movement, positioning, and body cues, as well as changes in heart rate, temperature, pupil
dilation, and other involuntary responses to intense feeling. The science of pathognomy is only
concerned with the anatomical facial expressions of emotion, not with the stimulus or response,
since the latter cannot be categorized. To understand how the expression of human emotion can
be approached with scientific rigor, as a biological function universal across cultures and nearly
identical for all of humanity, it is necessary to entirely disregard the stimulus and response to
emotion that is the essence of psychology, and focus instead on the mechanics of facial
expression, from gross motor action of the muscles to the subtlest trigger of the brain’s
amygdala. From this perspective, emotional expression is a genetic attribute, subject to the same
natural selection and inheritance as other biological attributes. To validate this concept of
emotion expressed universally among all human beings, I wrote two software applications to
demonstrate that facial expressions of seven basic emotions can be modeled as sets and
manipulated computationally with set operations – a programmatic interpretation of ideas first
espoused by nineteenth-century scientists.
A brief survey of the field of pathognomy is necessary to lay the foundation for approaching
emotional expressions algorithmically. Pathognomy is concerned only with the physical
expression of emotion, because the human experience is far too diverse for the study of
emotional stimulus or response to be a precise science, even at the most basic level of survival.
For example, there are divers who can hold their breath for three minutes, although for most
people, being unable to breathe for more than one minute incites distress. Some people find
activities such as mountain-climbing or swimming exhilarating, while others are terrified by
heights or unable to swim and find those same activities life-threatening. Books like Ben
Sherwood’s The Survivors Club examine why some people endure extreme situations, such as
plane crashes, while others in the same challenging environment perish. Even the simple
pleasure we take in eating when hungry, an emotion that ensures our daily survival, is not
elicited by the same stimuli across all cultures, across smaller groups of people within the same
culture, or even within the same individual at different times. A familiar example is the family
gathering attended by someone who has recently become a vegan, and is now disgusted by the
sight and smell of meat, although the very same person was once delighted at spareribs grilling
on the barbeque.
Pathognomy has for centuries been of interest to actors and artists seeking to accurately portray
the human experience, and the Age of Reason brought the field close to the science it is today.
A nineteenth-century French scientist, Duchenne de Boulogne, first advanced the notion that
facial expressions were a biological component rather than a voluntary form of expression,
nudging the field from the realm of artistic consideration into a subject to which rigorous
scientific procedure could be applied. Using electrical stimulation on a patient with very little
facial nerve sensation, Duchenne provoked muscular actions associated with recognizable
emotions (see figure 1). He photographed and catalogued these expressions, noting which facial
muscles were involved in each expression, and determining which were controlled voluntarily
and which were not. In 1862, he published his findings as Mécanisme de la physionomie
humaine, a ground-breaking contribution to the field of neurology. A sincere smile, which most
people can instinctively distinguish from a false one without any special training, is often
referred to as a “Duchenne smile.” A Duchenne smile involves contraction of both the muscle
that raises the corners of the mouth (zygomatic major) and the muscle that raises the cheeks to
form crow's feet around the eyes (orbicularis oculi). The latter action cannot be easily
controlled voluntarily. The “Duchenne smile” was the standard for a sincere smile for decades,
but more recent research has revealed that other attributes, such as symmetry of expression and
brevity of duration, are more reliable indicators of sincerity [2]. Nevertheless, when compared
side-by-side, a Duchenne smile is instantly recognizable as a happier visage than a
non-Duchenne smile (see figure 2).
Fig. 1 Duchenne using electrical stimulation to provoke a smile, from his 1862 Mécanisme de la physionomie humaine.
Fig. 2 Paul Ekman demonstrating the Duchenne (at right) and non-Duchenne smiles. Source: Ekman, Paul. Emotion in the Human Face. Cambridge: Cambridge University Press, 1983.
Duchenne was seeking a divine reason for human facial expressions, and proposed that “It
sufficed for Him to give all human beings the instinctive faculty of always expressing their
sentiments by contracting the same muscles. This rendered the language universal and
immutable.” [3] Though his interpretation of the universality of facial expression as the divine
birthright of humanity is no longer the realm of science, his basic premise is one with which
most contemporary neuroscientists would agree: the expression of emotion is universal. Charles
Darwin was inspired by Duchenne’s research, and in 1872 he published The Expression of the
Emotions in Man and Animals, in which he wrote, “The young and the old of widely different
races, both with man and animals, express the same state of mind by the same movements.”
Darwin provides examples, illustrations, and photographs of a wide array of humans and animals
– including babies, adults, actors, asylum patients, primates, dogs, cats, horses, and cows –
expressing emotion, or “heredetary [sic] animal movement.” He also describes the function of
emotional expression as very different from that of intentional communication via voluntary
expression. In his Notebook M, Darwin observes that horses fight with the equine aggression cue
of lowered ears, even when they have turned away to kick each other, “although it is then quite
useless,” because the other horse would not be able to see the lowered ears. Darwin perceived
emotional expression not merely as a means of intentional communication with others, but an
innate aspect of experiencing emotion, and therefore less intentional than inherent and
unconscious – an evolutionary rather than learned behavior.
The typical Victorian sensibility was more inclined to view emotion as an intellectual faculty, or
spiritually motivated, than as genetically pre-determined. Yet The Expression of Emotion was
well-received by the public. Richly illustrated and written in a style easily understood by
laypersons, it was regarded as part of the evolutionary theory described in The Descent of Man,
which was quickly gaining acceptance in scientific communities. Although natural selection for
physical traits was well-established as a scientific fact in Darwin’s own lifetime, his theory
regarding the genetic determination of emotional expression fell out of favor towards the end of
the 19th century, pushed aside by the birth of psychology as a discipline separate from biology.
Freud’s ideas about the libido and subconscious are a return to the Cartesian mind/body duality
of previous centuries, attributing emotional expression to thought, memory, and experience, and
not to genetics. And cultural relativism, a popular approach to anthropological research in the
20th century, supports the idea that emotional expression is entirely learned, and therefore differs
from culture to culture just as language or social customs differ. Even today, it is widely believed
that emotions are learned, and that, for example, babies learn to smile by seeing their parents
smiling at them - a conscious learning effort, much as they learn to speak and read.
Consequently, as with language, smiles ought to differ from culture to culture. Yet they do not,
as discussed below where the universal emotions debate is further explored. Darwin’s own
experiment with his infant son entailed prohibiting smiling or laughing around the boy to see if
he would learn to smile on his own. He did, at 45 days old, leading Darwin to conclude that
smiling is innate and not socially learned [4]. Although there is no way to know whether the
nursemaid truly never smiled at Darwin’s infant son, we now know that babies smile in their sleep
from the day they are born, and even blind infants smile, contradicting the popular notion that
babies learn to smile in response to the beaming faces of their parents.
If we were to define emotional facial expressions as an entirely cognitive response, the sequence
of events would proceed like this: something pleasurable occurs; we then access our learned
dictionary of appropriate facial responses; following that, we access the motor control for those
facial muscles; finally, we issue a smile. That may seem to be the natural sequence, but studies in
a wide array of scientific fields have demonstrated that facial expression occurs not in response
to, but simultaneous with or even prior to, the felt emotion [5]. It may be that the body
commands the mind and not vice-versa when it comes to expressing emotion. Or as the
philosopher and psychologist William James stated in an 1884 essay, “What is Emotion?”:
Common sense says, we lose our fortune, are sorry and weep; we meet a bear, are
frightened and run; we are insulted by a rival, are angry and strike. The hypothesis here to
be defended says that this order of sequence is incorrect, that the one mental state is not
immediately induced by the other, that the bodily manifestations must first be interposed
between, and that the more rational statement is that we feel sorry because we cry, angry
because we strike, afraid because we tremble, and not that we cry, strike, or tremble,
because we are sorry, angry, or fearful, as the case may be. Without the bodily states
following on the perception, the latter would be purely cognitive in form, pale,
colourless, destitute of emotional warmth. We might then see the bear, and judge it best
to run, receive the insult and deem it right to strike, but we could not actually feel afraid
or angry.
“Refuse to express a passion, and it dies,” wrote William James [6]. Darwin, too, suggested the
same in The Expression of Emotion: “The free expression by outward signs of an emotion
intensifies it. On the other hand, the repression, as far as this is possible, of all outward signs
softens our emotions... Even the simulation of an emotion tends to arouse it in our minds.” This
idea that physiological responses to stimuli cause emotion – in other words, that smiling causes
us to feel happy rather than happiness causes us to smile – was also independently espoused by
another 19th-century psychologist, Carl Lange, in an 1885 work, On Emotions: A
Psycho-Physiological Study. It became known as the James–Lange theory, and is still debated today.
Eric Finzi, a cosmetic surgeon who observed the effects of Botox treatments on his patients’
moods, described the facial feedback hypothesis in The Face of Emotion:
“A man may be absorbed in the deepest thought, and his brow will remain smooth until he
encounters some obstacle in his train of reasoning, or is interrupted by some disturbance,
and then a frown passes like a shadow over his brow.” - Darwin, Expression of
Emotion…. So one meaning of a frown is to express displeasure or difficulty. But if you
are sitting all by yourself reading, when that frown passes like a wave over your brow, it
is not at all clear to whom that frown is talking. Unless you admit that maybe that frown
is talking to its wearer.
Finzi’s treatment of depression with Botox provides strong evidence for the facial feedback
hypothesis. Realizing that the negative emotions of anger, fear, and sadness are expressed with
the corrugator muscle, which lies between the eyebrows and draws the brows together, Finzi was
one of several doctors to discover the connection between Botox injections, which suppress the
corrugator, and the improved mood of his patients. His patients were not seeking cosmetic
treatment; they tried Botox only because they had read about its use in the treatment of
depression. Some had no noticeable signs of aging, and others even regarded Botox treatments as
vain and silly, but all suffered from severe clinical depression and tried it out of desperation,
with largely successful results [7]. The emotion and its corresponding facial
expression are so closely connected that the former cannot be present without the latter.
Recent research by Paula Niedenthal also demonstrates the facial feedback hypothesis, in this
case, that people recognize smiles by mimicking them:
In one study, she and her colleagues are testing the idea that mimicry lets people
recognize authentic smiles. They showed pictures of smiling people to a group of
students. Some of the smiles were genuine and others were fake. The students could
readily tell the difference between them. Then Dr. Niedenthal and her colleagues asked
the students to place a pencil between their lips. This simple action engaged muscles
that could otherwise produce a smile. Unable to mimic the faces they saw, the students
had a much harder time telling which smiles were real and which were fake … they
were forced to rely on the circumstances of the smile, rather than the smile itself [8].
Niedenthal’s study illustrates the facial feedback hypothesis – as described by William James
and Carl Lange in the 19th century. Similarly, “Method” acting uses the facial feedback
hypothesis: actors portray emotional states of their characters by drawing from their own
emotional memories. The best actors do this so effectively that their audiences forget they are
acting. But, in fact, they are feeling the emotions they portray – this is part of Method acting.
Actors voluntarily access a memory to provoke an involuntary reaction – the expression of
emotion. Trial lawyers, poker players, and car salesmen all use facial expressions in their
professions, and while Method acting also serves their purposes, some have acquired great skill
in manipulating their facial expressions. Just as some, but not many, people can wiggle their ears
or raise one eyebrow, some people can isolate individual facial muscles to control the display of
emotion. This is neither common nor easy to learn – Paul Ekman spent seven years
learning to move each muscle of his face, often resorting to electrical stimulation when he could
not voluntarily move it. While Method acting is much easier, it is ultimately an involuntary
expression of emotion, just as thinking about a painful event in the past might provoke sadness
and even tears long afterwards, or a memory of fear will cause a fearful expression. Those
afflicted with post-traumatic stress disorder know that emotions can be provoked not just by
stimuli, but by memories of them, and ultimately, the response is still involuntary.
There is almost no scientific evidence to suggest that emotions are learned behavior or that they
vary across cultures, but there is still a widespread belief that our ability to understand and act on
emotions is dependent on our recalling past experience. It seems logical: our brains contain an
extensive archive of everything that has ever happened to us, and when a stimulus is
encountered, it is compared to past experience, identified, and acted upon. This is the Cartesian
model of the physical body as a vehicle controlled by the deductive mind. Descartes envisioned a
tiny homunculus as a controller, comparing current stimuli to an endless loop of past experience
and initiating the appropriate physical action. Modern science, however, has clearly shown this
to be false. Scientists have mapped two entirely separate regions of the brain responsible for
emotions, feelings, and decisions (the amygdala), and for memory, learning, and language (the
hippocampus). Yet the Cartesian mind/body model remains pervasive and widely accepted, even
among social scientists. Neuroscientist Richard Restak attributes a certain naiveté or superstition
to the idea of the brain as master controller and emotions as learned responses. “I am skeptical
about the possibility of our ever being completely conscious of our emotions … our belief that
we “consciously” determine our fear responses is only a reincarnation of the over-esteemed
homunculus bequeathed to us for perpetual care by Descartes.” [9] The philosopher Daniel Dennett
dismisses the “Cartesian theatre,” a derisive term he coined in Consciousness Explained:
Cartesian materialism is the view that there is a crucial finish line or boundary
somewhere in the brain, marking a place where the order of arrival equals the order of
"presentation" in experience because what happens there is what you are conscious of.
[...] Many theorists would insist that they have explicitly rejected such an obviously bad
idea. But [...] the persuasive imagery of the Cartesian Theater keeps coming back to
haunt us—laypeople and scientists alike—even after its ghostly dualism has been
denounced and exorcized.
In addition to the “finish line,” the Cartesian theater has another flaw: the mind of the
homunculus would have its own Cartesian theater, and so on, ad infinitum, which makes the idea
of the ‘controller’ an impossible one (see figure 3). Of course, we now know the amygdala has
no capacity for memory, nor the hippocampus for emotion – neither can be the controller of the
other. Even in the 19th
century, the writings of Duchenne, James, and Darwin contradicted the
Cartesian model, but just as behavioral evolution began to follow physical evolution into the
realm of accepted science, Freudian psychology pushed it aside for nearly a century.
Fig. 3 The Cartesian theater’s response to a frying egg is to sift through memories of past experience until it finds a match. Source: "The Cartesian Theater" by Jennifer Montes CC:BY
1.2 Ekman et al.: FACS and FACE
Overshadowed by the greater theory of “descent with modification,” and by the emergence of
psychology as a legitimate medical and scientific discipline, Darwin’s writings on the genetic
aspects of emotional expression would not receive their due until the 1960s, with the pioneering
work of Paul Ekman and his mentor Silvan Tomkins. Although Ekman initially rejected
Darwin’s theory of innate emotional expression – so much so that he did not even bother to read
The Expression of Emotion, dismissing the very idea altogether – his extensive anthropological
and psychological research eventually demonstrated that at least seven basic emotions are
universally expressed and understood across cultures, regardless of genetic background or
technological sophistication (see figure 4). They are: anger, disgust, contempt, fear,
happiness/enjoyment, sadness/distress, and surprise. This discovery and eventual widespread
acceptance by psychologists legitimized the new field of evolutionary psychology. Ekman’s
research drew on the foundation laid by his mentor, Silvan Tomkins, and involved cross-cultural
study of remote tribes in Papua New Guinea, who had never seen a Westerner until Dr. Ekman
arrived. His study refuted the anthropological approach – that facial expressions are culturally
determined – in favor of the biological universality described by Darwin. The foundation of his
proof lay in the comparability between the responses of American college students and the tribal
people of New Guinea when shown photographs of facial expressions, and when asked to
express certain feelings with their faces. Each group was able to identify the emotions expressed
in photos of the other, although they were unfamiliar with each other’s culture, and each was
able to convey emotion with a facial expression that was recognizable to the other [10]. The
scientific inquiry begun by Duchenne, James, and Darwin was finally advancing toward
scientific fact.
Explaining his life’s work, Ekman said, “I measure the movement of the facial muscles – you
cannot get a harder science – but I do it to study emotions. We cannot see an emotion; the facial
movement is just a display, but we can learn a lot if we measure that display precisely.”[11]
Ekman and his colleague Wallace V. Friesen undertook the construction of a taxonomy of facial
muscles and their actions, an exhaustive seven-year effort that entailed documenting the subtle
action of every muscle of the face and cataloging the intensity of each, creating a measuring
system for the full range of facial movement [12]. In 1978 they completed the Facial Action Coding System, or
FACS, which catalogues and encodes all possible facial muscle actions and their intensities, as
well as movements of the eyes and head. Each observable component of facial movement is
called an action unit or “AU,” which can be an individual muscle, part of one, or two or more
muscles that work in conjunction. All facial expressions can be decomposed into their
constituent AUs. The FACS lexicon is not itself an analysis of emotions, but simply a means of
objectively measuring and describing all possible expressions of the face.
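The decomposition described above lends itself directly to the set model used later in this thesis: an expression is simply the set of AUs present, and recognizing a known combination is a subset test. The following sketch is purely illustrative and is not the software described in later chapters; the AU numbers 6 (Cheek Raiser) and 12 (Lip Corner Puller) for the Duchenne smile follow the standard FACS descriptions discussed earlier.

```python
# A facial expression modeled as a set of FACS action units (AUs).
# AU 6 = Cheek Raiser (orbicularis oculi), AU 12 = Lip Corner Puller (zygomatic major).
DUCHENNE_SMILE = {6, 12}

def contains_expression(observed_aus, expression_aus):
    """True if every AU of the target expression appears in the observed face
    (a subset test: expression_aus is contained in observed_aus)."""
    return expression_aus <= observed_aus

# A face coded with AUs 6 and 12 qualifies as a Duchenne smile;
# AU 12 alone (mouth corners without the eye crinkle) does not.
print(contains_expression({6, 12}, DUCHENNE_SMILE))  # True
print(contains_expression({12}, DUCHENNE_SMILE))     # False
```

The same subset operation generalizes to any AU combination in the FACS lexicon, which is what makes a purely set-theoretic treatment of expressions possible.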
Fig. 4 The seven universal emotions: anger, fear, contempt, disgust, happiness, sadness, and surprise. Source: Ekman, Paul. Unmasking the Face. Cambridge: Malor Books, 2003.
FACS coding, now the standard means of measuring facial movement, has been the foundation
of several software applications. Learning FACS coding well enough to pass Ekman’s
certification exam requires months of study, drill, and practice, a process that his own Web site
describes as “tedious.” Worldwide, only a few thousand people have attained FACS certification.
At minimum, self-instruction to pass the certification exam requires about 100 hours, but
frequently more hours of study are needed, and many take months to complete the training.
Though FACS has become the standard for the scientific study of facial expressions, mastering it
is impractical for the average layperson.
Though FACS comprehensively describes all possible actions of the facial muscles and their
many combinations, most of them do not relate to emotion. There are many voluntary facial
expressions: conversational punctuators, used when describing emotions felt in the past; listening
cues, such as a look of interest or questioning glance; and nonsense expressions – sticking out
the tongue, crossing the eyes, and so on. A smaller subset of the full collection of all possible
facial expressions consists of involuntary expressions of spontaneous emotion, and these can also be
combined. In fact, blended emotions are more common than singular ones. Ekman and his team
ultimately developed a system for the identification of expressions of emotion called “FACE”
(Facial Expression, Awareness, Compassion, Emotions), but FACE still requires a complete
mastery of FACS (to accurately describe an expression from a purely biomechanical perspective)
before attempting to interpret the subset of facial actions that relate to emotion.
Ekman claims that with sufficient training, his system of human analysis using FACS, FACE,
and voice and speech patterns, is up to 90% accurate in identifying emotion, and more
significantly, in detecting deception. Although we are good at interpreting sincerely felt emotions
in others, most people, even those with FACS training, perceive deception only 52% of the time
– slightly better than pure chance. It is one thing to distinguish a false smile from a sincere one,
and most people can do so intuitively. The consistent ability to recognize lies based on facial
expressions is a rare skill. The so-called “Truth Wizard” study by Ekman and Maureen
O’Sullivan found only 50 people among a test pool of 20,000 who could detect deception with
greater than 80% accuracy. That represents only 0.25% of the test population [13]. (Deception
detection is one area where computer software may be more effective than humans.) For a
cooperative, social species like ours, distrusting everyone would make life virtually impossible. Perhaps
we have evolved to not detect deception, just as we have evolved to recognize sincerely
expressed emotion. As interesting as deception detection is, the focus of our research is limited
to unmasked emotions sincerely expressed, which are, in fact, a common unspoken language.
1.2 Affective computing
Can emotional intelligence be reduced to an algorithm? Ekman and his team say yes, the
expression of basic emotion can indeed be reduced to an algorithm, and a fairly simple one at
that. In FACS encoding, anger, for example, is expressed by a simple statement: AU 4+5+7+23.
Each number represents a single movement of a facial muscle or group of muscles: #4 is the
Brow Lowerer, and is an action of the depressor glabellae muscle, which is located between the
eyebrows. The other AUs that constitute anger are the Upper Lid Raiser (#5), Lid Tightener
(#7), and Lip Tightener (#23). Using the FACS system, emotional expression can be represented
as an algorithm, as a formal language, as a set, as a lookup table, or many other data structures.
Emotional expression can also be parsed and decoded algorithmically. That software programs
may become as accurate as highly trained human decoders is not impossible to fathom, but the
field is too new for this to be demonstrated with enough scientific rigor to be accepted fact.
Software applications such as the one designed by Paul Ekman and Dimitris Metaxas claim 80%
accuracy in detecting deception, and some developers claim even higher success rates, but
third-party testing is still lacking [15].
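As a concrete sketch of the lookup-table idea, the AU formula for anger given above (4+5+7+23) can be stored as a set and scored against an observed face by the fraction of prototype AUs present. This is an illustrative sketch, not the weighted scoring system of the program described in Chapter IV, and the surprise entry is a hypothetical placeholder added only so the comparison has two rows.

```python
# Each universal emotion maps to a prototype set of FACS action units.
# Only the anger entry (AU 4+5+7+23) comes from the text above;
# the surprise entry is a hypothetical placeholder for illustration.
EMOTION_AUS = {
    "anger":    {4, 5, 7, 23},  # Brow Lowerer, Upper Lid Raiser, Lid Tightener, Lip Tightener
    "surprise": {1, 2, 5, 26},  # hypothetical entry, not from this thesis
}

def score_emotions(observed_aus):
    """Weight each emotion by the fraction of its prototype AUs observed
    (set intersection divided by prototype size)."""
    return {emotion: len(observed_aus & prototype) / len(prototype)
            for emotion, prototype in EMOTION_AUS.items()}

# A face showing AUs 4, 5, and 23 scores 0.75 for anger and 0.25 for surprise.
print(score_emotions({4, 5, 23}))
```

Because every emotion is scored rather than matched exactly, a scheme like this can also report blended expressions, where partial prototypes of two emotions are present at once.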
Ekman’s research has given rise to a new subfield of pathognomy, called “affective computing,”
pioneered by Rosalind Picard. Affective computing is concerned with artificial intelligence and
with the notion that computers should simulate human emotion beyond the superficial level
achieved by video game designers and computer artists. It uses computers to detect human
emotion; it also seeks to make computers simulate human emotional expressions more
realistically, and to further emotional human-computer interaction (HCI). There are several computer
languages designed to express emotion and emotional actions. Virtual Human Markup
Language, or VHML, is an XML-based language used to describe emotion in HCI, triggering
appearance and action of a “virtual human.” Robotics is another field concerned with emotional
expression and HCI. Sophisticated androids, such as the Actroid, mimic human emotion to a
startling degree. But the artistic interest in facial expressions and computing is not very different
from that of Renaissance painters who wanted to convey emotion in their works – more of a
creative endeavor than a psychological or computational one. Affective computing involves the
ability of humans to understand and relate to simulated facial expressions of emotion, and the
ability of machines to interpret and interact with humans by parsing and analyzing human
emotional expressions.
Using the FACS system, Picard developed the Emotional-Social Intelligence Prosthesis (ESP) to
aid autistics in understanding their own facial expressions and interpreting the expressions of
others. This ability, which comes so naturally to most people, can be difficult for those with
autism and other related conditions:
People who have suffered brain damage may not be able to smile when asked to but will
still involuntarily smile at a joke. Conversely, patients suffering from Parkinson’s disease
… may be able to turn up the corners of the mouth when asked to smile but after getting a
joke may lack the ability to smile as a natural, automatic response. Clearly, the pathways
for smiling are quite elaborate, with both unconscious and conscious connections that
receive inputs from different parts of the brain. [16]
The ESP is a camera that films the wearer’s face and runs the data through a computer program
that analyzes the actions of the brow, eyes, and mouth, and displays a graph indicating the
wearer’s likely emotions. Among ordinary non-autistic wearers, the ESP has an accuracy rate
of 65%; when the wearers were trained actors skilled at manipulating their faces into
universally recognized expressions, its accuracy rate jumped to 85% [17]. The technology is
still in development, but it is already demonstrating life-changing possibilities for autistic
children, helping them to accurately express their feelings as well as providing warning cues to
parents or teachers before an autistic child reaches a “meltdown” point. Picard’s ESP device may
also help facial paralysis sufferers, whose inability to express their own emotions (and by
extension, interpret the expressions of others) inhibits their quality of life.
Although most of us are equipped with the ability to detect openly expressed emotions,
microexpressions are considerably more difficult to perceive because they are so fleeting, lasting
less than 1/15 of a second. Microexpressions reveal a great deal, although they are too quick for
untrained human eyes to detect. Manual analysis of microexpressions is labor-intensive and
painstakingly slow, owing to their fast onset and offset. "It takes about one
hour to score one minute of tape," explained Marian S. Bartlett, Salk postdoctoral researcher and
first author of a study measuring facial expressions using computer image analysis. And the
scorer must be proficient in FACS decoding, which takes many hundreds of hours of training.
"Our [computer] program, on the other hand, can do a minute of tape in about five minutes, and
once we optimize the program it will run in near real-time.” [14] Compared to human decoders,
programs hold a decisive advantage in speed, which makes the analysis of fleeting
microexpressions far more practical.
The intelligence community is particularly interested in any technology that can detect deception
and predict dangerous behavior. TSA behavior detection screeners, using a training program
called Screening Passengers by Observation Technique (SPOT), have pulled hundreds of people
from airport lines for questioning. A handful have been charged, generally because of
immigration matters, outstanding warrants, or forged documents. As an anti-terrorism measure,
SPOT hasn’t been especially successful, and has generated at least one lawsuit [18]. SPOT
screeners are given four days of training in perceiving microexpressions – a trifling amount
compared to the hundreds of hours of study that Paul Ekman recommends to fully master FACS
decoding. Nevertheless, the program employs 3,000 officers at 161 airports in the US, and will
probably continue to grow.
Computerized lie detection is a burgeoning field within affective computing. The
notion that liars can be exposed with a few mouse clicks is an appealing one, and this particular
type of application may be more commercially viable than the applications designed to help
autistics or facial paralysis sufferers described earlier. Ekman is currently designing a visual lie-
detector that will use video cameras and computers to capture and analyze data from human
expressions and gauge the truthfulness of the subject in real time. Researchers at Manchester
Metropolitan University have developed a lie detector called “Silent Talker” that they claim is
more than 80% accurate in detecting deception, exceeding any other existing system, including
polygraphs. Other commercial and scientific applications such as Noldus FaceReader, Third
Sight EmoVision, and Affectiva (Picard’s commercial application) are used to predict unsafe
driving behavior, to distinguish buyers from “browsers” in retail settings, to measure audience
enjoyment of film trailers, and to alert store managers of sales clerks who are not smiling at their
customers. Ekman also markets FACE as communication training for personal and professional
enhancement. There is a vibrant market for software that can “read” emotional expressions, and
especially software that can distinguish deception from sincerity, perhaps spurred by the fact that
so few humans have the ability to detect lies. These automated programs have yet to yield
truly impressive results, but they are demonstrably better than untrained humans, and much
better at detecting deception than even trained humans like the SPOT screeners.
1.3 Resolution of the universality debate
Freud’s legacy and the popularity of cultural relativism pushed aside Darwin’s theory of
evolutionary behavior for nearly a century, but Paul Ekman’s research has spurred many
additional studies since the 1960s. "The universality of facial expressions of emotion is no longer
debated in psychology," says nonverbal behavior expert David Matsumoto [19]. Matsumoto led
a study at San Francisco State University comparing the facial expressions of
sighted and blind judo athletes at the 2004 Summer Olympics and Paralympic Games. More than
4,800 photographs were captured and analyzed, including images of athletes from 23 countries.
The study found that sighted and blind individuals use the same facial expressions, producing
the same facial muscle movements in response to winning and losing. Matsumoto said:
"Individuals blind from birth could not have learned to control their emotions in this way through
visual learning so there must be another mechanism. It could be that our emotions, and the
systems to regulate them, are vestiges of our evolutionary ancestry.” [20]
The blind and sighted judo player study indicated that emotional expression is innate rather
than culturally acquired. It also offered further evidence of the universality of emotional
expression across the cultures of the 23 countries that participated in the Olympics and
Paralympics. Another study
compared the facial expressions of blind people with those of their sighted relatives:
When the researchers compared the results, they discovered that even though the blind
volunteers had never seen their relatives' faces before, their facial expressions were
extremely alike. Lead researcher Gili Peleg, from the Institute of Evolution at the
University of Haifa, said: "We have found that facial expressions are typical to families -
a kind of facial expression 'signature'." She said her results suggested that facial
expressions were inherited and therefore had an evolutionary basis [21].
Certainly it is the prevailing view of Ekman, Matsumoto, Peleg, and other evolutionary
psychologists that we do not learn to express these basic emotions in the same way we learn to
voluntarily convey our thoughts through language. Unlike language, the facial expressions of
these basic emotions are impossible to suppress when the emotion is intensely felt, and difficult
to mimic in the absence of emotion. Yet electrical stimulation of the left and/or right amygdala
can evoke not only the facial expressions, but also the feelings, of fear, anxiety, sadness, disgust,
and happiness in test subjects, despite the absence of all other stimuli [22]. It may prove
impossible to separate the expression of emotion from the emotion itself. Thanks to Ekman’s
research and to advances in neuroscience, medical imaging and other technologies, the
mechanics of emotional facial expressions can be approached with the same precise taxonomies
as other muscular-skeletal or biomechanical aspects of human physiology.
II. General Approach to the Problem of Evaluation
2.1 FACS, Artanatomy, and commercial software
There are commercial software products that analyze emotion in photographs and report not
only the emotions present but also signs of deception: insincere expressions of emotion,
concealed emotions, or the masking of a sincere emotion with a false one. FACS coding
allows for extreme precision of analysis and reporting to the user. Input is not required, other
than the uploading of a photo. Recognition algorithms compare the appearance of each AU in the
photo to the FACS lexicon until the entire face is mapped and the corresponding emotions
identified. Such software is highly sophisticated, but since the goal of my investigation was not
to accurately report every possible facial expression, but rather to report on the presence or
absence of just seven universal emotions, a much simpler approach was more practically suited
to the task.
Any application that aims to identify emotions in facial expressions must have a lexicon for
comparison with the input. The seven universally defined emotions are represented by a set of
biomechanical actions that do not vary across cultures. These actions form a lexicon of facial
expressions and the emotions to which they correspond.
I chose to develop my own lexicon of facial actions relating to the seven universal expressions
for use in a computer application to identify facial expressions in photographs. I checked and
cross-checked Ekman’s statements about the manifestation of emotion in his book Unmasking
the Face against the work of two other prominent researchers in the field: Victoria Contreras
Flores’ Artnatomy and David Givens’ Nonverbal Dictionary. The lexicon is primarily based on
the research of these three, although Ekman’s work is the foundation of it, as he is inarguably the
pioneer of the field.
Why not use FACS, if it is the industry standard, instead of developing my own lexicon? There
is no question but that FACS is the standard for the measurement of facial movement – it is used
in fields ranging from medicine, psychology, intelligence, and law enforcement to acting, the
fine arts, and various commercial enterprises. It is an exhaustive description of facial behavior: the Action
Units, their combinations, those that are mutually exclusive, and the intensity of each muscular
action on a five-point scale. FACS is an extremely comprehensive classification system that
categorizes all possible muscular actions of the human face, not merely those involved in the
seven universal emotions. According to David Matsumoto, “Of the literally thousands of
expressions that can possibly be produced, the facial configurations associated with discrete
emotional states represent a relatively small set of specific combinations of the available
repertoire.” [23] My intention was to simplify the analysis of the subset of emotional facial
expressions in photographs. The two programs described in this thesis demonstrate that the
seven basic emotions can be modeled and manipulated using set and relational data structures,
and that the presence and intensity of each emotion can be reported to the user without the
100+ hours of training needed to become a certified FACS decoder.
A considerable amount of anatomical study was required to learn the actions of the facial
muscles and the emotions expressed by them. I referred to artists’ resources to gain a better
understanding of facial anatomy, particularly Artnatomy (see figure 6), an online application
by Victoria Contreras Flores designed to help artists and animators understand emotional
expression. With that foundation of anatomical understanding, I
learned to interpret expression and correlate it to the universal emotions of anger, contempt, fear,
sadness, disgust, surprise, and happiness. For that purpose, I relied on Ekman’s comprehensive
book, Unmasking the Face. David Givens’ Nonverbal Dictionary, which contains A-Z
entries of a vast array of facial expressions as well as other nonverbal cues, was my third source
for compiling the lexicon.
To demonstrate my approach to creating the lexicon, consider the expression of contempt,
which I use as an example because it has relatively few facial cues compared to the other
six universal expressions of emotion. Contempt is described in Ekman’s Unmasking the Face as
an asymmetrical expression: “A slight pressing of the lips and raising of the corners on one
side,” and in its more intense appearance: “The upper lip raised on one side, exposing the teeth…
a milder form of contempt is a barely noticeable lifting of the upper lip on one side.” Ekman
cites the asymmetrical smirk or sneer as the hallmark of a contemptuous expression; this
particular action of the mouth is a much stronger indicator of the presence of contempt than any
other actions he describes as associated with contempt (a symmetrical dilation of the nostril(s),
asymmetrical dilation of the nostrils, and/or an upward gaze or appearance of looking “down” at
the object of contempt). Likewise, Flores’ Artnatomy illustrates an expression of “scorn,” a
synonym for “contempt.” Flores provides a graphic representation of the appearance of
scorn on the face: the nostrils are dilated asymmetrically and the upper lip is raised on one side
only. Artnatomy’s animation illustrates the onset and offset of facial actions, but it does not show
contempt in its milder form with only a slight asymmetry – it shows only the maximum intensity
of emotions, and so the canine tooth is visible on the left side of the mouth of the contempt
illustration, and the nostril is also dilated on that side. Since Artnatomy only depicts muscles of
the face, it also does not show the head tilt/upward gaze Ekman associates with contempt,
because movement of the head is effected by the neck rather than facial muscles. The limitations
of Flores’ graphic representation do not negate Ekman’s assessment of the upward gaze as an
indication of contempt. Likewise the milder forms of contempt expressions are not necessarily
invalidated by Artnatomy merely because it does not provide illustrations of them.
David Givens’ Nonverbal Dictionary describes the head tilt as a hallmark of a contemptuous
sneer, citing psychologist Carroll Izard’s 1971 book The Face of Emotion:
Head-tilt-back may be accompanied by "contempt-scorn" cues: one eyebrow lifts higher
than the other, the eye openings narrow, the mouth corners depress, the lower lip raises
and slightly protrudes, and one side of the upper lip may curl up in a sneer (Izard
1971:245).
The Nonverbal Dictionary further describes the contempt expression:
Sneer. In the sneer, buccinator muscles (innervated by lower buccal branches of the
facial nerve) contract to draw the lip corners sideward to produce a sneering "dimple" in
the cheeks (the sneer may also be accompanied by a scornful, upward eye-roll). From
videotape studies of nearly 700 married couples in sessions discussing their emotional
relationships with each other, University of Washington psychologist, John Gottman has
found the sneer expression (even fleeting episodes of the cue) to be a "potent signal" for
predicting the likelihood of future marital disintegration (Bates and Cleese 2001). In this
regard, the sneer may be decoded as an unconscious sign of contempt.
The three sources differ in format - Ekman’s is an illustrated book, Flores’ an online animated
tool, and Givens’ a dictionary with extensive citations of existing research and many cross-
references. The three authors differ in background: Ekman is a psychologist, Flores an artist,
and Givens an anthropologist. Overall, they agree on the appearance of contempt, with
some providing additional detail that others do not. My intention in using three unrelated
sources for my own lexicon was not to validate any one researcher’s work against another’s,
but to provide a broader and more comprehensive description of each emotional expression by
surveying the field.
2.2 Expressions as sets
The expressions of the seven basic emotions are simple enough that many data models could
represent them, but a particular emotional expression is most naturally viewed as the innermost
member of a chain of subsets:
All possible facial expressions ⊃ expressions of emotion ⊃ expressions of basic emotions ⊃ an expression of a particular basic emotion
I decided to treat the universal emotions and their expressions as sets, in the mathematical sense
of the term. The next step was cataloging all the facial actions related to each of the universal
expressions. Once I had acquired an adequate understanding of emotional expression and could
score well on the pictorial quizzes in Ekman’s book, I wrote my lexicon. I catalogued the
universal emotions using the following format:
Areas of the Face
Brow, eyes, nose, mouth
Facial Muscles (Ekman’s Action Units)
Latin names
Most are bilateral with left and right muscles
Facial Actions
Each muscle has between one and five actions.
Some actions are mutually exclusive (e.g., frowning and smiling).
Actions of Universal Expressions
anger | contempt | disgust | fear | happiness | sadness | surprise |
neutral or inconclusive (absence of expression)
Each of the universal expressions has a unique ‘layout’ of facial
actions
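The catalog format above maps naturally onto nested data structures. A minimal sketch in Python (my working notation, not code from the thesis programs), with deliberately abbreviated entries; the complete lexicon appears in the text below and in Appendix A:

```python
# A fragment of the catalog as nested dictionaries: emotion -> facial
# region -> list of actions. Entries are abbreviated for illustration.
LEXICON = {
    "anger": {
        "brow": ["lower brow", "draw brow inward", "wrinkle forehead vertically"],
        "eyes": ["flashbulb eyes", "narrow"],
        "nose": ["wrinkles above bridge", "flare nostrils"],
        "mouth": ["clench jaw", "compress lips", "frown"],
    },
}

def actions_for(emotion):
    """Flatten one emotion's catalog entry into a single set of actions."""
    return {a for acts in LEXICON[emotion].values() for a in acts}

print(sorted(actions_for("anger")))
```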
Fig. 5 The Artnatomy depiction of anger. Source: Artnatomy by Victoria Contreras Flores CC:BY
2.3 A lexicon of expressions of basic emotions
For my initial catalog, I divided the face into left and right regions. The muscles that perform the
action are referred to by their Latin or medical names. Below is my description of the actions
associated with anger, drawing not only from Artnatomy but also from Paul Ekman’s descriptions of
the facial actions of universal emotions in Unmasking the Face. In the expression of anger, both
the lower and upper eyelids tighten as the brows lower and draw together. The jaw thrusts
forward, the lips press together, and the lower lip may push up a little. Intense anger raises the
upper eyelids as well. Some of the actions are mutually exclusive, indicated by XOR, because
that particular anatomical region cannot perform both actions at once. For example, eyelids
cannot be simultaneously in the raised position and the lowered position. Parentheses further
define mutually exclusive actions. Certain actions can happen simultaneously, although they
don’t necessarily all have to be present for the emotion to be expressed. Those are indicated by ||.
Anger:
brow:
corrugatorLeft ≡ draw brow inward || lower brow || wrinkle forehead vertically
corrugatorRight ≡ draw brow inward || lower brow || wrinkle forehead vertically
eyes:
orbicularisOculiLeft ≡ “flashbulb eyes” XOR ((widen XOR narrow) XOR close) || crow’s feet ||
(downward gaze XOR upward or away gaze)
orbicularisOculiRight ≡ “flashbulb eyes” XOR ((widen XOR narrow) XOR close) || crow’s feet ||
(downward gaze XOR upward or away gaze)
nose:
procerus ≡ wrinkles above bridge of nose
noseExpanderLeft ≡ flare nostrils
noseExpanderRight ≡ flare nostrils
mouth:
quadratiLabiiSuperiorLeft ≡ raise upper lip and corner of left nostril (nasolabial fold)
quadratiLabiiSuperiorRight ≡ raise upper lip and corner of right nostril (nasolabial fold)
masseterLeft ≡ wide-open mouth (laughing, chewing, shouting, etc.) XOR clenched jaw
masseterRight ≡ wide-open mouth (laughing, chewing, shouting, etc.) XOR clenched jaw
quadratisLabiiInferiorLeft ≡ depress || extend lower lip
quadratisLabiiInferiorRight ≡ depress || extend lower lip
triangularisLeft ≡ draw the mouth downward (frown)
triangularisRight ≡ draw the mouth downward (frown)
mentalisLeft ≡ raise lower lip XOR wrinkle chin (pout)
mentalisRight ≡ raise lower lip XOR wrinkle chin (pout)
platysmaLeft ≡ draw the lower lip and corner of the mouth sideways and down, partially opening
the mouth
platysmaRight ≡ draw the lower lip and corner of the mouth sideways and down, partially
opening the mouth
orbicularisOrisLeft ≡ compress XOR (purse XOR part round-shaped XOR part rectangular-shaped)
orbicularisOrisRight ≡ compress XOR (purse XOR part round-shaped XOR part rectangular-shaped)
I then further consolidated this list using the naming convention
region_muscle_side_action
So that under “brow” in the list above, where I had described
corrugatorLeft ≡ draw brow inward || lower brow || wrinkle forehead vertically,
I renamed the entry as three separate actions, using the “b_” prefix to indicate the brow region
and “L” to indicate the left side of the face:
b_corrugator_L_drawInward
b_corrugator_L_lower
b_corrugator_L_wrinkleVertical
As with the brow, I used “e_” to indicate the eyes, “n_” the nose, and “m_” the mouth, and
“R” to indicate the right side of the face. So the set of facial actions
expressing anger is as follows:
b_corrugator_L_lower
b_corrugator_L_drawInward
b_corrugator_L_wrinkleVertical
b_corrugator_R_lower
b_corrugator_R_drawInward
b_corrugator_R_wrinkleVertical
e_orbicularisOculi_L_flashbulb
e_orbicularisOculi_R_flashbulb
e_orbicularisOculi_L_narrow
e_orbicularisOculi_R_narrow
n_procerus_wrinklesAboveBridge
n_noseExpander_L_flareNostril
n_noseExpander_R_flareNostril
m_quadratiLabiiSuperior_L_raiseUpperLipNasolabial
m_quadratiLabiiSuperior_R_raiseUpperLipNasolabial
m_masseter_L_openMouthWide
m_masseter_R_openMouthWide
m_masseter_L_clenchJaw
m_masseter_R_clenchJaw
m_quadratisLabiiInferior_L_depressLowerLip
m_quadratisLabiiInferior_R_depressLowerLip
m_quadratisLabiiInferior_L_extendLowerLip
m_quadratisLabiiInferior_R_extendLowerLip
m_triangularis_L_frown
m_triangularis_R_frown
m_mentalis_L_raiseLowerLip
m_mentalis_R_raiseLowerLip
m_mentalis_L_wrinkleChinPout
m_mentalis_R_wrinkleChinPout
m_platysma_L_drawLowerLipSideDown
m_platysma_R_drawLowerLipSideDown
m_orbicularisOris_L_compress
m_orbicularisOris_R_compress
m_orbicularisOris_L_partRectangle
m_orbicularisOris_R_partRectangle
Appendix A contains the complete set of facial actions and reliable indicators for each of the
seven universal emotions.
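The renaming described above is mechanical, so the canonical names can be generated rather than typed by hand. A small sketch, assuming Python; the helper names are my own:

```python
# Build canonical action names of the form region_muscle_side_action,
# e.g. b_corrugator_L_lower. Helper names are illustrative, not from
# the thesis programs.
def action_name(region, muscle, side, action):
    return f"{region}_{muscle}_{side}_{action}"

def bilateral(region, muscle, actions):
    """Expand a muscle's actions into left- and right-side names."""
    return {action_name(region, muscle, side, a)
            for side in ("L", "R") for a in actions}

names = bilateral("b", "corrugator", ["lower", "drawInward", "wrinkleVertical"])
print(sorted(names))
```

Generating names this way keeps the lexicon internally consistent, which matters once the same strings are compared against user input.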
III. SETL, a Procedural Approach
3.1 SETL analysis with set operations
Once I had a set of facial actions for each emotion, my next consideration was a programming
environment that would support the set data model and set theoretic operations. Many different
languages fulfill this role. Almost any language with data structures such as sets, tables, arrays,
or other types of collections would have sufficed. I chose to begin my investigation with a
language called SETL, based on the mathematical theory of sets. It seemed that a language
designed specifically around the notion of a set and its operations would be particularly well-
suited for our purposes here, although in retrospect, the language lacked other key features that
were necessary to create a user-friendly application. Nevertheless, I began with a description of
the problem: how my program might analyze facial expressions. Since FACS decoding of video
footage is so complex and time-consuming, I chose to use still photographs for the input,
prompting the user for only the simplest of information about the photograph. The process can be
summarized in the following steps:
1. Accept user input about the photo
2. Store input as a set
3. Compare input to lexicon sets
4. Output list of detected emotions and the corresponding tallied weight for each
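These four steps translate directly into set operations. The following sketch is in Python rather than SETL, with a toy two-emotion lexicon standing in for the full one (the happiness action name is illustrative, not from my lexicon):

```python
# Steps 1-4 as set operations. The two-emotion lexicon here is a toy
# stand-in for the full lexicon of seven universal emotions.
LEXICON = {
    "anger": {"b_corrugator_L_lower", "m_orbicularisOris_L_compress"},
    "happiness": {"m_zygomaticus_L_smile"},  # illustrative name
}

def analyze(raw_input):
    observed = set(raw_input)                    # step 2: store input as a set
    return {emotion: len(observed & actions)     # step 3: compare to lexicon
            for emotion, actions in LEXICON.items()}

# step 4: output the tally for each emotion
print(analyze(["b_corrugator_L_lower", "m_orbicularisOris_L_compress"]))
# {'anger': 2, 'happiness': 0}
```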
The lexicon sets were those I had created by reviewing Artanatomy and the works of Paul
Ekman. The set of facial actions to express anger, in SETL syntax, is represented as:
anger := {'b_corrugator_L_lower',
'b_corrugator_L_drawInward',
'b_corrugator_L_wrinkleVertical',
'b_corrugator_R_lower',
'b_corrugator_R_drawInward',
'b_corrugator_R_wrinkleVertical',
'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb',
'e_orbicularisOculi_L_narrow',
'e_orbicularisOculi_R_narrow',
'n_procerus_wrinklesAboveBridge',
'n_noseExpander_L_flareNostril',
'n_noseExpander_R_flareNostril',
'm_quadratiLabiiSuperior_L_raiseUpperLipNasolabial',
'm_quadratiLabiiSuperior_R_raiseUpperLipNasolabial',
'm_masseter_L_openMouthWide',
'm_masseter_R_openMouthWide',
'm_masseter_L_clenchJaw',
'm_masseter_R_clenchJaw',
'm_quadratisLabiiInferior_L_depressLowerLip',
'm_quadratisLabiiInferior_R_depressLowerLip',
'm_quadratisLabiiInferior_L_extendLowerLip',
'm_quadratisLabiiInferior_R_extendLowerLip',
'm_triangularis_L_frown',
'm_triangularis_R_frown',
'm_mentalis_L_raiseLowerLip',
'm_mentalis_R_raiseLowerLip',
'm_mentalis_L_wrinkleChinPout',
'm_mentalis_R_wrinkleChinPout',
'm_platysma_L_drawLowerLipSideDown',
'm_platysma_R_drawLowerLipSideDown',
'm_orbicularisOris_L_compress',
'm_orbicularisOris_R_compress',
'm_orbicularisOris_L_partRectangle',
'm_orbicularisOris_R_partRectangle' };
Further, I identified a subset of anger composed of the actions that most reliably indicate
anger, those hardest to suppress and most difficult to mimic:
angerReliable := { 'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb',
'm_orbicularisOris_L_compress',
'm_orbicularisOris_R_compress' };
I did the same for the other expressions of universal emotions, except disgust and surprise.
Disgust does not contain a reliable subset that strongly indicates the presence of that emotion,
whereas the reliable indicator for surprise is the fast onset and offset of the expression, which is
not detectable in still photographs.
3.2 Testing input and scoring results
SETL applications run from the command line, which makes entering input a more arduous
process for the user than using a GUI or browser-based Web form. With correct input, the SETL
program does analyze emotions indicated by any set of facial actions and reports on the presence
of the most reliable indicators of each emotion. Below is a sample of SETL dialogue with the
user:
'enter some csv values for the left side of the face:'
b_corrugator_lower,b_corrugator_drawInward,b_corrugator_wrinkleVertical,b_corrugator_lower,b_corrugator_drawInward,b_corrugator_wrinkleVertical,e_orbicularisOculi_flashbulb,e_orbicularisOculi_flashbulb,e_orbicularisOculi_narrow,e_orbicularisOculi_narrow,n_procerus_wrinklesAboveBridge,n_noseExpander_flareNostril,n_noseExpander_flareNostril,m_quadratiLabiiSuperior_raiseUpperLipNasolabial,m_quadratiLabiiSuperior_raiseUpperLipNasolabial,m_masseter_openMouthWide,m_masseter_openMouthWide,m_masseter_clenchJaw,m_masseter_clenchJaw,m_quadratisLabiiInferior_depressLowerLip,m_quadratisLabiiInferior_depressLowerLip,m_quadratisLabiiInferior_extendLowerLip,m_quadratisLabiiInferior_extendLowerLip,m_triangularis_frown,m_triangularis_frown,m_mentalis_raiseLowerLip,m_mentalis_raiseLowerLip,m_mentalis_wrinkleChinPout,m_mentalis_wrinkleChinPout,m_platysma_drawLowerLipSideDown,m_platysma_drawLowerLipSideDown,m_orbicularisOris_compress,m_orbicularisOris_compress,m_orbicularisOris_partRectangle,m_orbicularisOris_partRectangle
'is the right side symmetrical to the left - Y/N ?:' y
'Testing for anger'
'Input is contained in the anger set. Input contains 30 of the 36 elements of anger.'
'There is a reliable set for anger. Input contains 4 of the 4 reliable elements of anger.'
'Testing for contempt'
'Input is contained in the contempt set. Input contains 2 of the 10 elements of contempt.'
'There is a reliable set for contempt. Input contains 0 of the 2 reliable elements of contempt.'
'Testing for disgust'
'Input is contained in the disgust set. Input contains 8 of the 22 elements of disgust.'
'There is no reliable set for disgust'
'Testing for enjoyment'
'Input is contained in the enjoyment set. Input contains 6 of the 12 elements of enjoyment.'
'There is a reliable set for enjoyment. Input contains 0 of the 3 reliable elements of enjoyment.'
'Testing for fear'
'Input is contained in the fear set. Input contains 6 of the 15 elements of fear.'
'There is a reliable set for fear. Input contains 4 of the 6 reliable elements of fear.'
'Testing for sadness'
'Input is contained in the sadness set. Input contains 13 of the 25 elements of sadness.'
'There is a reliable set for sadness. Input contains 2 of the 4 reliable elements of sadness.'
'Testing for surprise'
'Input is contained in the surprise set. Input contains 4 of the 12 elements of surprise.'
'There is no reliable set for surprise'
With 30 of the 36 elements of anger, and 4 of the 4 reliable elements of anger, the SETL program
correctly matched the inputted actions with the emotion they express. (See Appendix B for
the complete program).
3.3 User considerations
While SETL, as its name implies, is useful for set operations, it is far from user-friendly. Given
the tediousness of entering lengthy Latin anatomical names onto the command line, it quickly
became apparent that a SETL application could never be easy for an uninformed user. With
further tweaking, the SETL application could generate more detailed reports, listing the emotions
present in order of intensity, as percentages of a whole, or possibly even extend the reliable
indicators and analyze the presence of deception. Yet there was no working around the fact that
SETL is not easy to use for the naive audience for which my application was intended.
IV. SQL, a Relational Approach
4.1 The table
The SETL output made it clear that users would benefit from an easy-to-use input form and
orderly reporting of the emotions present, given that blended emotions are the most common
expressions. I decided to further refine the lexicon I had created for the SETL program,
organizing the data into a set of relational tables. This allowed me to give each facial action a
short label rather than using the Latin names of the muscles, and in lieu of the reliable indicators,
I organized the actions into a system of weights. The weights were based on the SETL reliable
sets, with one significant difference: there was no longer a need to divide
the face into left and right regions to indicate the asymmetry of expressions of contempt and
disgust. The SQL table allowed for plain English descriptions in the action column. For the
expression of contempt, expressed in SETL as sets of anatomical actions:
contemptCanineLeft := {'m_caninus_raiseUpperLipCanine_L'};
contemptCanineRight := {'m_caninus_raiseUpperLipCanine_R'};
contemptLeft := {'m_caninus_raiseUpperLipCanine_L','n_noseExpander_flareNostril_L'};
contemptLeftME := {'n_noseExpander_flareNostril_R','m_caninus_raiseUpperLipCanine_R'};
contemptRight := {'n_noseExpander_flareNostril_R','m_caninus_raiseUpperLipCanine_R'};
contemptRightME := {'m_caninus_raiseUpperLipCanine_L','n_noseExpander_flareNostril_L'};
contempt := {'n_procerus_wrinklesAboveBridge_L','n_procerus_wrinklesAboveBridge_R',
'e_orbicularisOculi_downwardGaze_R','e_orbicularisOculi_downwardGaze_L',
'e_orbicularisOculi_upwardAwayGaze_L','e_orbicularisOculi_upwardAwayGaze_R',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_L',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_R',
{contemptLeft},{contemptRight}};
$ symmetric difference - exclusive or - for contempt's unilateral reliable indicator
contemptReliable := contemptCanineLeft mod contemptCanineRight;
I replaced the set for contempt with four rows in an SQL Server table, written in plain English
rather than an anatomical naming convention. The same reliable indicator was assigned the
heaviest weight, and is expressed as a single row since there is a single element in the
contemptReliable set described above:
emotion region action weight
contempt lips lips pressed together and outer corner visibly raised on one side or the other (not both), possibly exposing the canine tooth 0.37
contempt eyes upward or away gaze with head tilted 0.27
contempt nose nostril visibly raised on one side or the other (not both) 0.27
contempt nose dilated nostrils 0.09
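The check behind that top-weighted row is a symmetric difference: the canine-raise action must appear on exactly one side of the face, mirroring the SETL expression `contemptCanineLeft mod contemptCanineRight`. A hedged Python sketch (not the thesis's code):

```python
# Contempt's "reliable indicator": the canine-raise action must be present on
# exactly one side of the face -- a symmetric difference / exclusive-or,
# mirroring SETL's `contemptCanineLeft mod contemptCanineRight`.
def unilateral_canine_raise(observed):
    left = "m_caninus_raiseUpperLipCanine_L" in observed
    right = "m_caninus_raiseUpperLipCanine_R" in observed
    return left != right  # true on one side only, never both or neither

print(unilateral_canine_raise({"m_caninus_raiseUpperLipCanine_L"}))  # True
print(unilateral_canine_raise({"m_caninus_raiseUpperLipCanine_L",
                               "m_caninus_raiseUpperLipCanine_R"}))  # False
```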
More reliable indicators carry a heavier weight, allowing better scoring of the intensity and
reliability of each emotion. My intention was again to compare the user-inputted data about the
facial expression with the lexicon of emotional expressions I built, detect the emotions present,
and output descriptions of their presence and intensity.
The revised table for all seven emotions and their actions was much more consolidated than the
lengthy SETL sets containing long Latin names:
emotion region action weight label
anger brow lowered brow drawn inward 0.13 b6
anger brow vertical lines between brows 0.09 b1
anger eyes glaring or narrowed eyes 0.17 e10
anger eyes lower eyelid tensed 0.17 e9
anger eyes tensed upper lids 0.09 e5
anger eyes tensed upper lids with upper lid lowered and covering part of iris 0.09 e4
anger mouth mouth open in horizontal shape, as if shouting 0.13 m9
anger mouth narrowed lips pressed together 0.09 m6
anger nose dilated nostrils 0.03 n1
contempt eyes upward or away gaze with head tilted 0.27 e12
contempt mouth lips pressed together and outer corner visibly raised on one side or the other (not both), possibly exposing the canine tooth 0.37 m16
contempt nose nostril visibly raised on one side or the other (not both) 0.27 n4
contempt nose dilated nostrils 0.09 n1
disgust brow lowered brow 0.03 b2
disgust brow vertical lines between brows 0.03 b1
disgust eyes lower lid raised, but not tense, narrowing eyes, with lines beneath the eye 0.08 e3
disgust mouth lower lip lowered and pushed out, exposing the teeth and tongue 0.15 m11
disgust mouth upper lip raised very high and close to the nose 0.15 m10
disgust mouth mouth open and parted with lower lip raised 0.11 m7
disgust mouth cheeks raised with visible naso-labial fold from nostril to outer corner of mouth 0.08 m5
disgust mouth upper lip moderately raised 0.08 m5
disgust mouth lower lip lowered and pushed out, exposing the teeth 0.05 m2
disgust mouth lower lip raised and pushed up near upper lip 0.05 m1
disgust nose extreme wrinkles across sides and bridge of nose 0.11 n3
disgust nose moderate wrinkles across sides and bridge of nose 0.08 n2
fear brow brow raised and with inner corners drawn together 0.31 b8
fear brow horizontal wrinkles across center only of forehead 0.08 b5
fear eyes upper lid raised and lower lid tensed 0.23 e11
fear eyes whites visible above the iris, or above and below iris 0.08 e2
fear mouth lips stretched and tense with corners drawn back 0.15 m12
fear mouth rectangular open, tense mouth 0.15 m10
happiness eyes eyebrow and eye cover fold slightly lowered, narrowing the eye 0.27 e12
happiness eyes 'crow's feet' wrinkles at outer edges 0.10 e7
happiness eyes wrinkles or 'bags' beneath lower lids 0.10 e6
happiness eyes lower lids raised but not tense 0.05 e1
happiness mouth cheeks raised with visible naso-labial fold from nostril to outer corner of mouth 0.22 m5
happiness mouth cheeks raised with deep naso-labial fold from nostril to outer corner of mouth 0.16 m13
happiness mouth corners of lips drawn up and back (smile) 0.05 m3
sadness brow brow lowered with inner corners drawn together and up 0.20 b7
sadness brow vertical lines between brows 0.20 b1
sadness eyes upper lid, inward corner raised, giving a triangular shape to eye 0.27 e12
sadness eyes downward gaze 0.13 e8
sadness mouth corners drawn down 0.13 m8
sadness nose dilated nostrils 0.07 n1
surprise brow brows raised high and arched 0.07 b4
surprise brow horizontal wrinkles across entire forehead 0.07 b3
surprise eyes upper lid raised and lower lid relaxed 0.26 e11
surprise eyes whites visible above the iris, or above and below iris 0.07 e2
surprise mouth gaping, rounded, relaxed, dropped jaw 0.26 m15
surprise mouth moderate or wide open, rounded, relaxed mouth 0.20 m14
surprise mouth slightly open, rounded relaxed mouth 0.07 m4
4.2 The application
My approach was similar to the SETL program approach:
1. Accept user input
2. Compare input to lexicon tables
3. Output list of detected emotions and the corresponding tallied weight for each
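Under the hood, each step reduces to a weighted sum over the lexicon rows. Below is a minimal Python sketch of that scoring, not the thesis's ColdFusion/SQL implementation; the weights and labels are a small subset taken from the table above.

```python
# Hedged sketch of the weighted scoring: sum, per emotion, the weights of the
# facial actions the user selected. Rows are a subset of the thesis lexicon.
LEXICON = [
    # (emotion, label, weight)
    ("anger",    "b6",  0.13),  # lowered brow drawn inward
    ("anger",    "e10", 0.17),  # glaring or narrowed eyes
    ("anger",    "m6",  0.09),  # narrowed lips pressed together
    ("contempt", "m16", 0.37),  # unilateral raised lip corner
    ("contempt", "e12", 0.27),  # upward or away gaze with head tilted
    ("contempt", "n4",  0.27),  # unilateral raised nostril
    ("contempt", "n1",  0.09),  # dilated nostrils
]

def score(selected_labels):
    """Sum the weights of the selected facial actions for each emotion."""
    totals = {}
    for emotion, label, weight in LEXICON:
        if label in selected_labels:
            totals[emotion] = round(totals.get(emotion, 0.0) + weight, 2)
    return totals

# A user who picked the unilateral lip raise and the away gaze:
print(score({"m16", "e12"}))  # prints: {'contempt': 0.64}
```

In the full lexicon a label such as e12 appears under more than one emotion, so a single selection can contribute weight to several scores at once.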
Using SQL and ColdFusion, a server-side scripting language for the Web, I created a simple
Web form to prompt the user to select facial actions based on photographs – far simpler than
entering Latin names on the command line. An image of a neutral expression is provided for
comparison. The form does not inform the user which emotion is expressed by each facial
action, but merely prompts the user to choose the facial actions that best match the photo.
Figure 6 is a portion of the section of the form for the brow, including a set of actions that are
mutually exclusive. (See Appendix C for the complete form):
1) EYEBROWS
Look at the eyebrows and choose the action(s) that best represent the action depicted in the
photo.
Choose as many as apply (may also choose none):
expression neutral
vertical lines between brows
horizontal wrinkles across
entire forehead
horizontal wrinkles across
center only of forehead
Look at the eyebrows and choose the action that best represents the action depicted in the photo.
Choose only one (may also choose none):
expression neutral
lowered brow
lowered brow drawn inward
brow lowered with inner
corners drawn together and up
brows raised high and
arched
brow raised and with inner
corners drawn together
none of the above
Fig. 6 Web form to accept input for the analysis of facial expressions. Source: Ekman, Paul. Emotion in
the Human Face. Cambridge: Cambridge University Press, 1983 (photos); Artanatomy by Victoria
Contreras Flores CC:BY (illustrations)
Processing is expressed as a simple SQL query which uses the data passed by the ColdFusion
form:
<cfquery name="getAngerWeight">
SELECT sum(weight) "angerSum" -- total weight of the matched anger actions
FROM emotion -- the emotion table contains all the facial actions
WHERE [emotion] = 'anger'
AND label IN -- the form fields that correspond to the actions of anger
('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!--- and so on for the other six emotions --->
<CFSET angerWeight = getAngerWeight.angerSum>
Weighted score for each emotion:<BR><BR>
<cfoutput query="getAngerWeight">
<cfif angerWeight gt 0>
Anger: #getAngerWeight.angerSum# <BR>
<cfelse>
Anger: 0.00<BR>
</cfif>
</cfoutput>
<!--- and so on for the other six emotions --->
Typical output of the form appears as:
Weighted score for each emotion:
Anger: 0.65
Disgust: 0.03
Fear: 0.15
Happiness: 0.00
Sadness: 0.47
Surprise: 0.00
Note that the fractions appearing above are interpreted as the intensity of each emotion, not
as percentages of the whole expression. In other words, the report above indicates that anger is at
65% intensity (an expression of pure rage might be expected to report 100% anger). This does not
preclude a 47% intensity of sadness, so the expression might be described in English as
largely angry, but with a fair amount of sadness and traces of fear and disgust.
(See Appendix D for the complete program.)
4.3 Testing with naïve users
The output shown above, reporting 0.65 anger (65% intensity), is the successful analysis of a
photograph that illustrates anger, but I submitted the form myself, after months of studying facial expressions. A
better test of the program’s accuracy would come from uninformed users – laypersons with no
special knowledge of emotional facial expressions. Recruiting testers turned out to be a bit of a
challenge – getting people to fill out forms is never easy. I didn’t want people randomly filling it
out in haste just to be done with it, but instead to study the photograph and match its features to
the sample images provided. Since the program is intended to be used by someone who wants to
discover the emotion behind a facial expression, the entire process will not be as successful with
users who are lackadaisical. In the end, a total of eighteen laypersons tested the application. The
first eight were assigned images: four tested a photo of an angry expression, and another four
tested an anger/contempt blend. The remaining ten testers self-selected an expression from a set
displayed in a Web form before answering the questions about the appearance of the facial
features. To prevent too many submissions for one expression and two few for another, the form
dynamically removed an image from display if the database contained more than four tests,
forcing subsequent users to select an expression with fewer tests and ensuring each expression
was tested at least once.
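That balancing rule can be sketched as a simple filter; this is a hedged Python illustration (the thesis form is ColdFusion/SQL), where `counts` stands in for the per-expression test tally stored in the database.

```python
# Hide any expression photo that already has more than four submitted tests,
# steering later testers toward less-tested expressions.
MAX_TESTS = 4

def selectable(expressions, counts):
    """Return the expressions still shown on the form."""
    return [e for e in expressions if counts.get(e, 0) <= MAX_TESTS]

print(selectable(["anger", "fear", "surprise"], {"anger": 5, "fear": 2}))
# prints: ['fear', 'surprise']
```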
4.4 Scoring
Below are the results of the 18 testers. The photographs used for testing are from Paul Ekman’s
Unmasking the Face. As described in the subsequent evaluation, if the emotion depicted in the
photo was reported as the predominant emotion after the testers submitted the form, then the
result was considered successful.
Testing using a photograph illustrating anger
Fig. 7 Expression of anger
Tester # 1
Anger: 0.35
Contempt: 0.00
Disgust: 0.14
Fear: 0.00
Happiness: 0.00
Sadness: 0.20
Surprise: 0.07
Testing using a photograph illustrating anger/contempt (blended emotion)
Fig. 8 Expression of anger/contempt
Tester #2
Anger: 0.22
Contempt: 0.37
Disgust: 0.00
Fear: 0.00
Happiness: 0.00
Sadness: 0.00
Surprise: 0.00
Tester #3
Anger: 0.35
Contempt: 0.09
Disgust: 0.16
Fear: 0.15
Happiness: 0.00
Sadness: 0.27
Surprise: 0.00
Tester #4
Anger: 0.52
Contempt: 0.09
Disgust: 0.11
Fear: 0.15
Happiness: 0.00
Sadness: 0.27
Surprise: 0.00
Tester #5
Anger: 0.30
Contempt: 0.09
Disgust: 0.06
Fear: 0.00
Happiness: 0.00
Sadness: 0.27
Surprise: 0.00
Testing using a photograph illustrating contempt
Fig. 9 Expression of contempt
Tester # 6
Anger: 0.13
Contempt: 0.46
Disgust: 0.11
Fear: 0.00
Happiness: 0.00
Sadness: 0.07
Surprise: 0.00
Testing using a photograph illustrating disgust
Fig. 10 Expression of disgust
Tester #7
Anger: 0.43
Contempt: 0.09
Disgust: 0.47
Fear: 0.23
Happiness: 0.38
Sadness: 0.00
Surprise: 0.33
Tester #8
Anger: 0.00
Contempt: 0.00
Disgust: 0.21
Fear: 0.00
Happiness: 0.00
Sadness: 0.13
Surprise: 0.00
Tester #9
Anger: 0.39
Contempt: 0.00
Disgust: 0.24
Fear: 0.00
Happiness: 0.33
Sadness: 0.13
Surprise: 0.00
Tester #10
Anger: 0.21
Contempt: 0.09
Disgust: 0.24
Fear: 0.00
Happiness: 0.39
Sadness: 0.67
Surprise: 0.00
Testing using a photograph illustrating fear
Fig. 11 Expression of fear
Tester # 11
Anger: 0.00
Contempt: 0.00
Disgust: 0.11
Fear: 0.46
Happiness: 0.00
Sadness: 0.20
Surprise: 0.14
Tester # 12
Anger: 0.00
Contempt: 0.00
Disgust: 0.08
Fear: 0.62
Happiness: 0.00
Sadness: 0.00
Surprise: 0.07
Testing using a photograph illustrating happiness
Fig. 12 Expression of happiness
Tester #13
Anger: 0.04
Contempt: 0.36
Disgust: 0.00
Fear: 0.00
Happiness: 0.50
Sadness: 0.07
Surprise: 0.00
Testing using a photograph illustrating sadness
Fig. 13 Expression of sadness
Tester # 14
Anger: 0.09
Contempt: 0.37
Disgust: 0.19
Fear: 0.00
Happiness: 0.00
Sadness: 0.67
Surprise: 0.00
Tester # 15
Anger: 0.09
Contempt: 0.00
Disgust: 0.03
Fear: 0.08
Happiness: 0.00
Sadness: 0.67
Surprise: 0.00
Tester # 16
Anger: 0.13
Contempt: 0.00
Disgust: 0.00
Fear: 0.23
Happiness: 0.06
Sadness: 0.00
Surprise: 0.00
Testing using a photograph illustrating surprise
Fig. 14 Expression of surprise
Tester # 17
Anger: 0.00
Contempt: 0.00
Disgust: 0.15
Fear: 0.00
Happiness: 0.00
Sadness: 0.00
Surprise: 0.40
Tester # 18
Anger: 0.00
Contempt: 0.00
Disgust: 0.00
Fear: 0.08
Happiness: 0.00
Sadness: 0.00
Surprise: 0.54
4.5 Evaluation of results
The aggregated results should be interpreted in light of the relatively small number of testers
across all seven emotions, and the even smaller number for each individual emotion. A score of
100% accuracy is less significant for a single tester than for a larger pool, but these percentages
are included to illustrate the differences between test submissions for each emotion.
single emotion    total testers    correct tests    percentage correct
anger                  1                1               100%
contempt               1                1               100%
disgust                4                2                50%
fear                   2                2               100%
happiness              1                1               100%
sadness                3                2                66.60%
surprise               2                2               100%
TOTALS                14               11                79%

blended emotion    total testers    correct tests    percentage correct    partly correct (anger predominant)    percentage partly correct
anger/contempt          4                1                25%                          3                                 75%
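As a cross-check, the totals row for the single-emotion tests can be recomputed from the per-emotion tallies (a minimal sketch; the thesis tallied these figures by hand):

```python
# Recompute the single-emotion summary figures from the per-emotion tallies.
results = {  # emotion: (testers, correct tests)
    "anger": (1, 1), "contempt": (1, 1), "disgust": (4, 2), "fear": (2, 2),
    "happiness": (1, 1), "sadness": (3, 2), "surprise": (2, 2),
}
testers = sum(t for t, _ in results.values())   # 14
correct = sum(c for _, c in results.values())   # 11
print(testers, correct, round(100 * correct / testers))  # prints: 14 11 79
```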
Accuracy was best for single emotions: 11 of the 14 tests for single emotions correctly reported
the emotion depicted in the photograph as the dominant one, making the program about 79%
accurate if the four anger/contempt blend tests are excluded. Only one of the four people who
tested the anger/contempt blend correctly identified both emotions present; the other three
identified anger as the dominant emotion but failed to identify contempt as the next most
prominent, so they may be viewed as partly correct. In contrast, the one test for contempt was
accurate, as was the test for anger, so evidently
facial actions are harder to identify when more than one emotion is being expressed. It may also
be that certain expressions are easier to identify than others; for example, a grin of happiness is
much more obvious than the subtle asymmetrical smirk of contempt.
Aside from the expressions they tested, and whether they were assigned or self-selected, there
was another difference between the first eight, who tested anger and disgust/contempt, and the
last ten, who tested anger, contempt, fear, happiness, sadness, and surprise. The first eight were
slight acquaintances, strangers, and friends-of-friends who responded to an e-mail request sent
by me and my thesis advisor. Response to our requests was less than stellar, but eight people
obligingly submitted the form. The last ten testers, who were friends, responded to a post on my
Facebook page asking for help with my project. Their personal investment in my success was
significantly greater than that of the acquaintances, and for that reason, their test submissions
may have been more thoughtful and careful. A large pool of
testers unknown to me personally, each of whom is randomly assigned an expression to test, and
each of whom has a desire to know the emotion expressed in the photo, with the exact same
number of test submissions for each emotion, would have provided a much better gauge of the
accuracy of the software. Since nearly everyone dislikes filling out forms, it may be that a paid
study is the only effective way to test such a program. Using the FACS categorization, which is
the industry standard, would make the software comparable to commercial products currently
available. Nevertheless, the initial results are encouraging and with further refinement, the
accuracy of the software might be expected to exceed 80%.
V. Conclusion
That facial expressions of emotion are unique to individuals because they are socially acquired
through learning is a commonly-held notion, although it is not validated by scientific inquiry.
Whether a computer program is capable of exhibiting the same or greater “emotional
intelligence” than a human is no longer debated among researchers in the fields of affective
computing and psychology. Expressions of many basic emotions are indeed universal, and there
is a growing market for commercial software to interpret facial expressions. The most successful
of these programs are highly accurate, reporting on not only sincere expression of emotion
(which may prove helpful to those afflicted with autism, Parkinson’s disease, and facial
paralysis) but also deception, which is of great interest to the law enforcement and intelligence
communities. The computer program described here, though simple, demonstrates that at least
seven emotions are universally expressed, and further validates Darwin’s theory of genetically
determined emotional expressions as a universal evolutionary aspect of humanity.
APPENDIX A: LEXICON FOR SETL ANALYZER
FACIAL ACTIONS OF UNIVERSAL EXPRESSIONS
universalEmotions := { 'anger', 'contempt', 'disgust', 'enjoyment', 'fear', 'sadness', 'surprise',
'neutral', 'inconclusive' };
All Actions:
facialActions := {
'b_frontalis_L_lift',
'b_frontalis_L_wrinkleHorizontal',
'b_frontalis_R_lift',
'b_frontalis_R_wrinkleHorizontal',
'b_corrugator_L_lower',
'b_corrugator_L_drawInward',
'b_corrugator_L_wrinkleVertical',
'b_corrugator_R_lower',
'b_corrugator_R_drawInward',
'b_corrugator_R_wrinkleVertical',
'n_procerus_wrinklesAboveBridge',
'n_nasalisTransversa_L_raiseNostril',
'n_nasalisTransversa_R_raiseNostril',
'n_nasalisAlaris_L_expandNostril',
'n_nasalisAlaris_L_narrowNostril',
'n_nasalisAlaris_R_expandNostril',
'n_nasalisAlaris_R_narrowNostril',
'n_noseExpander_L_flareNostril',
'n_noseExpander_R_flareNostril',
'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb',
'e_orbicularisOculi_L_widen',
'e_orbicularisOculi_R_widen',
'e_orbicularisOculi_L_narrow',
'e_orbicularisOculi_R_narrow',
'e_orbicularisOculi_L_close',
'e_orbicularisOculi_R_close',
'e_orbicularisOculi_L_crowsFeet',
'e_orbicularisOculi_R_crowsFeet',
'e_orbicularisOculi_L_downwardGaze',
'e_orbicularisOculi_R_downwardGaze',
'e_orbicularisOculi_L_upwardAwayGaze',
'e_orbicularisOculi_R_upwardAwayGaze',
'm_orbicularisOris_L_compress',
'm_orbicularisOris_R_compress',
'm_orbicularisOris_L_purse',
'm_orbicularisOris_R_purse',
'm_orbicularisOris_L_partRound',
'm_orbicularisOris_R_partRound',
'm_orbicularisOris_L_partRectangle',
'm_orbicularisOris_R_partRectangle',
'm_caninus_L_raiseUpperLipCanine',
'm_caninus_R_raiseUpperLipCanine',
'm_quadratiLabiiSuperior_L_raiseUpperLipNasolabial',
'm_quadratiLabiiSuperior_R_raiseUpperLipNasolabial',
'm_ownElevator_L_raiseUpperLipNotNostril',
'm_ownElevator_R_raiseUpperLipNotNostril',
'm_buccininatorius_L_compressCheeksWidenMouth',
'm_buccininatorius_R_compressCheeksWidenMouth',
'm_zygomaticusMinor_L_drawUpperLipBackward',
'm_zygomaticusMinor_R_drawUpperLipBackward',
'm_zygomaticusMinor_L_drawUpperLipUpward',
'm_zygomaticusMinor_R_drawUpperLipUpward',
'm_zygomaticusMinor_L_drawUpperLipOutward',
'm_zygomaticusMinor_R_drawUpperLipOutward',
'm_zygomaticusMajor_L_smile',
'm_zygomaticusMajor_R_smile',
'm_risorius_L_pullMouthLaterally',
'm_risorius_R_pullMouthLaterally',
'm_quadratisLabiiInferior_L_depressLowerLip',
'm_quadratisLabiiInferior_R_depressLowerLip',
'm_quadratisLabiiInferior_L_extendLowerLip',
'm_quadratisLabiiInferior_R_extendLowerLip',
'm_triangularis_L_frown',
'm_triangularis_R_frown',
'm_mentalis_L_raiseLowerLip',
'm_mentalis_R_raiseLowerLip',
'm_mentalis_L_wrinkleChinPout',
'm_mentalis_R_wrinkleChinPout',
'm_masseter_L_openMouthWide',
'm_masseter_R_openMouthWide',
'm_masseter_L_clenchJaw',
'm_masseter_R_clenchJaw',
'm_platysma_L_drawLowerLipSideDown',
'm_platysma_R_drawLowerLipSideDown' };
ACTIONS OF UNIVERSAL EXPRESSIONS:
Anger:
bilateral/symmetric expression
$ anger
anger := {'b_corrugator_L_lower',
'b_corrugator_L_drawInward',
'b_corrugator_L_wrinkleVertical',
'b_corrugator_R_lower',
'b_corrugator_R_drawInward',
'b_corrugator_R_wrinkleVertical',
'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb',
'e_orbicularisOculi_L_narrow',
'e_orbicularisOculi_R_narrow',
'n_procerus_wrinklesAboveBridge',
'n_noseExpander_L_flareNostril',
'n_noseExpander_R_flareNostril',
'm_quadratiLabiiSuperior_L_raiseUpperLipNasolabial',
'm_quadratiLabiiSuperior_R_raiseUpperLipNasolabial',
'm_masseter_L_openMouthWide',
'm_masseter_R_openMouthWide',
'm_masseter_L_clenchJaw',
'm_masseter_R_clenchJaw',
'm_quadratisLabiiInferior_L_depressLowerLip',
'm_quadratisLabiiInferior_R_depressLowerLip',
'm_quadratisLabiiInferior_L_extendLowerLip',
'm_quadratisLabiiInferior_R_extendLowerLip',
'm_triangularis_L_frown',
'm_triangularis_R_frown',
'm_mentalis_L_raiseLowerLip',
'm_mentalis_R_raiseLowerLip',
'm_mentalis_L_wrinkleChinPout',
'm_mentalis_R_wrinkleChinPout',
'm_platysma_L_drawLowerLipSideDown',
'm_platysma_R_drawLowerLipSideDown',
'm_orbicularisOris_L_compress',
'm_orbicularisOris_R_compress',
'm_orbicularisOris_L_partRectangle',
'm_orbicularisOris_R_partRectangle' };
angerReliable := { 'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb',
'm_orbicularisOris_L_compress',
'm_orbicularisOris_R_compress' };
Contempt:
unilateral/asymmetric expression
contemptLeft := { 'm_caninus_L_raiseUpperLipCanine',
'n_noseExpander_L_flareNostril' };
contemptRight := { 'n_noseExpander_R_flareNostril',
'm_caninus_R_raiseUpperLipCanine' };
contempt := { 'n_procerus_wrinklesAboveBridge', 'e_orbicularisOculi_L_downwardGaze',
'e_orbicularisOculi_R_downwardGaze', 'e_orbicularisOculi_L_upwardAwayGaze',
'e_orbicularisOculi_R_upwardAwayGaze',
'm_quadratiLabiiSuperior_L_raiseUpperLipNasolabial',
'm_quadratiLabiiSuperior_R_raiseUpperLipNasolabial', contemptLeft, contemptRight };
mutually exclusive
contemptLeftME := { 'n_noseExpander_R_flareNostril',
'm_caninus_R_raiseUpperLipCanine' };
contemptRightME := { 'm_caninus_L_raiseUpperLipCanine',
'n_noseExpander_L_flareNostril' };
contemptReliableLeft := {'m_caninus_L_raiseUpperLipCanine' };
contemptReliableLeftME := {'m_caninus_R_raiseUpperLipCanine' };
contemptReliableRight:= {'m_caninus_R_raiseUpperLipCanine' };
contemptReliableRightME := {'m_caninus_L_raiseUpperLipCanine' };
Disgust:
bilateral/symmetric expression
disgust := { 'b_corrugator_L_lower',
'b_corrugator_L_wrinkleVertical',
'b_corrugator_R_lower',
'b_corrugator_R_wrinkleVertical',
'n_procerus_wrinklesAboveBridge',
'e_orbicularisOculi_L_narrow',
'e_orbicularisOculi_R_narrow',
'e_orbicularisOculi_L_upwardAwayGaze',
'e_orbicularisOculi_R_upwardAwayGaze',
'n_nasalisAlaris_L_narrowNostril',
'n_nasalisAlaris_R_narrowNostril',
'm_quadratiLabiiSuperior_L_raiseUpperLipNasolabial',
'm_quadratiLabiiSuperior_R_raiseUpperLipNasolabial',
'm_orbicularisOris_L_partRound',
'm_orbicularisOris_R_partRound',
'm_zygomaticusMinor_L_drawUpperLipUpward',
'm_zygomaticusMinor_R_drawUpperLipUpward',
'm_mentalis_L_raiseLowerLip',
'm_mentalis_R_raiseLowerLip',
'm_quadratisLabiiInferior_L_extendLowerLip',
'm_quadratisLabiiInferior_R_extendLowerLip' };
disgustReliable := {};
Enjoyment:
bilateral/symmetric expression
vertical forehead wrinkles never present in true enjoyment expression
enjoymentME:= { 'b_corrugator_L_wrinkleVertical',
'b_corrugator_R_wrinkleVertical' };
enjoymentCanines := { 'm_caninus_L_raiseUpperLipCanine',
'm_caninus_R_raiseUpperLipCanine' };
Both crow’s feet and raised cheeks must be present.
enjoymentCrowsNasolabial := { 'e_orbicularisOculi_L_crowsFeet',
'e_orbicularisOculi_R_crowsFeet',
'm_quadratiLabiiSuperior_L_raiseUpperLipNasolabial',
'm_quadratiLabiiSuperior_R_raiseUpperLipNasolabial' };
enjoyment := { 'e_orbicularisOculi_L_narrow',
'e_orbicularisOculi_R_narrow',
enjoymentCrowsNasolabial, 'n_procerus_wrinklesAboveBridge',
'm_orbicularisOris_L_partRound',
'm_orbicularisOris_R_partRound',
enjoymentCanines,
'm_zygomaticusMajor_L_smile',
'm_zygomaticusMajor_R_smile',
'm_masseter_L_openMouthWide',
'm_masseter_R_openMouthWide' };
enjoymentReliable := { enjoymentCrowsNasolabial, 'm_zygomaticusMajor_L_smile',
'm_zygomaticusMajor_R_smile' };
Fear:
bilateral/symmetric expression
fear := { 'b_frontalis_L_lift',
'b_frontalis_L_wrinkleHorizontal',
'b_frontalis_R_lift',
'b_frontalis_R_wrinkleHorizontal',
'b_corrugator_L_drawInward',
'b_corrugator_R_drawInward',
'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb',
'e_orbicularisOculi_L_widen',
'e_orbicularisOculi_R_widen',
'm_risorius_L_pullMouthLaterally',
'm_risorius_R_pullMouthLaterally',
'm_platysma_L_drawLowerLipSideDown',
'm_platysma_R_drawLowerLipSideDown',
'm_buccininatorius_L_compressCheeksWidenMouth',
'm_buccininatorius_R_compressCheeksWidenMouth' };
fearReliable := { 'b_frontalis_L_lift',
'b_frontalis_R_lift',
'b_corrugator_L_drawInward',
'b_corrugator_R_drawInward',
'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb' };
Sadness:
sadness := { 'b_frontalis_L_wrinkleHorizontal',
'b_frontalis_R_wrinkleHorizontal',
'b_corrugator_L_lower',
'b_corrugator_L_drawInward',
'b_corrugator_R_lower',
'b_corrugator_R_drawInward',
'n_procerus_wrinklesAboveBridge',
'e_orbicularisOculi_L_narrow',
'e_orbicularisOculi_R_narrow',
'e_orbicularisOculi_L_close',
'e_orbicularisOculi_R_close',
'e_orbicularisOculi_L_downwardGaze',
'e_orbicularisOculi_R_downwardGaze',
'n_nasalisAlaris_L_narrowNostril',
'n_nasalisAlaris_R_narrowNostril',
'm_quadratiLabiiSuperior_L_raiseUpperLipNasolabial',
'm_quadratiLabiiSuperior_R_raiseUpperLipNasolabial',
'm_quadratisLabiiInferior_L_depressLowerLip',
'm_quadratisLabiiInferior_R_depressLowerLip',
'm_triangularis_L_frown',
'm_triangularis_R_frown',
'm_mentalis_L_wrinkleChinPout',
'm_mentalis_R_wrinkleChinPout',
'm_platysma_L_drawLowerLipSideDown',
'm_platysma_R_drawLowerLipSideDown' };
sadnessReliable := { 'b_corrugator_L_lower',
'b_corrugator_L_drawInward',
'b_corrugator_R_lower',
'b_corrugator_R_drawInward' };
Surprise:
bilateral/symmetric expression
surprise := {'b_frontalis_L_lift', 'b_frontalis_L_wrinkleHorizontal', 'b_frontalis_R_lift',
'b_frontalis_R_wrinkleHorizontal', 'e_orbicularisOculi_L_flashbulb',
'e_orbicularisOculi_R_flashbulb', 'e_orbicularisOculi_L_widen', 'e_orbicularisOculi_R_widen',
'm_masseter_L_openMouthWide', 'm_masseter_R_openMouthWide',
'm_orbicularisOris_L_partRound', 'm_orbicularisOris_R_partRound' };
Inconclusive:
inconclusive := {};
APPENDIX B: SETL PROGRAM TO ANALYZE FACIAL EXPRESSIONS
program emotions;
$ all actions
facialActions :=
{'b_frontalis_lift','b_frontalis_wrinkleHorizontal','b_corrugator_lower','b_corrugator_drawInwar
d','b_corrugator_wrinkleVertical','n_procerus_wrinklesAboveBridge','n_nasalisTransversa_raise
Nostril','n_nasalisAlaris_expandNostril','n_nasalisAlaris_narrowNostril','n_noseExpander_flareN
ostril','e_orbicularisOculi_flashbulb','e_orbicularisOculi_widen','e_orbicularisOculi_narrow','e_o
rbicularisOculi_close','e_orbicularisOculi_crowsFeet','e_orbicularisOculi_downwardGaze','m_or
bicularisOris_compress','m_orbicularisOris_purse','m_orbicularisOris_partRound','m_caninus_ra
iseUpperLipCanine','m_quadratiLabiiSuperior_raiseUpperLipNasolabial','m_ownElevator_raise
UpperLipNotNostril','m_buccininatorius_compressCheeksWidenMouth','m_zygomaticusMinor_
drawUpperLipBackward','m_zygomaticusMinor_drawUpperLipUpward','m_zygomaticusMinor_
drawUpperLipOutward','m_zygomaticusMajor_smile','m_risorius_pullMouthLaterally','m_quadr
atisLabiiInferior_depressLowerLip','m_quadratisLabiiInferior_extendLowerLip','m_triangularis_
frown','m_mentalis_raiseLowerLip','m_mentalis_wrinkleChinPout','m_masseter_openMouthWid
e','m_masseter_clenchJaw','m_platysma_drawLowerLipSideDown'};
$ anger
anger :=
{'b_corrugator_lower_L','b_corrugator_drawInward_L','b_corrugator_wrinkleVertical_L','b_corr
ugator_lower_R','b_corrugator_drawInward_R','b_corrugator_wrinkleVertical_R','e_orbicularis
Oculi_flashbulb_L','e_orbicularisOculi_flashbulb_R','e_orbicularisOculi_narrow_L','e_orbiculari
sOculi_narrow_R','n_procerus_wrinklesAboveBridge_L','n_procerus_wrinklesAboveBridge_R','
n_noseExpander_flareNostril_L','n_noseExpander_flareNostril_R','m_quadratiLabiiSuperior_rai
seUpperLipNasolabial_L','m_quadratiLabiiSuperior_raiseUpperLipNasolabial_R','m_masseter_o
penMouthWide_L','m_masseter_openMouthWide_R','m_masseter_clenchJaw_L','m_masseter_cl
enchJaw_R','m_quadratisLabiiInferior_depressLowerLip_L','m_quadratisLabiiInferior_depressL
owerLip_R','m_quadratisLabiiInferior_extendLowerLip_L','m_quadratisLabiiInferior_extendLo
werLip_R','m_triangularis_frown_L','m_triangularis_frown_R','m_mentalis_raiseLowerLip_L','
m_mentalis_raiseLowerLip_R','m_mentalis_wrinkleChinPout_L','m_mentalis_wrinkleChinPout
_R','m_platysma_drawLowerLipSideDown_L','m_platysma_drawLowerLipSideDown_R','m_or
bicularisOris_compress_L','m_orbicularisOris_compress_R','m_orbicularisOris_partRectangle_L
','m_orbicularisOris_partRectangle_R'};
$ another test for anger is whether it is visible in all three areas of the face
angerReliable :=
{'e_orbicularisOculi_flashbulb_L','e_orbicularisOculi_flashbulb_R','m_orbicularisOris_compres
s_L','m_orbicularisOris_compress_R'};
$ contempt
contemptCanineLeft := {'m_caninus_raiseUpperLipCanine_L'};
contemptCanineRight := {'m_caninus_raiseUpperLipCanine_R'};
contemptLeft := {'m_caninus_raiseUpperLipCanine_L','n_noseExpander_flareNostril_L'};
contemptLeftME := {'n_noseExpander_flareNostril_R','m_caninus_raiseUpperLipCanine_R'};
contemptRight := {'n_noseExpander_flareNostril_R','m_caninus_raiseUpperLipCanine_R'};
contemptRightME := {'m_caninus_raiseUpperLipCanine_L','n_noseExpander_flareNostril_L'};
contempt := {'n_procerus_wrinklesAboveBridge_L','n_procerus_wrinklesAboveBridge_R',
'e_orbicularisOculi_downwardGaze_R','e_orbicularisOculi_downwardGaze_L',
'e_orbicularisOculi_upwardAwayGaze_L','e_orbicularisOculi_upwardAwayGaze_R',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_L',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_R',
{contemptLeft},{contemptRight}};
$ symmetric difference - exclusive or - for contempt's unilateral reliable indicator
contemptReliable := contemptCanineLeft mod contemptCanineRight;
$ disgust
disgust :=
{'b_corrugator_lower_L','b_corrugator_wrinkleVertical_L','b_corrugator_lower_R','b_corrugator
_wrinkleVertical_R','n_procerus_wrinklesAboveBridge_L','n_procerus_wrinklesAboveBridge_R
','e_orbicularisOculi_narrow_L','e_orbicularisOculi_narrow_R','e_orbicularisOculi_upwardAway
Gaze_L','e_orbicularisOculi_upwardAwayGaze_R','n_nasalisAlaris_narrowNostril_L','n_nasalis
Alaris_narrowNostril_R','m_quadratiLabiiSuperior_raiseUpperLipNasolabial_L','m_quadratiLab
iiSuperior_raiseUpperLipNasolabial_R','m_orbicularisOris_partRound_L','m_orbicularisOris_pa
rtRound_R','m_zygomaticusMinor_drawUpperLipUpward_L','m_zygomaticusMinor_drawUppe
rLipUpward_R','m_mentalis_raiseLowerLip_L','m_mentalis_raiseLowerLip_R','m_quadratisLab
iiInferior_extendLowerLip_L','m_quadratisLabiiInferior_extendLowerLip_R'};
disgustReliable := {};
$ enjoyment
$ vertical forehead wrinkles never present in true enjoyment expression
enjoymentME:= {'b_corrugator_wrinkleVertical_L','b_corrugator_wrinkleVertical_R'};
enjoymentCanines :=
{'m_caninus_raiseUpperLipCanine_L','m_caninus_raiseUpperLipCanine_R'};
$ Both crow's feet and raised cheeks must be present in enjoyment
enjoymentCrowsNasolabial :=
{'e_orbicularisOculi_crowsFeet_L','e_orbicularisOculi_crowsFeet_R',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_L',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_R'};
enjoyment :=
{'e_orbicularisOculi_narrow_L','e_orbicularisOculi_narrow_R',
{enjoymentCrowsNasolabial},
'n_procerus_wrinklesAboveBridge_L','n_procerus_wrinklesAboveBridge_R',
'm_orbicularisOris_partRound_L','m_orbicularisOris_partRound_R',
{enjoymentCanines},
'm_zygomaticusMajor_smile_L','m_zygomaticusMajor_smile_R',
'm_masseter_openMouthWide_L','m_masseter_openMouthWide_R'};
enjoymentReliable := {'m_zygomaticusMajor_smile_L','m_zygomaticusMajor_smile_R',
{enjoymentCrowsNasolabial}};
$ fear
fear :=
{'b_frontalis_lift_L','b_frontalis_wrinkleHorizontal_L',
'b_frontalis_lift_R','b_frontalis_wrinkleHorizontal_R',
'b_corrugator_drawInward_L','b_corrugator_drawInward_R',
'e_orbicularisOculi_flashbulb_L','e_orbicularisOculi_flashbulb_R',
'e_orbicularisOculi_widen_L','e_orbicularisOculi_widen_R',
'm_risorius_pullMouthLaterally_L','m_risorius_pullMouthLaterally_R',
'm_platysma_drawLowerLipSideDown_L','m_platysma_drawLowerLipSideDown_R',
'm_buccininatorius_compressCheeksWidenMouth_L',
'm_buccininatorius_compressCheeksWidenMouth_R'};
$ check whether eyebrow lift is really reliable
fearReliable :=
{'b_frontalis_lift_L','b_frontalis_lift_R',
'b_corrugator_drawInward_L','b_corrugator_drawInward_R',
'e_orbicularisOculi_flashbulb_L','e_orbicularisOculi_flashbulb_R'};
$ sadness
sadness :=
{'b_frontalis_wrinkleHorizontal_L','b_frontalis_wrinkleHorizontal_R',
'b_corrugator_lower_L','b_corrugator_drawInward_L',
'b_corrugator_lower_R','b_corrugator_drawInward_R',
'n_procerus_wrinklesAboveBridge_L','n_procerus_wrinklesAboveBridge_R',
'e_orbicularisOculi_narrow_L','e_orbicularisOculi_narrow_R',
'e_orbicularisOculi_close_L','e_orbicularisOculi_close_R',
'e_orbicularisOculi_downwardGaze_L','e_orbicularisOculi_downwardGaze_R',
'n_nasalisAlaris_narrowNostril_L','n_nasalisAlaris_narrowNostril_R',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_L',
'm_quadratiLabiiSuperior_raiseUpperLipNasolabial_R',
'm_quadratisLabiiInferior_depressLowerLip_L',
'm_quadratisLabiiInferior_depressLowerLip_R',
'm_triangularis_frown_L','m_triangularis_frown_R',
'm_mentalis_wrinkleChinPout_L','m_mentalis_wrinkleChinPout_R',
'm_platysma_drawLowerLipSideDown_L','m_platysma_drawLowerLipSideDown_R'};
sadnessReliable :=
{'b_corrugator_lower_L','b_corrugator_drawInward_L',
'b_corrugator_lower_R','b_corrugator_drawInward_R'};
$ surprise
surprise :=
{'b_frontalis_lift_R','b_frontalis_lift_L',
'b_frontalis_wrinkleHorizontal_L','b_frontalis_wrinkleHorizontal_R',
'e_orbicularisOculi_flashbulb_L','e_orbicularisOculi_flashbulb_R',
'e_orbicularisOculi_widen_L','e_orbicularisOculi_widen_R',
'm_masseter_openMouthWide_L','m_masseter_openMouthWide_R',
'm_orbicularisOris_partRound_L','m_orbicularisOris_partRound_R'};
surpriseReliable := {};
$ default or inconclusive
neutral := {};
inconclusive := {};
universalEmotions := {
anger,contempt,disgust,enjoyment,fear,sadness,surprise,neutral,inconclusive };
$ input, parsed, and tokenized sets
leftCsvSet := {};
rightCsvSet := {};
sumCsvSet := {};
csvSplitL := {};
csvSplitR := {};
$ input some csv values
write ('enter some csv values for the left side of the face:');
get(csvL);
$ parse text
csvSplitL := split(csvL,",");
$ populate left set
(for a in csvSplitL)
leftCsvSet with:= a + '_L';
end;
print();
write ('is the right side symmetrical to the left - Y/N ?:');
get(answer);
case of
(answer = 'y' or answer = 'Y'):
$ copy left side actions to right
(for a in csvSplitL)
rightCsvSet with:= a + '_R';
end;
(answer = 'n' or answer = 'N'):
$ input some csv values
write ('enter some csv values for the right side of the face:');
get(csvR);
$ parse text
csvSplitR := split(csvR,",");
$ populate right side actions to set
(for a in csvSplitR)
rightCsvSet with:= a + '_R';
end;
else
$ error handling ought to be here
print('you ought to type more carefully.');
end case;
sumCsvSet := leftCsvSet + rightCsvSet;
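The input-handling flow above (split the CSV text, suffix each action with its facial side, mirror the left side when the face is symmetrical, then union the two sides) can be sketched in Python. `build_input_set` is a hypothetical helper used only for illustration, not part of the SETL program:

```python
def build_input_set(csv_left, csv_right=None):
    # Suffix each comma-separated action label by facial side.
    # csv_right=None means the face is symmetrical, so the left-side
    # actions are mirrored to the right (as in the Y/N branch above).
    left = {a.strip() + '_L' for a in csv_left.split(',') if a.strip()}
    source = csv_left if csv_right is None else csv_right
    right = {a.strip() + '_R' for a in source.split(',') if a.strip()}
    return left | right  # equivalent of sumCsvSet

print(build_input_set('b_frontalis_lift, e_orbicularisOculi_widen'))
```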
$ intersections of input and emotion sets
angerSet := sumCsvSet * anger;
contemptSet := sumCsvSet * contempt;
disgustSet := sumCsvSet * disgust;
enjoymentSet := sumCsvSet * enjoyment;
fearSet := sumCsvSet * fear;
sadnessSet := sumCsvSet * sadness;
surpriseSet := sumCsvSet * surprise;
emotionTally :={ #angerSet, #contemptSet, #disgustSet, #enjoymentSet, #fearSet, #sadnessSet,
#surpriseSet };
maxEmotion := 0;
$ intersections of input and emotion reliable sets
angerReliableSet := sumCsvSet * angerReliable;
contemptReliableSet := sumCsvSet * contemptReliable;
disgustReliableSet := sumCsvSet * disgustReliable;
enjoymentReliableSet := sumCsvSet * enjoymentReliable;
fearReliableSet := sumCsvSet * fearReliable;
sadnessReliableSet := sumCsvSet * sadnessReliable;
surpriseReliableSet := sumCsvSet * surpriseReliable;
reliableEmotionTally :={ #angerReliableSet, #contemptReliableSet, #disgustReliableSet,
#enjoymentReliableSet, #fearReliableSet, #sadnessReliableSet, #surpriseReliableSet };
maxReliableEmotion := 0;
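The tallying step above intersects the observed action set with each emotion's definition set and counts the overlap. A miniature Python sketch of the same idea, assuming two abbreviated emotion sets (the full sets are defined earlier in this appendix):

```python
# Abbreviated emotion definition sets, for illustration only.
emotions = {
    'fear': {'b_frontalis_lift_L', 'b_frontalis_lift_R',
             'e_orbicularisOculi_flashbulb_L', 'e_orbicularisOculi_flashbulb_R'},
    'surprise': {'b_frontalis_lift_L', 'b_frontalis_lift_R',
                 'm_masseter_openMouthWide_L', 'm_masseter_openMouthWide_R'},
}
observed = {'b_frontalis_lift_L', 'b_frontalis_lift_R',
            'e_orbicularisOculi_flashbulb_L'}

# Like angerSet := sumCsvSet * anger, etc.: count common elements per emotion.
tally = {name: len(observed & actions) for name, actions in emotions.items()}
best = max(tally, key=tally.get)
print(tally, best)  # {'fear': 3, 'surprise': 2} fear
```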
print();
$ print left set
print('new left set:');
(for a in leftCsvSet)
print(a);
end;
print();
$ print right set
print('new right set:');
(for a in rightCsvSet)
print(a);
end;
print();
$ print whole input set
print('new whole set:');
(for a in sumCsvSet)
print(a);
end;
print();
$ intersection of input and anger - at least one common element
$ the reliable set is flashbulb eyes and compressed lips
$ an additional test for anger should be: is at least one of its elements
$ visible in all three areas of the face?
write ('Testing for anger');
$ with only seven emotions this could be hard coded instead of conditional
if (#angerSet > 0)
then write ('Input is contained in the anger set. Input contains '+ #angerSet + ' of the ' + #anger
+ ' elements of anger.');
if (#angerReliable > 0)
then write('There is a reliable set for anger. Input contains ' + #angerReliableSet + ' of the ' +
#angerReliable + ' reliable elements of anger.');
else
write('There is no reliable set for anger');
end if;
else
write('Input not found in the anger set.');
end if;
print();
$ intersection of input and contempt - at least one common element
$ here the reliable indicator of contempt is exposing the canines on one side
$ or the other, but not both
write ('Testing for contempt');
if (#contemptSet > 0)
then write ('Input is contained in the contempt set. Input contains '+ #contemptSet + ' of the ' +
#contempt + ' elements of contempt.');
if #contemptReliable > 0
then write('There is a reliable set for contempt. Input contains ' + #contemptReliableSet + ' of
the ' + #contemptReliable + ' reliable elements of contempt.');
else
write('There is no reliable set for contempt');
end if;
else
write('Input not found in the contempt set.');
end if;
print();
$ intersection of input and disgust - at least one common element
$ there is no reliable indicator of disgust
write ('Testing for disgust');
if (#disgustSet > 0)
then write ('Input is contained in the disgust set. Input contains '+ #disgustSet + ' of the ' +
#disgust + ' elements of disgust.');
if #disgustReliable > 0
then write('There is a reliable set for disgust. Input contains ' + #disgustReliableSet + ' of the '
+ #disgustReliable + ' reliable elements of disgust.');
else
write('There is no reliable set for disgust');
end if;
else
write('Input not found in the disgust set.');
end if;
print();
$ intersection of input and enjoyment - at least one common element
$ crow's feet and raised cheeks are the reliable indicators
write ('Testing for enjoyment');
if (#enjoymentSet > 0)
then write ('Input is contained in the enjoyment set. Input contains '+ #enjoymentSet + ' of the '
+ #enjoyment + ' elements of enjoyment.');
if #enjoymentReliable > 0
then write('There is a reliable set for enjoyment. Input contains ' + #enjoymentReliableSet + '
of the ' + #enjoymentReliable + ' reliable elements of enjoyment.');
else
write('There is no reliable set for enjoyment');
end if;
else
write('Input not found in the enjoyment set.');
end if;
print();
$ intersection of input and fear - at least one common element
$ the reliable indicators are flashbulb eyes with raised brow or brows lowered and drawn inward
write ('Testing for fear');
if (#fearSet > 0)
then write ('Input is contained in the fear set. Input contains '+ #fearSet + ' of the ' + #fear + '
elements of fear.');
if #fearReliable > 0
then write('There is a reliable set for fear. Input contains ' + #fearReliableSet + ' of the ' +
#fearReliable + ' reliable elements of fear.');
else
write('There is no reliable set for fear');
end if;
else
write('Input not found in the fear set.');
end if;
print();
$ intersection of input and sadness - at least one common element
$ reliable set for sadness is brow lowered and drawn inward
write ('Testing for sadness');
if (#sadnessSet > 0)
then write ('Input is contained in the sadness set. Input contains '+ #sadnessSet + ' of the ' +
#sadness + ' elements of sadness.');
if #sadnessReliable > 0
then write('There is a reliable set for sadness. Input contains ' + #sadnessReliableSet + ' of the
' + #sadnessReliable + ' reliable elements of sadness.');
else
write('There is no reliable set for sadness');
end if;
else
write('Input not found in the sadness set.');
end if;
print();
$ intersection of input and surprise - at least one common element
$ the reliable indicator of surprise is temporal - it is a fleeting emotion,
$ not detectable by my program
write ('Testing for surprise');
if (#surpriseSet > 0)
then write ('Input is contained in the surprise set. Input contains '+ #surpriseSet + ' of the ' +
#surprise + ' elements of surprise.');
if #surpriseReliable > 0
then write('There is a reliable set for surprise. Input contains ' + #surpriseReliableSet + ' of the
' + #surpriseReliable + ' reliable elements of surprise.');
else
write('There is no reliable set for surprise');
end if;
else
write('Input not found in the surprise set.');
end if;
print();
end emotions;
APPENDIX C: FORM TO ACCEPT INPUT FOR
ANALYSIS OF FACIAL EXPRESSIONS
This form, with example images illustrating disgust and a neutral expression, is located at
http://appsrv.pace.edu/lubin/dev/rk/identifyEmotion.html. The images for the facial actions are
from Artanatomy by Victoria Contreras Flores (illustrations) and Emotion in the Human Face by
Paul Ekman (photographs).
INSTRUCTIONS: Look at the photo on the right and choose the action(s) that best represent
the photo. Choose all that apply (may also choose none). Be sure to look at the subject's neutral
expression for comparison. Note only the differences between the expression photo and the
neutral photo (actions present in the neutral photo do not indicate emotion). If you have problems
with this form, e-mail [email protected]
1) EYEBROWS
Look at the eyebrows and choose the action(s) that best represent the action depicted in the
photo.
Choose as many as apply (may also choose none):
expression neutral
vertical lines between brows
horizontal wrinkles across
entire forehead
horizontal wrinkles across
center only of forehead
Look at the eyebrows and choose the action that best represents the action depicted in the photo.
Choose only one (may also choose none):
expression neutral
lowered brow
lowered brow drawn inward
brow lowered with inner
corners drawn together and up
brows raised high and
arched
brow raised and with inner
corners drawn together
none of the above
2) EYES
Look at the eyes and choose the action(s) that best represent the action depicted in the photo.
Choose all that apply (may also choose none):
expression neutral
lower lids raised but not tense
whites visible above the iris,
or above and below iris
lower lid raised, but not tense,
narrowing eyes,
with lines beneath the eye
tensed upper lids with lowered
brow and
upper lid lowered and covering
part of iris
tensed upper lids
wrinkles or 'bags' beneath
lower lids
'crow's feet' wrinkles at outer
edges
lower eyelid tensed
glaring or narrowed eyes glaring
narrowed
upper lid raised and lower lid
relaxed
upper lid raised and lower lid
tensed
upper lid, inward corner
raised, giving a
triangular shape to eye
eyebrow and eye cover fold
slightly lowered, narrowing the
eye
Look at the eyes and choose the action that best represents the action depicted in the photo.
Choose only one:
upward or away gaze with
head tilted
may also be upward gaze
downward gaze
neither (gaze straight ahead)
3) NOSE
Look at the nose and choose the action that best represents the action depicted in the photo.
Choose only one:
moderate wrinkles across
sides and bridge of nose
extreme wrinkles across
sides and bridge of nose
nostril visibly raised on one
side or the other (not both)
may be raised on either side
none of the above
Look at the nose - are the nostrils dilated?
dilated nostrils
not dilated
4) MOUTH
Look at the mouth and choose the action(s) that best represent the action depicted in the
photo.
Choose all that apply (may also choose none):
lower lip lowered and
pushed out, exposing the
teeth and possibly tongue
upper lip raised very high
and close to the nose
upper lip moderately raised
Look at the mouth and choose the action that best represents the action depicted in the photo.
Choose only one (may also choose none):
lower lip raised and pushed
up near upper lip
lower lip lowered and
pushed out, exposing the teeth
neither of above
Look at the mouth and choose the action that best represents the action depicted in the photo.
Choose only one (may also choose none):
corners of lips drawn up and
back
(smile) possibly exposing teeth.
no teeth
teeth visible
corners drawn down
(frown)
slightly open, rounded
relaxed mouth
narrowed lips pressed
together
mouth open in horizontal
shape, as if shouting
lips stretched and tense with
corners drawn back
rectangular open, tense
mouth
mouth open and parted with
lower lip raised
moderate or wide open,
rounded, relaxed mouth
gaping, rounded, relaxed,
dropped jaw
lips pressed together and
outer corner visibly raised on
one side
or the other (not both), possibly
exposing the canine tooth
may be raised on either side
none of above
Look at the mouth and choose the action that best represents the action depicted in the photo.
Choose only one (may also choose none):
cheeks raised with visible
naso-labial fold
from nostril to outer corner of
mouth
cheeks raised with deep
naso-labial fold
from nostril to outer corner of
mouth
neither of above
SUBMIT FORM CLEAR FORM
(the above are standard HTML form buttons)
APPENDIX D: SQL/COLDFUSION PROGRAM TO ANALYZE INPUT
AND REPORT ON PRESENCE OF FACIAL EXPRESSIONS
<!---retrieve all inputted labels--->
<cfquery name="getLabels" datasource="LubinDB">
SELECT label, weight, [emotion]
FROM emotion
WHERE label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!---queries for each emotion--->
<cfquery name="getAngerTally" dbtype="query">
SELECT label, weight, [emotion]
FROM getLabels
WHERE [emotion] = 'anger'
</cfquery>
<cfquery name="getContemptTally" dbtype="query">
SELECT label, weight, [emotion]
FROM getLabels
WHERE [emotion] = 'contempt'
</cfquery>
<cfquery name="getDisgustTally" dbtype="query">
SELECT label, weight, [emotion]
FROM getLabels
WHERE [emotion] = 'disgust'
</cfquery>
<cfquery name="getFearTally" dbtype="query">
SELECT label, weight, [emotion]
FROM getLabels
WHERE [emotion] = 'fear'
</cfquery>
<cfquery name="getHappinessTally" dbtype="query">
SELECT label, weight, [emotion]
FROM getLabels
WHERE [emotion] = 'happiness'
</cfquery>
<cfquery name="getSadnessTally" dbtype="query">
SELECT label, weight, [emotion]
FROM getLabels
WHERE [emotion] = 'sadness'
</cfquery>
<cfquery name="getSurpriseTally" dbtype="query">
SELECT label, weight, [emotion]
FROM getLabels
WHERE [emotion] = 'surprise'
</cfquery>
<!---weighted average query for anger--->
<cfquery name="getAngerWeight" datasource="LubinDB">
SELECT sum(weight) "angerSum"
FROM emotion
WHERE [emotion] = 'anger'
AND label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!---weighted average query for contempt--->
<cfquery name="getContemptWeight" datasource="LubinDB">
SELECT sum(weight) "contemptSum"
FROM emotion
WHERE [emotion] = 'contempt'
AND label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!---weighted average query for disgust--->
<cfquery name="getDisgustWeight" datasource="LubinDB">
SELECT sum(weight) "disgustSum"
FROM emotion
WHERE [emotion] = 'disgust'
AND label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!---weighted average query for fear--->
<cfquery name="getFearWeight" datasource="LubinDB">
SELECT sum(weight) "fearSum"
FROM emotion
WHERE [emotion] = 'fear'
AND label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!---weighted average query for happiness--->
<cfquery name="getHappinessWeight" datasource="LubinDB">
SELECT sum(weight) "happinessSum"
FROM emotion
WHERE [emotion] = 'happiness'
AND label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!---weighted average query for sadness--->
<cfquery name="getSadnessWeight" datasource="LubinDB">
SELECT sum(weight) "sadnessSum"
FROM emotion
WHERE [emotion] = 'sadness'
AND label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<!---weighted average query for surprise--->
<cfquery name="getSurpriseWeight" datasource="LubinDB">
SELECT sum(weight) "surpriseSum"
FROM emotion
WHERE [emotion] = 'surprise'
AND label IN ('#FORM.b1#',
'#FORM.b3#',
'#FORM.b5#',
'#FORM.bex#',
'#FORM.e1#',
'#FORM.e2#',
'#FORM.e3#',
'#FORM.e4#',
'#FORM.e5#',
'#FORM.e6#',
'#FORM.e7#',
'#FORM.e8#',
'#FORM.e9#',
'#FORM.e10#',
'#FORM.e11#',
'#FORM.e12#',
'#FORM.e13#',
'#FORM.e14#',
'#FORM.eex#',
'#FORM.n1#',
'#FORM.nex#',
'#FORM.m1ex#',
'#FORM.m11#',
'#FORM.m12#',
'#FORM.m5#',
'#FORM.m2ex#',
'#FORM.m3ex#')
</cfquery>
<CFSET angerWeight = getAngerWeight.angerSum>
<CFSET contemptWeight = getContemptWeight.contemptSum>
<CFSET disgustWeight = getDisgustWeight.disgustSum>
<CFSET fearWeight = getFearWeight.fearSum>
<CFSET happinessWeight = getHappinessWeight.happinessSum>
<CFSET sadnessWeight = getSadnessWeight.sadnessSum>
<CFSET surpriseWeight = getSurpriseWeight.surpriseSum>
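The queries above compute each emotion's weighted score: every form label carries a weight per emotion in the `emotion` table, and the score is the sum of the weights of the labels the user actually selected. A Python sketch of that logic, with an illustrative table whose label names and weights are hypothetical, not the thesis's real data:

```python
# Illustrative (label, weight, emotion) rows standing in for the emotion table.
emotion_table = [
    ('loweredBrow',    0.5, 'anger'),
    ('compressedLips', 1.0, 'anger'),
    ('crowsFeet',      1.0, 'happiness'),
    ('smile',          0.5, 'happiness'),
]
selected = {'loweredBrow', 'compressedLips', 'smile'}  # labels chosen on the form

# Equivalent of SELECT sum(weight) ... WHERE emotion = X AND label IN (...)
scores = {}
for label, weight, emotion in emotion_table:
    if label in selected:
        scores[emotion] = scores.get(emotion, 0.0) + weight
print(scores)  # {'anger': 1.5, 'happiness': 0.5}
```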
<!---
Weighted score for each emotion:<BR><BR>
<cfoutput query="getAngerWeight">
<cfif angerWeight gt 0>
Anger: #getAngerWeight.angerSum# <BR>
<cfelse>
Anger: 0.00<BR>
</cfif>
</cfoutput>
<cfoutput query="getContemptWeight">
<cfif contemptWeight gt 0>
Contempt: #getContemptWeight.contemptSum# <BR>
<cfelse>
Contempt: 0.00<BR>
</cfif>
</cfoutput>
<cfoutput query="getDisgustWeight">
<cfif disgustWeight gt 0>
Disgust: #getDisgustWeight.disgustSum# <BR>
<cfelse>
Disgust: 0.00<BR>
</cfif>
</cfoutput>
<cfoutput query="getFearWeight">
<cfif fearWeight gt 0>
Fear: #getFearWeight.fearSum# <BR>
<cfelse>
Fear: 0.00<BR>
</cfif>
</cfoutput>
<cfoutput query="getHappinessWeight">
<cfif happinessWeight gt 0>
Happiness: #getHappinessWeight.happinessSum# <BR>
<cfelse>
Happiness: 0.00<BR>
</cfif>
</cfoutput>
<cfoutput query="getSadnessWeight">
<cfif sadnessWeight gt 0>
Sadness: #getSadnessWeight.sadnessSum# <BR>
<cfelse>
Sadness: 0.00<BR>
</cfif>
</cfoutput>
<cfoutput query="getSurpriseWeight">
<cfif surpriseWeight gt 0>
Surprise: #getSurpriseWeight.surpriseSum# <BR>
<cfelse>
Surprise: 0.00<BR>
</cfif>
</cfoutput>
--->
<CFMAIL to='[email protected]'
from='[email protected]'
Subject='Facial Analysis submission'
server='email.pace.edu'>
Below is the information submitted:
Weighted score for each emotion:
Anger: #getAngerWeight.angerSum#
Contempt: #getContemptWeight.contemptSum#
Disgust: #getDisgustWeight.disgustSum#
Fear: #getFearWeight.fearSum#
Happiness: #getHappinessWeight.happinessSum#
Sadness: #getSadnessWeight.sadnessSum#
Surprise: #getSurpriseWeight.surpriseSum#
</CFMAIL>
<html>
<head>
<style>body {margin: 100px 200px 100px 200px;}</style>
</head>
<body>
<FONT style="FILTER: ; FONT: 14px Arial,Geneva,sans-serif;">Thank you for helping out
with my thesis project. If you have any problems with this form, e-mail <A
HREF="mailto:[email protected]">[email protected]</A>.</FONT>
<BR>
</body>
</html>
References
[1] "Pathognomy." Def. 1. The Oxford English Dictionary. 1st. ed. 1971.
[2] Krumhuber, Eva G. and Antony S. R. Manstead. “Can Duchenne smiles be feigned? New evidence
on felt and false smiles.” Emotion, December 2009
[3] Duchenne de Boulogne. The Mechanism of Human Facial Expression. Cambridge: Cambridge
University Press, 1990. Print.
[4] Darwin, Charles. Charles Darwin Notebooks 1836-1844. Cambridge: Cambridge University Press,
2009. Print.
[5] Bernstein, Douglas. The Essentials of Psychology, Kentucky: Wadsworth Publishing, 2010. Print.
[6] James, William. The Principles of Psychology, Volume 2. Connecticut: Martino Publishing, 2010.
Print.
[7] Finzi, Eric. The Face of Emotion: How Botox Affects Our Moods and Relationships. New York:
Palgrave MacMillan, 2013. Print.
[8] Niedenthal, Paula M. "Embodying Emotion," Science, May 2007
[9] Restak, Richard M. The Modular Brain. New York: Scribner, 1994. Print.
[10] Ekman, Paul and W.V. Friesen. "Constants across cultures in the face and emotion."
Journal of Personality and Social Psychology 17, February 1971
[11] Ekman, Paul and the Dalai Lama. Emotional Awareness: Overcoming the Obstacles to Psychological
Balance and Compassion. New York: Holt Paperbacks, 2009. Print.
[12] Gladwell, Malcolm. “The Naked Face.” The New Yorker, August 5, 2002.
[13] Ekman, Paul and Maureen O'Sullivan. “Lying and deceit – The Wizards Project.” American
Medical Association's 23rd Annual Science Reporters Conference, October 2004
[14] Bartlett, Marian, J.C. Hager, Paul Ekman, and T.J. Sejnowski. “Measuring facial
expressions by computer image analysis,” Psychophysiology, March 1999
[15] Michael, Nicholas, Mark Dilsizian, Dimitris Metaxas, and Judee K. Burgoon.
“Motion Profiles for Deception Detection Using Visual Cues.” ECCV'10
Proceedings of the 11th European conference on Computer vision: Part VI, Berlin:
Springer Berlin Heidelberg, 2010
[16] “How Smiles Control Us All, and Why We’re Terrible At Faking Them,” by Eric
Finzi, The Atlantic, January 30, 2013
[17] “The Social-Cue Reader,” by Jennifer Schuessler, New York Times, December 10,
2006
[18] “Faces, Too, Are Searched at U.S. Airports," by Eric Lipton, New York Times,
August 17, 2006
[19] Matsumoto, David. “Cultural similarities and differences in display rules,”
Motivation and Emotion 14, 1990
[20] Matsumoto, David and Bob Willingham. “Spontaneous Facial Expressions of
Emotion of Congenitally and Noncongenitally Blind Individuals,” Journal of
Personality and Social Psychology, January 2009
[21] Peleg, Gili, Gadi Katzir, Ofer Peleg, Michal Kamara, Leonid Brodsky, Hagit Hel-Or,
Daniel Keren, and Eviatar Nevo “Hereditary family signature of facial expression,”
Proceedings of the National Academy of Sciences of the United States of America,
October 2006
[22] Lanteaume, Laura, Stéphanie Khalfa, Jean Régis, Patrick Marquis, Patrick Chauvel,
and Fabrice Bartolomei, "Emotion Induction After Direct Intracerebral Stimulations
of Human Amygdala," Cerebral Cortex, June 2007
[23] Matsumoto, David and Hyi Sung Hwang. “Reading facial expressions of emotion,”
Psychological Science Agenda, May 2011