
Presentation and validation of the Radboud Faces Database

Oliver Langner, Ron Dotsch, Gijsbert Bijlstra, and Daniel H. J. Wigboldus
Radboud University Nijmegen, Nijmegen, The Netherlands

Skyler T. Hawk
University of Amsterdam, Amsterdam, The Netherlands

Ad van Knippenberg
Radboud University Nijmegen, Nijmegen, The Netherlands

Many research fields concerned with the processing of information contained in human faces would benefit from face stimulus sets in which specific facial characteristics are systematically varied while other important picture characteristics are kept constant. Specifically, a face database in which displayed expressions, gaze direction, and head orientation are parametrically varied in a complete factorial design would be highly useful in many research domains. Furthermore, these stimuli should be standardised in several important technical aspects. The present article presents the freely available Radboud Faces Database, which offers such a stimulus set, containing images of both Caucasian adults and children. This face database is described both procedurally and in terms of content, and a validation study concerning its most important characteristics is presented. In the validation study, all frontal images were rated with respect to the shown facial expression, intensity of expression, clarity of expression, genuineness of expression, attractiveness, and valence. The results show very high recognition of the intended facial expressions.

Keywords: Face database; Validation; Emotion; Gaze direction.

Correspondence should be addressed to: Oliver Langner, Behavioural Science Institute, PO Box 9104, NL-6500 HE Nijmegen, The Netherlands. E-mail: [email protected]

The Radboud Faces Database was financed by the Behavioural Science Institute of the Radboud University Nijmegen. Skyler T. Hawk is currently a postdoctoral researcher at the Research Centre Adolescent Development, Utrecht University. We thank Job van der Schalk for his help as a FACS specialist.

Face processing may well be one of the most complex tasks that human beings accomplish. Faces carry a wealth of social information, including information about identity, emotional and motivational status, lip speech, and focus of attention as indicated by eye gaze, all of which are important for successful communication. Not surprisingly, face images are commonly used as stimulus materials in a wide range of research fields, including (the development of) face processing, emotion research, information processing in phobias and autism, social referencing, interpersonal attraction, persuasive communication, person memory, impression formation, human and computer face recognition, and robotics (e.g., Calder & Young, 2005; Grossmann & Johnson, 2007; Rule, Ambady, Adams, & Macrae, 2008; Schwaninger, Wallraven, Cunningham, & Chiller-Glaus, 2006).

Different fields naturally put varying demands on the specific characteristics of the stimulus materials they use. For example, studies on facial expression processing mainly use straight-gaze images with standardised expressions (e.g., Vuilleumier, Armony, Driver, & Dolan, 2003), whereas research on facial attention cues uses face images with varying gaze directions and head orientations (e.g., Loomis, Kelly, Pusch, Bailenson, & Beall, 2008). Research on the development of face-processing abilities requires similar stimuli based on children's faces (e.g., Guyer et al., 2007). Furthermore, recent research indicates that interactions between different facial-stimulus characteristics yield additional social information. For example, it has been found that gaze direction affects the perception of emotions (Adams & Kleck, 2005), and that attractive faces enhance evaluations of associated products when their gaze is directed towards the participant but not when it is averted (Strick, Holland, & van Knippenberg, 2008). Unfortunately, researchers often rely on different stimulus sets, ad hoc assembled stimuli from the internet, or even faces more or less successfully edited by image software, because the currently available facial databases contain only manipulations of specific stimulus characteristics and typically not of combinations of characteristics. Moreover, the facial stimuli used in research tend to vary substantially in terms of technical features and overall technical quality. As a result, diverging findings from different studies may be attributable to variations in the facial or technical characteristics of the stimulus sets used.

In our view, a face database with parametric variations on a number of important facial characteristics constitutes an important extension of the currently available stimulus materials. Crucial characteristics, with a wide applicability in different fields of research, are facial expression, gaze direction, and head orientation, as well as a reasonable number of male and female models of both adults and children. Furthermore, the stimuli included in this database should be controlled for potentially interfering technical factors like positions of facial landmarks, lighting conditions, and image background. See Table 1 for an overview of often-used databases and their features.

In the present article, we present the freely available Radboud Faces Database (RaFD), a face database containing Caucasian face images that vary along all previously mentioned characteristics and provide adequate control of the indicated technical factors. All models in the dataset show eight facial expressions with three gaze directions, photographed simultaneously from five different camera angles. The photos were taken in a highly controlled environment. All displayed facial expressions were based on prototypes from the Facial Action Coding System (FACS; Ekman, Friesen, & Hager, 2002a). As an important addition, the dataset contains images of both adult and child models, all with the same stimulus characteristics and the same level of technical control.

In the following report, we present validation data for all frontal camera images. For the validation, 276 participants rated all images of nine randomly chosen models from the database. For each image, participants rated the depicted facial expression, and the intensity, clarity, valence, and genuineness of the expression. Further, they rated the attractiveness of the neutral frontal gaze images for all nine models. This enables an assessment of the quality of the dataset. Furthermore, researchers can use these ratings to select images from the dataset with the specific properties they require. Mean ratings for all individual images and rating dimensions are available as online support material.

DEVELOPMENT OF DATABASE

Description of the image set

The database contains portrait images of 49 models in two subsets: 39 Caucasian Dutch adults (19 female) and 10 Caucasian Dutch children (6 female). All models showed eight facial expressions with three gaze directions (see Figure 1a and b).


Expressions were neutral, anger, sadness, fear, disgust, surprise, happiness, and contempt, the expressions most consistently recognised across cultures (Ekman, 1992). Each expression was shown with eyes directed straight ahead, averted to the left, and averted to the right. Photos were taken against a uniform white background from five different camera angles simultaneously, with viewpoints from left to right in steps of 45° (see Figure 1c). This amounts to 120 images per model. Models wore black t-shirts, had no hair on the face, and wore no glasses, makeup, or jewellery. The targeted emotional expressions were based on prototypes defined in the Investigator's Guide for the Facial Action Coding System (Ekman, Friesen, & Hager, 2002b). The action units we targeted, using a variation of the Directed Facial Action Task (e.g., Ekman, 2007), are shown in Figure 2.

The photo shoot took place in early 2008 at Radboud University Nijmegen. Beforehand, models practised all emotional expressions at home for at least one hour, following a detailed training manual. Throughout the photo shoot, two certified FACS specialists coached all models. During the session, each model first practised all expressions with a FACS specialist for 25 minutes. The actual photo shoot took about 45 minutes per model, during which one of the FACS specialists monitored the expressions on a TV screen while giving detailed instructions for each expression.

Table 1. Examples of existing face databases and their features

  JACFEE/JACNeuF. Matsumoto & Ekman (1988). Models: 56. Features: female and male models; Japanese and Caucasian models; seven emotions and neutral expression; all images frontal gaze and 90° camera.

  KDEF. Lundqvist, Flykt, & Öhman (1998). Models: 70. Features: female and male models, all Caucasian; six emotions and neutral expression; frontal gaze; five camera angles.

  MSFDE. Beaupré, Cheung, & Hess (2005). Models: 12. Features: female and male models; French Canadian, Chinese, and sub-Saharan African models; six emotions and neutral expression; all images frontal gaze and 90° camera.

  IAPS. Lang, Bradley, & Cuthbert (1999). Models: *. Features: female and male models; different ethnicities; neutral, sad, and angry images in various contexts.

  ADFES. van der Schalk, Hawk, & Fischer (2009). Models: 20. Features: female and male models; Caucasian and Turkish/Moroccan models; nine emotions, dynamic expressions; two gaze directions.

  Facial Expression Subset. Hawk, van Kleef, Fischer, & van der Schalk (2009). Models: 8. Features: female and male models; nine emotions and neutral expression; dynamic expressions; all frontal gaze and 90° camera.

  NimStim. Tottenham et al. (1998). Models: 45. Features: female and male models; Caucasian, Latin American, African American, and Asian American models; seven emotions and neutral expression; frontal gaze.


Apparatus

We used five Nikon cameras (models D200, D2X, and D300), with resolutions between 10 and 12 megapixels. Three 500 W flashes (Bowens Int., Essex) were used for illumination. All cameras and flashes were connected to a wireless remote control (Pulsar, Bowens Int., Essex), allowing us to take photos on all cameras simultaneously (see Figure 3 for the technical setup).

Image processing

All photos were initially stored in raw format. Photos were converted to TIFF format and corrected for white balance using the free software packages UFRaw and The GIMP. Next, all images were spatially aligned according to facial landmarks using Matlab (The MathWorks Inc., Natick, MA). We used a simplex optimisation algorithm for this purpose, fitting two translational and one rotational alignment parameter. First, we aligned each image of a model to the neutral straight-gaze image of the same model and camera angle. Then, all neutral straight-gaze images of the different models were aligned towards each other. Finally, both within- and between-model alignment parameters were applied to the images, and all aligned images were cropped and resized to 1024 × 681 pixels.
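The authors implemented this step in Matlab; purely as an illustration of the procedure just described, here is a minimal Python sketch that fits the two translational and one rotational parameter to landmark coordinates with SciPy's Nelder-Mead (simplex) optimiser. The landmark values are hypothetical and not taken from the RaFD pipeline.

```python
# Minimal sketch of the rigid alignment described above: two translations
# plus one rotation, fitted to facial landmarks with a simplex optimiser.
# Landmark coordinates below are hypothetical.
import numpy as np
from scipy.optimize import minimize

def rigid_transform(params, points):
    """Rotate (theta) and translate (tx, ty) an Nx2 array of landmarks."""
    tx, ty, theta = params
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return points @ rot.T + np.array([tx, ty])

def alignment_error(params, moving, reference):
    """Sum of squared distances between transformed and reference landmarks."""
    return np.sum((rigid_transform(params, moving) - reference) ** 2)

def fit_alignment(moving, reference):
    """Fit (tx, ty, theta) with the Nelder-Mead simplex algorithm."""
    result = minimize(alignment_error, x0=np.zeros(3),
                      args=(moving, reference), method="Nelder-Mead")
    return result.x

# Hypothetical landmarks (eye centres and mouth) in pixel coordinates:
reference = np.array([[400.0, 300.0], [620.0, 300.0], [510.0, 520.0]])
moving = reference + np.array([12.0, -8.0])  # a shifted copy to align
tx, ty, theta = fit_alignment(moving, reference)
# The fitted transform would then be applied to the full image (e.g., with
# scipy.ndimage.affine_transform) before cropping to 1024 x 681 pixels.
```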

VALIDATION OF DATABASE

Method

Figure 1. Examples from the Radboud Faces Database. (a) Examples of the eight emotional expressions, from top left: sad, neutral, angry, contemptuous, disgusted, surprised, fearful, happy. (b) Examples of the three gaze directions. (c) Examples of the five camera angles, here in the order 180°, 135°, 90°, 45°, 0°. [To view this figure in colour, please visit the online version of the paper.]

Participants and apparatus. A total of 276 students (238 female) from the Radboud University Nijmegen participated in the validation study, with a mean age of 21.2 years (SD = 4.0). All had normal or corrected-to-normal vision and received €10 or course credits for participation. The validation study was programmed using PsychoPy (Peirce, 2007), a Python library for conducting psychological experiments.
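As a rough sketch of what a single rating trial might look like in PsychoPy, the fragment below shows a face image with the current rating dimension above it and a 5-point keyboard response. The file name, scale wording, and key mapping are placeholders, not the authors' actual code.

```python
# Illustrative PsychoPy sketch of one rating trial: an image with the
# current rating dimension above it and a 5-point key response (1-5).
# File names and labels are placeholders.
from psychopy import visual, event, core

win = visual.Window(size=(1024, 768), color="grey", units="pix")

face = visual.ImageStim(win, image="model01_neutral_frontal.jpg")
dimension = visual.TextStim(win, text="Clarity", pos=(0, 300))
scale = visual.TextStim(win, text="1 = unclear ... 5 = clear", pos=(0, -300))

face.draw()
dimension.draw()
scale.draw()
win.flip()

keys = event.waitKeys(keyList=["1", "2", "3", "4", "5"])
rating = int(keys[0])  # stored together with image ID and dimension

win.close()
core.quit()
```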

Procedure. Only the frontal view images (90° camera) were validated. Participants were presented with pictures from only one of the database subsets, either adults or children. For each combination of model, gaze direction, and facial expression, two images were originally present in the validation stimulus set. Of these two, only the one with the higher inter-rater agreement concerning the intended expression was retained for inclusion in the final database.

The validation started with an attractiveness rating. Participants scored the neutral, straight-gaze images of all subset models on a 5-point scale ranging from unattractive to attractive. Image order was randomised. This task familiarised the participants with all models in the subset.

Next, participants rated the images of nine subset models on several dimensions. For each model, participants viewed images with all three gaze directions combined with eight emotional expressions, summing to 216 images per participant. This way, participants saw equal numbers of emotions from each model. Which models were presented was chosen randomly across participants, with the constraint that every image was rated by at least 20 participants.

In each trial, a randomly chosen image from the nine subset models presented to the participant was shown in the centre of the screen. For each image, participants successively judged: (a) the depicted expression; (b) the intensity of the expression; (c) the clarity of the expression; (d) the genuineness of the expression; and (e) the valence of the image, in this order. Before the task, participants received instructions for each rating dimension. For each judgement, the current rating dimension was indicated above the image (e.g., "Clarity") and the corresponding rating scale was displayed below it. The expression rating was forced-choice with nine response categories, i.e., the eight expressions used in the dataset and "other" (Frank & Stennett, 2001). We asked participants to pick the emotion label that best fitted the shown facial expression. The ordering of the expression labels from which participants could choose was counterbalanced across participants, but kept constant within participants. All other dimensions were rated on 5-point scales. We instructed participants to rate the emotional expression of the shown face with regard to its intensity ("weak" to "strong"), clarity ("unclear" to "clear"), and genuineness ("faked" to "genuine"). Finally, we asked participants to judge the overall valence of the image ("negative" to "positive").

Figure 2. Targeted action units (AU) for all emotional expressions. [To view this figure in colour, please visit the online version of the paper.]

Results

Each image was rated at least 20 times on each rating dimension. For the measures of expression, intensity, clarity, genuineness, and valence, separate analyses of variance (ANOVAs) were computed with the factors Subset (children, adults), Gender (male, female), Expression (neutral, anger, sadness, fear, disgust, surprise, happiness, contempt), and Gaze Direction (frontal, left, right). Due to the high number of images, most statistical tests were significant. Therefore, only effects with ηp² ≥ .10 are reported, corresponding to p < .01. Means and SDs of all measures are shown for each subset and expression in the appendix.

Figure 3. Technical setup of the photo shoot.


Indices of inter-rater reliability are reported in Table 2 for all appropriate rating dimensions.¹ Further, online supporting materials containing average validation data for individual images are available at www.rafd.nl.

Attractiveness. Mean attractiveness ratings (SDs) were 2.36 (0.53) and 2.10 (0.58) for the female and male adult models, respectively, and 2.42 (0.40) and 2.44 (0.63) for the female and male child models, respectively.

Expression. For each image, we calculated how many participants chose the targeted emotion. Overall, the agreement between chosen and targeted emotions was 82% (median = 88%, SD = 19%). Average choice rates per targeted emotion are depicted in Figure 4. An ANOVA of the arcsine-transformed agreement rates (Winer, 1971) revealed a significant effect of expression, F(7, 1080) = 168.2, p < .01, ηp² = .52. Post hoc tests showed that agreement was significantly higher for happiness (mean = 98%, SD = 3%), and significantly lower for contempt (mean = 50%, SD = 15%), compared to all other expressions (means between 80% and 90%).

A close look at Figure 4 reveals that, for some expressions, off-diagonal responses were not equally distributed across chosen expressions: Faces with intended surprise were sometimes confused with fear (7%) and, vice versa, intended fear was sometimes confused with surprise (8%); intended disgust was sometimes mistaken for either anger (7%) or contempt (8%); and for intended contempt, participants frequently responded either other (24%) or neutral (12%).

As the above-reported agreement rates do not take response bias into account, we additionally determined and analysed unbiased hit rates (Wagner, 1993). Unbiased hit rates can vary between 0 and 1, where a hit rate of 1 indicates not only that a stimulus category (e.g., happy faces) was always identified correctly (e.g., that happy faces were always categorised as happy), but additionally that the corresponding response (e.g., the response "happy") was always used correctly (e.g., that the response "happy" was only given for happy faces). Lower unbiased hit rates result if either stimuli from a category are not classified correctly (e.g., happy faces categorised as angry) or the corresponding response is also used for stimuli from other categories (e.g., the response "happy" also given to angry or surprised faces). Unbiased hit rates were computed as follows: Per participant and for each gaze direction, we first created a choice matrix with targeted and chosen expressions as rows and columns, respectively. Next, the number of ratings in each cell was squared and divided by the product of the marginal values of the corresponding row and column, yielding the unbiased hit rate. As usual for proportions, these were arcsine transformed prior to analysis (Winer, 1971), although untransformed proportions are reported for readability.

Table 2. Intraclass correlations for all rating dimensions (but the emotion rating)

Dimension         ICC(1, 1)   ICC(1, k)
Dutch adults
  Attractiveness    .31         .99
  Intensity         .20         .83
  Clarity           .19         .83
  Genuineness       .13         .75
  Valence           .44         .94
Dutch children
  Attractiveness    .24         .94
  Intensity         .26         .88
  Clarity           .22         .85
  Genuineness       .09         .67
  Valence           .48         .95

¹ Here we report the intraclass correlations ICC(1, 1) and ICC(1, k) as reliability indices (Shrout & Fleiss, 1979). Because our participants did not rate the whole image set but only parts of it, we could not calculate the usually higher indices ICC(2, 1) and ICC(2, k), which partial between-rater variance out. Our reliability indices are therefore probably lower than the true reliability.
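For readers who wish to compute such indices themselves, here is a minimal sketch of the one-way formulas from Shrout and Fleiss (1979). It assumes a complete targets-by-raters matrix, which, as the footnote explains, the actual validation data were not; the demo ratings are fabricated.

```python
# Sketch: one-way random-effects intraclass correlations ICC(1,1) and
# ICC(1,k) per Shrout & Fleiss (1979), assuming a complete n-targets x
# k-raters matrix (the actual RaFD ratings were incomplete).
import numpy as np

def icc_oneway(ratings):
    """ratings: (n_targets, k_raters) array of scores."""
    n, k = ratings.shape
    target_means = ratings.mean(axis=1)
    grand_mean = ratings.mean()
    # Between-targets and within-targets mean squares from a one-way ANOVA.
    ms_between = k * np.sum((target_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((ratings - target_means[:, None]) ** 2) / (n * (k - 1))
    icc_single = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc_average = (ms_between - ms_within) / ms_between
    return icc_single, icc_average

# Fabricated example: 5 images rated by 4 raters on a 5-point scale.
rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(5, 4)).astype(float)
print(icc_oneway(demo))  # (ICC(1,1), ICC(1,k))
```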


A repeated-measures ANOVA, with Subset as a between-subjects factor and Model Gender, Gaze, and Expression as within-subjects factors, yielded significant effects of Model Gender, F(1, 274) = 52.2, p < .01, ηp² = .16, and of Expression, F(7, 1918) = 238.9, p < .01, ηp² = .47. Unbiased hit rates were higher for female than for male models, with means (SDs) of 73% (27%) and 69% (27%), respectively. Similar to the agreement analysis, post hoc tests revealed significantly higher hit rates for happiness (mean = 79%, SD = 34%) and lower hit rates for contempt (mean = 29%, SD = 31%) than for all other expressions (means between 56% and 65%).
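To make the computation concrete, the following minimal sketch implements Wagner's (1993) unbiased hit rate and the arcsine transform for a single participant's choice matrix. The counts are fabricated, and the sketch uses a square matrix for simplicity, although the actual task also included a ninth "other" response column.

```python
# Sketch of Wagner's (1993) unbiased hit rate: square each diagonal cell of
# the choice matrix and divide by the product of its row and column totals.
# Rows = targeted emotions, columns = chosen emotions; counts are fabricated.
import numpy as np

def unbiased_hit_rates(choices):
    """choices: square count matrix for one participant and gaze direction."""
    row_totals = choices.sum(axis=1).astype(float)  # presentations per emotion
    col_totals = choices.sum(axis=0).astype(float)  # uses of each response
    diag = np.diag(choices).astype(float)           # correct identifications
    return diag ** 2 / (row_totals * col_totals)

def arcsine_transform(p):
    """Variance-stabilising transform for proportions (Winer, 1971)."""
    return np.arcsin(np.sqrt(p))

# Toy 3-emotion example: happy always correct, fear/surprise partly confused.
counts = np.array([[9, 0, 0],   # targeted happy
                   [0, 6, 3],   # targeted fear
                   [0, 2, 7]])  # targeted surprise
hu = unbiased_hit_rates(counts)
print(hu)                     # [1.0, 0.5, 0.544...]
print(arcsine_transform(hu))  # values entered into the ANOVA
```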

Other validation measures. For each image, we calculated the mean judgements for clarity, intensity, genuineness, and valence. For all measures, ANOVAs showed similar significant expression effects, F(6, 945) ≥ 101.8, p ≤ .01, ηp² ≥ .39. The pattern of results is described for the different validation measures more closely below.

The means of the ratings on the four judgemental dimensions listed above are displayed in the appendix. As can be seen there, the patterns of means of the judgements of the different expressions are remarkably similar across the adult and child datasets. Note that for both adults and children, the ratings of intensity and clarity were highly correlated (r = .75), suggesting that the more intense an expression was, the clearer it appeared to be. For instance, happiness, fear, surprise, and disgust are all relatively intense and clear, while contempt in particular scores low on both measures. Whether this reflects something about the inherent qualities of these expressions, or about the way in which our models expressed them, cannot be assessed on the basis of the present data.

Figure 4. Percentage of chosen emotions per intended emotional expression. [To view this figure in colour, please visit the online version of the paper.]


The dimension of genuineness appears to be relatively independent of both intensity and clarity, with correlations of .10 and .24, respectively. In both the adult and the child set, neutral faces (M = 3.9, SD = 0.3, and M = 3.7, SD = 0.3, respectively) and happy faces (M = 3.8, SD = 0.5, and M = 3.5, SD = 0.4, respectively) were scored as fairly genuine, while all other expressions scored on average around the nominal midpoint of the scale (i.e., around 3). Finally, the valence of virtually all expressions was rated as expected, with happiness as the only clearly positive expression (M = 4.2, SD = 0.3 in both sets). Neutral turned out to be truly neutral (adult set: M = 3.0, SD = 0.3; child set: M = 3.0, SD = 0.2), while surprise was rated fairly close to neutral (M = 2.8, SD = 0.2 in both sets). All other emotions except contempt were clearly negative (1.9 ≤ M ≤ 2.2), while contempt was rated slightly less negative than the other negative emotions (adult set: M = 2.5, SD = 0.2; child set: M = 2.4, SD = 0.3).

DISCUSSION

In this paper we have introduced the Radboud Faces Database, a new database of Caucasian faces with parametric variations of the stimulus characteristics facial expression, gaze direction, and head orientation. The RaFD contains images of both adult and child models with the same characteristics and control over technical aspects. We have further reported validation data for all frontal view images.

The overall 82% agreement rate found between intended and chosen expressions was high, lying about 11% above that reported in a recent validation study of the Karolinska Directed Emotional Faces database (KDEF; Goeleven, De Raedt, Leyman, & Verschuere, 2008; Lundqvist, Flykt, & Öhman, 1998). The higher median of 88% indicates a skewed distribution of agreement values, with more than 76% of images reaching an agreement of at least 80%. This underscores the effectiveness of the coaching method employed in eliciting reliable and prototypical facial expressions.

Notably, expression agreement was substantially lower for contempt than for all other expressions. Contempt was also the only expression for which participants frequently chose the response option "other". Although errors during the coaching process for contempt are a possible explanation for this, studies have shown that contempt is a less universally recognised expression across cultures (Ekman & Friesen, 1986; Rosenberg & Ekman, 1995), and that it is subject to dialect-like variations in muscle activations (Elfenbein, Beaupré, Lévesque, & Hess, 2007). Further, Matsumoto and Ekman (2004) found that lower agreement for the expression of contempt reflects problems with the expression label rather than with the expression itself. Taken together, this suggests that low agreement on contempt may be a general feature of the emotion, and not of the presented database.

For surprise, fear, and disgust we found systematic patterns of deviating choices. For all three expressions, the most frequently chosen alternative expressions contained morphological overlaps with the intended emotion. These patterns corresponded to those found by Goeleven et al. (2008).

The RaFD is a new tool for research using face stimuli. It provides a parametric set of face images varied along important facial characteristics, namely expression, gaze direction, and head orientation. Clearly, there are other facial characteristics important for face processing, such as facial symmetry, masculinity/femininity, distinctiveness, or babyfaceness. Although these have not been systematically varied in the current database, future studies should gather data regarding these characteristics to broaden the applicability of the RaFD even further. It should be noted that there is always a trade-off between experimental control and ecological validity of stimuli, and researchers need to be aware that probably not all factorial combinations of facial characteristics appear equally often in natural environments.

The RaFD further contains images of both adults and children, making it one of the first databases that also provides high-quality stimuli for use in developmental research. Importantly, the RaFD is freely available for use in scientific research.


More information about the image set and the application procedure can be found at www.rafd.nl.

Manuscript received 3 June 2009

Revised manuscript received 26 October 2009

Manuscript accepted 3 November 2009

First published online 6 April 2010

REFERENCES

Adams, R. B., & Kleck, R. E. (2005). Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion, 5(1), 3–11.

Beaupré, M. G., Cheung, N., & Hess, U. (2000). The Montreal set of facial displays of emotion [Slides]. (Available from Ursula Hess, Department of Psychology, University of Quebec at Montreal, Montreal, Quebec, Canada.)

Calder, A. J., & Young, A. W. (2005). Understanding the recognition of facial identity and facial expression. Nature Reviews Neuroscience, 6(8), 641–651.

Ekman, P. (1992). Are there basic emotions? Psychological Review, 99(3), 550–553.

Ekman, P. (2007). The directed facial action task. In Handbook of emotion elicitation and assessment (pp. 47–53). Oxford, UK: Oxford University Press.

Ekman, P., & Friesen, W. V. (1986). A new pan-cultural facial expression of emotion. Motivation and Emotion, 10(2), 159–168.

Ekman, P., Friesen, W. V., & Hager, J. C. (2002a). Facial Action Coding System: The manual. Salt Lake City, UT: Research Nexus.

Ekman, P., Friesen, W. V., & Hager, J. C. (2002b). Facial Action Coding System: Investigator's guide. Salt Lake City, UT: Research Nexus.

Elfenbein, H. A., Beaupré, M., Lévesque, M., & Hess, U. (2007). Toward a dialect theory: Cultural differences in the expression and recognition of posed facial expressions. Emotion, 7(1), 131–146.

Frank, M. G., & Stennett, J. (2001). The forced-choice paradigm and the perception of facial expressions of emotion. Journal of Personality and Social Psychology, 80(1), 75–85.

Goeleven, E., De Raedt, R., Leyman, L., & Verschuere, B. (2008). The Karolinska Directed Emotional Faces: A validation study. Cognition and Emotion, 22(6), 1094–1118.

Grossmann, T., & Johnson, M. H. (2007). The development of the social brain in human infancy. The European Journal of Neuroscience, 25(4), 909–919.

Guyer, A. E., McClure, E. B., Adler, A. D., Brotman, M. A., Rich, B. A., Kimes, A. S., et al. (2007). Specificity of facial expression labeling deficits in childhood psychopathology. Journal of Child Psychology and Psychiatry, and Allied Disciplines, 48(9), 863–871.

Hawk, S. T., van Kleef, G. A., Fischer, A. H., & van der Schalk, J. (2009). "Worth a thousand words": Absolute and relative decoding of nonlinguistic affect vocalizations. Emotion, 9(3), 293–305.

Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (1999). International Affective Picture System (IAPS): Technical manual and affective ratings. Gainesville, FL: The Center for Research in Psychophysiology, University of Florida.

Loomis, J. M., Kelly, J. W., Pusch, M., Bailenson, J. N., & Beall, A. C. (2008). Psychophysics of perceiving eye-gaze and head direction with peripheral vision: Implications for the dynamics of eye-gaze behavior. Perception, 37(9), 1443–1457.

Lundqvist, D., Flykt, A., & Öhman, A. (1998). The Karolinska Directed Emotional Faces (KDEF). Department of Clinical Neuroscience, Psychology Section, Karolinska Institutet.

Matsumoto, D., & Ekman, P. (1988). Japanese and Caucasian Facial Expressions of Emotion (JACFEE) [Slides]. San Francisco, CA: Intercultural and Emotion Research Laboratory, Department of Psychology, San Francisco State University.

Matsumoto, D., & Ekman, P. (2004). The relationship among expressions, labels, and descriptions of contempt. Journal of Personality and Social Psychology, 87(4), 529–540.

Peirce, J. W. (2007). PsychoPy: Psychophysics software in Python. Journal of Neuroscience Methods, 162(1–2), 8–13.

Rosenberg, E., & Ekman, P. (1995). Conceptual and methodological issues in the judgment of facial expressions of emotion. Motivation and Emotion, 19(2), 111–138.

Rule, N. O., Ambady, N., Adams, R. B., Jr., & Macrae, C. N. (2008). Accuracy and awareness in the perception and categorization of male sexual orientation. Journal of Personality and Social Psychology, 95(5), 1019–1028.

Schwaninger, A., Wallraven, C., Cunningham, D. W., & Chiller-Glaus, S. D. (2006). Processing of facial identity and expression: A psychophysical, physiological, and computational perspective. Progress in Brain Research, 156, 321–343.

Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 86(2), 420–428.

Strick, M., Holland, R. W., & van Knippenberg, A. (2008). Seductive eyes: Attractiveness and direct gaze increase desire for associated objects. Cognition, 106(3), 1487–1496.

Tottenham, N. (1998). MacBrain Face Stimulus Set. Chicago, IL: John D. and Catherine T. MacArthur Foundation Research Network on Early Experience and Brain Development.

Van der Schalk, J., Hawk, S. T., & Fischer, A. H. (2009). Validation of the Amsterdam Dynamic Facial Expression Set (ADFES). Poster presented at the International Society for Research on Emotions (ISRE), Leuven, Belgium.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2003). Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nature Neuroscience, 6(6), 624–631.

Wagner, H. (1993). On measuring performance in category judgment studies of nonverbal behavior. Journal of Nonverbal Behavior, 17(1), 3–28.

Winer, B. (1971). Statistical principles in experimental design (2nd ed.). New York: McGraw-Hill.


APPENDIX

Averages (SDs) of agreement, unbiased hit rates, and ratings of intensity, clarity, genuineness, and valence per expression and subset separately

Measure          Anger      Contempt   Disgust    Fear       Happiness  Neutral    Sadness    Surprise

Adults
  Agreement      81 (19)    48 (12)    79 (10)    88 (7)     98 (3)     84 (13)    85 (16)    90 (9)
  Unb. hit rate  64 (26)    –          66 (32)    73 (24)    89 (21)    67 (27)    66 (27)    75 (25)
  Intensity      3.5 (0.4)  2.9 (0.3)  3.9 (0.3)  4.1 (0.3)  4.1 (0.4)  3.4 (0.3)  3.4 (0.4)  3.9 (0.3)
  Clarity        3.7 (0.4)  2.9 (0.3)  3.9 (0.3)  4.0 (0.3)  4.4 (0.2)  3.6 (0.3)  3.7 (0.4)  4.0 (0.3)
  Genuineness    2.9 (0.3)  3.2 (0.2)  3.1 (0.3)  3.0 (0.3)  3.8 (0.5)  3.9 (0.3)  2.9 (0.3)  3.0 (0.3)
  Valence        2.0 (0.2)  2.5 (0.2)  2.0 (0.2)  2.1 (0.2)  4.2 (0.3)  3.0 (0.3)  2.1 (0.2)  2.8 (0.2)

Children
  Agreement      89 (14)    59 (19)    83 (18)    79 (13)    97 (6)     84 (16)    75 (25)    91 (8)
  Unb. hit rate  61 (23)    –          64 (26)    55 (25)    91 (14)    69 (25)    63 (25)    71 (22)
  Intensity      3.8 (0.5)  3.1 (0.3)  4.1 (0.3)  4.2 (0.4)  3.9 (0.5)  3.2 (0.3)  3.5 (0.4)  4.2 (0.2)
  Clarity        3.9 (0.5)  3.1 (0.4)  3.9 (0.4)  4.0 (0.4)  4.2 (0.4)  3.4 (0.4)  3.6 (0.4)  4.1 (0.2)
  Genuineness    3.0 (0.3)  3.1 (0.3)  3.2 (0.3)  3.2 (0.3)  3.5 (0.4)  3.7 (0.3)  3.0 (0.4)  3.2 (0.3)
  Valence        1.9 (0.2)  2.4 (0.3)  1.9 (0.2)  2.2 (0.2)  4.2 (0.3)  3.0 (0.2)  2.0 (0.2)  2.8 (0.2)
