RESEARCH ARTICLE

The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

Anjuli L. A. Barber*, Dania Randi, Corsin A. Müller, Ludwig Huber

Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine Vienna, Medical University of Vienna and University of Vienna, Vienna, Austria

* [email protected]

OPEN ACCESS

Citation: Barber ALA, Randi D, Müller CA, Huber L (2016) The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects. PLoS ONE 11(4): e0152393. doi:10.1371/journal.pone.0152393

Editor: Kun Guo, University of Lincoln, UNITED KINGDOM

Received: December 15, 2015

Accepted: March 14, 2016

Abstract

Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments and with varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, thus indicating an influence of the amount of exposure to humans. In addition, there was some evidence for the influence of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces.
Published: April 13, 2016

Copyright: © 2016 Barber et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability Statement: All relevant data are within the paper and its Supporting Information files.

Funding: The research presented here was funded by the Vienna Science and Technology Funds (WWTF; No. CS11-005; http://www.wwtf.at/) to LH. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing Interests: The authors have declared that no competing interests exist.

Introduction

An important perceptual-cognitive challenge for almost all animals is to find a compromise between 'lumping and splitting' when confronted with the vast amount of information arriving at their senses [1]. Animals need to categorize, i.e., to treat an object as belonging to a general class; besides categorization of objects, they further need to recognize, i.e., to treat different images as depicting the same object, such as a specific face, despite changes in the viewing conditions. The brain of many animals solves both tasks in a natural, effortless manner and with an efficiency that is difficult to reproduce in computational models and artificial systems. Still, how this is accomplished is far from being fully understood.

An enormously rich source of information, and an important category of visual stimuli for animals in all major vertebrate taxa, is the face [2]. In many species faces convey, among other
things, information about direction of attention, age, gender, attractiveness and the current emotion of the individual (for reviews see [3–5]). Almost 50 years ago, psychologists demonstrated that facial expressions are reliably associated with certain emotional states [6,7]. This in turn can lead to behavioral decisions of a receiver concerning, e.g., cooperation, competition, consolidation or formation of relationships. Some primates, for instance, seem to have evolved special abilities for reading faces due to their complex social lives (e.g. [8–10]).

While the categorization, interpretation and identification of faces of conspecifics is likely a consequence of social life, which may have resulted in neural specialization for faces [11], it is a further challenge to achieve at least some of these abilities with faces of heterospecifics. The configuration of the face, the underlying facial muscles and the resulting expressions are more or less different from one's own, depending on how taxonomically distant the other species is [12,13]. Nevertheless, a variety of animals have been shown to be able to identify and categorize faces and also emotions of heterospecifics (e.g. macaques [14], sheep [15], horses [16]). Several investigations of con- and heterospecific face processing suggest learned aspects in the information transfer and speed (cf. face expertise, [17–22]). Hereby, individuals are able to recognize and discriminate best between faces similar to those that are most often seen in their environment. For instance, the influence of individual experience with the other species was shown for urban-living birds like magpies [23] and pigeons [24]. Especially early exposure to faces facilitates the ability of face discrimination [25,26]. Although some individuals are able to learn about heterospecific faces also later in life (e.g. chimpanzees [27], rhesus macaques [28]), they will not reach the same high level of competence [12,21]. It remains an open question, however, whether the improved face-reading abilities due to early-life exposure are caused solely by an acquired early sensitivity for faces (innate mechanisms) or simply by the larger
amount of experience (learned responses).

An ideal model to study such questions about experience with heterospecific faces is the domestic dog. Pet dogs live in an enduring, intimate relationship with humans, often from puppy age on. As reviewed in several recent books [29–32], dogs have developed specific socio-cognitive capacities to communicate and form relationships with humans. Most likely caused by a mixture of phylogenetic (domestication) and ontogenetic (experience) factors, dogs living in a human household show high levels of attentiveness towards human behavior (reviewed in [33]), follow human gestures like no other animal (e.g. [34]), exhibit a high sensitivity to human ostensive signals, like eye contact, name calling and specific intonation [35], and show an increased readiness to look at the human face [36]. In comparison to hand-reared wolf puppies, the latter is already present in dog puppies ([37]; see also [38]). By monitoring human faces, dogs seem to obtain a continuous stream of social information, ranging from communicative gestures to emotional and attentive states. Even if this does not mean that dogs are readers of our minds but only exquisite readers of our behavior [39], they evaluate humans on the basis of direct experiences [40] and are sensitive to what humans can see (a form of perspective-taking; reviewed by Bräuer [41]).
A special sensitivity for faces in dogs is further supported by two recent findings. An fMRI study in awake dogs has revealed that a region in the temporal cortex is face selective (it responds to faces of humans and dogs but not to everyday objects, [42]). Dogs also have been found to show a gaze bias towards faces [43,44]. Still, there is mixed evidence for the lateralization towards conspecifics and heterospecifics. Dogs showed a human-like left gaze bias toward human faces but not toward monkey or dog faces [44]. This left gaze bias was exhibited only toward negative and neutral expressions, however, but not toward positive expressions of human faces [43]. It has been proposed that the right hemisphere is responsible for the expression of emotions and that the assessment of emotions therefore leads to a left gaze bias towards such stimuli (right hemisphere model, [45,46]). In contrast to the right hemisphere model, the valence model [47,48] suggests that the left hemisphere mainly processes positive emotions while the right hemisphere mainly processes negative emotions. So far, a variety of species (e.g. humans [49,50], apes [51], monkeys [52], sheep [53], dolphins [54], dogs [43,55–57], etc.) have been shown to exhibit lateralization towards emotive stimuli (reviewed by Adolphs [58] and Salva [59]).
The dog's following of the human gaze and, more generally, the visual scanning of faces require the application of advanced technical tools. The most commonly used equipment in human psychophysical laboratories is the eye-tracker [60,61]. Besides the gaze-following study by Téglás and colleagues [62], two studies have so far used this technology for the investigation of the looking behavior of dogs. In a kind of proof-of-concept study, Somppi and colleagues [63] showed that dogs look with a higher frequency at the informative parts of a picture and that they direct more attention to conspecifics and humans compared to objects. In a follow-up study addressing the looking at faces, the researchers could demonstrate that dogs pay more attention to the eye region compared to the rest of the face [64]. The finding that the mean fixations were longer for conspecific compared to heterospecific (human) faces suggests that dogs process faces in a species-specific manner. The use of the eye-tracker revealed subtle differences in looking patterns between dogs living in different human environments: pet dogs and kennel dogs. Still, both types of dogs showed a preference for their own species, familiar pictures and the eye region. Overall, these findings suggest that the living conditions of dogs affect the way they look at us. However, it is not clear whether this is due to the limited time in close relationship with humans or to the development of human face processing during early life.
This eye-tracking study addresses this question by comparing the looking at human faces between pet and lab dogs. For both types of dogs the human environment during their whole life is known, especially the difference in the amount of exposure to humans and the quality of the human-dog relationship. A further way to investigate the role of individual experience with human faces is to compare the dogs' looking at familiar and unfamiliar faces. It is already known that dogs are able to discriminate between familiar and unfamiliar faces, either implicitly by exhibiting looking preferences [65,66] or by making the discrimination explicit in a two-choice task [67]. However, it is not yet known if dogs show preferences and respond to differences between familiar and unfamiliar faces only if forced to do so, or if dogs spontaneously scan familiar faces and unfamiliar faces differently.

A further interesting but not yet investigated influence on the dog's looking pattern at human faces is the facial expression resulting from the human's emotion. Recently we could show (confirming and extending the findings of Nagasawa and colleagues [68]) that dogs can learn to distinguish between two facial expressions of the same unfamiliar human persons and to generalize this ability to novel faces [69]. Importantly, our study suggested this ability is not dependent on the exploitation of simple features like the visibility of teeth but rather on the memory of what different human facial expressions look like. These memories are likely formed in everyday interactions with humans, especially with the dogs' human partner(s).
Therefore, in addition to possible effects of experience (pet dogs vs. lab dogs, familiar faces vs. unfamiliar faces), this study focusses on the question of how dogs process emotional faces (angry, happy, sad, and neutral). In line with a previous eye-tracking study [64], we expected that lab dogs, due to their limited experience with humans, would show longer latencies, shorter fixation durations and a tendency to fixate more on familiar faces compared to pet dogs. Further, we hypothesized that there is a difference between the processing of a familiar and an unfamiliar face, as the emotional expressions of a familiar face should be processed faster due to previous experience with the face. Based on the findings of our recent study [69], we expected that the four emotional expressions are processed differently and elicit a different amount of attention (i.e. differing numbers of fixations and fixation durations). Additionally, in this previous study dogs required more training to discriminate happy and angry faces if they had to touch the angry face than if they had to touch the happy face, indicating that they avoided approaching the angry faces. Therefore, we expected that they would fixate faces expressing negative emotions for shorter durations and less often. Concerning the scanning pattern, we expected that the more informative parts of the face (eye and mouth region) are fixated more frequently. Finally, we expected the dogs to show a gaze bias towards the faces.

Methods

All experimental procedures were approved in accordance with GSP guidelines and national legislation by the Ethical Committee for the use of animals in experiments at the University of Veterinary Medicine Vienna (Ref: 09/08/97/2012). All dog owners gave written consent to participate in the study. The individual in the figures of this manuscript has given written informed consent (as outlined in the PLOS consent form) to publish these case details.
Subjects

Nineteen privately owned pet dogs and eight laboratory dogs participated in the study (see Table 1). The pet dogs were of various breeds and their age ranged between 1–12 years. Owners of the pet dogs spent on average 27 h/week actively with their dogs (walking, playing and training). With the exception of two dogs, all pet dogs were active at least once or twice a week in dog sports activities like agility, dog dance, assistance dog training, man-trailing etc. The two remaining dogs had only training as a "Begleithund" (companion dog) and obedience training on a daily basis by their owners. Their owners gave written consent to participate in the study. The laboratory dogs were all beagles, aged 1–5 years, and were housed on the campus of the University of Veterinary Medicine Vienna in packs of either 13 (Clinical Unit of Internal Medicine Small Mammals (CU-IM), L1–L6 in Table 1) or 4 dogs (Clinical Unit of Obstetrics, Gynecology and Andrology (CU-OGA), L7 and L8 in Table 1). None of the laboratory dogs had ever done any activity in dog sports or ever participated in professional obedience training. They were born at the Clinical Unit of Obstetrics, Gynecology and Andrology at the University of Veterinary Medicine Vienna. After weaning from the mother at the age of two months, the dogs lived in packs. The housing of both groups consisted of an indoor and an outdoor enclosure (CU-IM: indoor enclosure = 64 m², outdoor enclosure = 248 m²; CU-OGA: indoor enclosure = 18 m², 3 outdoor enclosures = ~70 m²), in which they could move freely during the day. From 22:00 to 6:00 the dogs were kept in the indoor enclosures. The enclosures were enriched with dog toys (chewing toys, squeaking toys or balls). Their contact with humans was limited to the animal keepers' daily feeding (once a day in the morning, water ad libitum) and cleaning of the enclosures. Additionally, they participated in practical courses for the training of students of veterinary medicine (approx. 1–2 times a semester). However, none of the dogs was socialized in a classical way, as they lived only with their conspecifics, but not with humans.
Experimental setup

All experiments were conducted at the Clever Dog Lab of the University of Veterinary Medicine, Vienna. The experimental room was divided by a wall, consisting of a projection screen (200 x 200 cm) and two doors, into two compartments: a small one (149 x 356 cm) housing the computer system operating the eye tracker and a video projector (NEC M300XS, NEC Display Solutions, United States), and a larger one (588 x 356 cm) with the chin rest device and the eye tracker (see Fig 1). The stimuli were back-projected onto the projection screen (size of projection area 110 x 80 cm). We used the eye-tracking system EyeLink 1000 (SR Research, Ontario, Canada) to record monocular data from the subjects. This system is ideal for dog research because it can be used with a chin rest or without any head support, and with a remote camera if head fixation is not desirable but high accuracy and resolution are still important. We used a customized chin-rest device for head stabilization. A pillow with a v-shaped depression was mounted on a frame to allow vertical adjustment of the chin rest to the height of the individual dog. The frame consisted of aluminium profiles (MayTec Aluminium Systemtechnik GmbH, Germany) that allowed the easily adjustable but stable fixation of additional equipment (e.g. cameras). The chin rest was positioned at a distance of 200 cm from the projection screen. The eye-tracking camera with the infrared illuminator was mounted on an extension of the chin-rest frame (see Fig 1). This apparatus was aligned horizontally with the chin rest at a distance of 50 cm. Light conditions in the room were kept constant at 75 lux using LED light bulbs (9.5 W, 2700 K, Philips GmbH Market DACH, Germany).

Fig 1. Experimental setup. (a) Lateral view of chin rest device, projection screen and feeding machines seen from the rear of the room during training with geometrical figures and (b) front view of the chin rest device including the eye-tracking camera with IR illuminator.
doi:10.1371/journal.pone.0152393.g001

Table 1. List of participating dogs.

Dog ID (a)  Breed                            Age (b)  Sex (c)  Status    Valid Sessions  Valid Trials
P1          Australian Kelpie                1        m        neutered  4               59
P2          Australian Shepherd              1        m        intact    -               -
P3          Australian Shepherd              2        f        neutered  2               14
P4          Australian Shepherd              3        m        intact    5               56
P5          Bernese Mountain Dog             2        f        neutered  2               27
P6          Berger Blanc Suisse              5        m        intact    2               15
P7          Border Collie                    1        m        intact    5               76
P8          Border Collie                    6        m        neutered  4               60
P9          Border Collie                    8        m        neutered  5               57
P10         German Shepherd                  6        f        neutered  4               68
P11         Hungarian Vizsla                 3        m        neutered  5               60
P12         Irish Wolfhound                  1        f        intact    3               31
P13         Mongrel (American Bulldog mix)   11       m        neutered  3               37
P14         Mongrel (Beauceron-Husky mix)    3        m        neutered  4               61
P15         Mongrel (Bracke-Shepherd mix)    8        m        neutered  4               42
P16         Mongrel (Mudi-Schnauzer mix)     2        f        neutered  5               65
P17         Mongrel (Spitz-Peke mix)         11       m        neutered  -               -
P18         Standard Poodle                  6        m        intact    3               51
P19         Puli                             1        f        intact    5               68
L1          Beagle                           3        m        neutered  4               62
L2          Beagle                           4        m        neutered  5               70
L3          Beagle                           1        m        neutered  5               75
L4          Beagle                           4        m        neutered  5               57
L5          Beagle                           1        m        neutered  5               69
L6          Beagle                           3        m        neutered  5               64
L7          Beagle                           1        m        intact    5               69
L8          Beagle                           1        m        intact    4               51

(a) P = pet dog, L = lab dog. (b) Age in years. (c) m = male, f = female.
doi:10.1371/journal.pone.0152393.t001
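The viewing geometry above (chin rest 200 cm from the projection screen) determines the visual angles quoted for stimuli and attention triggers in the following sections. A minimal sketch of the standard size-to-angle conversion; the function name and the centimeter units are our own illustration, not from the paper:

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle (in degrees) subtended by an object of width `size_cm`
    viewed from `distance_cm` away."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

# Chin rest at 200 cm from the screen; picture width 90 cm, attention trigger 5 cm.
print(round(visual_angle_deg(90, 200), 2))  # 25.36, the reported picture width in degrees
print(round(visual_angle_deg(5, 200), 2))   # 1.43, the reported trigger size in degrees
```

The same conversion reproduces the other reported values, e.g. approx. 11.42° for the 40 cm head width.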
Stimuli

The stimulus set for each subject consisted of photographs of the face of the dog's owner (for the lab dogs, the dog trainer who trained them for the experimental tests) and of an unfamiliar person expressing four different emotions: neutral, happy, angry and sad (Fig 2). All owners and all unfamiliar humans were female. The photographs were taken by experimenter 1 (E1) in a standardized setup at the Clever Dog Lab in front of a white background. For each dog, the faces of the owner served as familiar stimuli and the faces of another dog's owner participating in the study were used as unfamiliar stimuli. For the unfamiliar stimuli, a person was chosen who did not differ strongly from the owner in appearance (e.g. hair color). The stimuli were projected onto the screen with 1024 x 768 px resolution, resulting in 900 x 850 mm sized pictures, which corresponds to 25.36° x 23.99° of visual angle (head: approx. 400 x 300 mm, corresponding to 11.42° x 8.58° of visual angle).

The stimuli were validated afterwards concerning their emotional expression and valence by 67 neutral observers via an online questionnaire. All participants were recruited via the internet (17 male, 50 female, age range 16–55 years). For every picture, the participants were asked to rate the emotions angry, happy, neutral and sad on a scale from 0–100% (e.g. picture xy = 0% angry, 0% happy, 30% neutral, 70% sad; the four numbers had to add up to 100%). For every picture, the ratings for each emotion were pooled across observers and it was decided whether the picture was identified correctly. The emotion counted as correctly identified when it was rated as the predominantly shown emotion and at least 15 percentage points higher than the second-highest rated emotion. For example, if an angry picture was rated 62% angry, 0% happy, 23% neutral and 13% sad, it was considered as correctly identified; in contrast, if the same picture was rated 52% angry, 0% happy, 38% neutral and 10% sad, it would not have been considered as correctly identified. Validation revealed that 74% of the pictures were identified correctly for the angry condition, 95% for happy, 100% for neutral and 60% for sad. The misidentified negative-emotion pictures were usually mistaken for a neutral expression (angry in 80% of the cases, sad in 88% of the cases). Incorrectly rated pictures were left in the analysis, as we wanted the dog owners to express the emotions as naturally as possible; this resulted in expressions which could be ambiguous for uninformed observers.

Fig 2. Examples of the experimental stimuli.
doi:10.1371/journal.pone.0152393.g002

Procedure

Training. Prior to the experiment, the dogs were trained for the experimental procedure (for 2–6 months). Training was done by E1 and E2. We used operant conditioning with positive reinforcement (food rewards) and a clicker as a secondary reinforcer. The dogs were trained once or twice a week, depending on their availability. The owner of the dog was allowed to stay in the room during the whole procedure. She was positioned on a chair at the back of the room and instructed to avoid interaction with the dog.

The training consisted of three steps. In the first step the dog was trained to go to the chin rest device and to rest the head on the chin rest for at least 10 sec (see Fig 1a). After the dog had learned to stay in the chin rest, it was confronted with a two-choice conditional discrimination task with geometrical figures (GFs). With this task we aimed to increase the dog's attention to the screen (up to 20 sec) and to make the training more interactive. Furthermore, this step was included to prepare the dog for the calibration procedure, during which the dog had to follow the position of 3 dots on the screen (see below). In the last training step, pictures of landscapes, animals, architecture and humans (but not of human faces expressing emotions) were interspersed between the GFs to habituate the dog to the presentation of pictures and to further increase the duration the dog could keep its head motionless on the chin rest. These pictures varied in size between 50 x 50 and 100 x 100 cm. Additionally, we introduced the presentation of short animated videos with sound (interspersed within the blocks of pictures), which were used as attention triggers during the test phase. Training was considered successful if the dog, in 4 out of 5 trials, stayed with its head motionless on the chin rest and oriented towards the screen for at least 30 seconds while the stimuli were presented. Orientation was determined by
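The 15-percentage-point identification criterion can be expressed as a short check. This is an illustration only; the function name and the dictionary layout for the pooled ratings are hypothetical:

```python
def correctly_identified(intended: str, ratings: dict) -> bool:
    """True if the intended emotion is the predominant rating AND exceeds
    the second-highest rating by at least 15 percentage points."""
    ranked = sorted(ratings.values(), reverse=True)
    return ratings[intended] == ranked[0] and ranked[0] - ranked[1] >= 15

# The two worked examples from the text:
print(correctly_identified("angry", {"angry": 62, "happy": 0, "neutral": 23, "sad": 13}))  # True
print(correctly_identified("angry", {"angry": 52, "happy": 0, "neutral": 38, "sad": 10}))  # False
```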
the experimenter via visual inspection, aided by the on-the-fly output of the eye tracker.

Testing. Testing was done by AB and DR. During the whole procedure, the owner was allowed to stay at the back of the room and was instructed to avoid interaction with the dog. At the beginning of a test session, the dog was allowed to explore the room for approximately 5 min. Following that, the eye-tracking system was calibrated with a three-point calibration procedure. Three small dots were presented one after the other on the screen (size of the dots: 50 mm; coordinates (x,y): (512,65), (962,702), (61,702)) while the dog had its head on the chin rest. When the dog focused on the dot for at least a second, the point was accepted by E1 on the operating computer. After calibration of the three points, a validation of the accepted calibration points followed. For this purpose, all three calibration points were shown again. If the validation revealed an average deviation of more than 3° of visual angle between the two repetitions, the calibration was rejected and repeated until this criterion was met. Calibration was done before every session. Following successful calibration, the dog was presented with the stimuli as follows:

1. Before the presentation of each picture, an attention trigger (small animated video) was presented on the screen. The position of the trigger was randomly assigned to one of the four corners of the screen (x,y: (150,150), (150,620), (875,150), (875,620)). The size of the trigger was 5 cm on the screen, which corresponds to 1.43 degrees of visual angle. The presentation software of the eye tracker (Experiment Builder 1.10.1241, SR Research Ltd., Ontario, Canada) was programmed so that presentation of the stimuli followed immediately and automatically once the dog was fixating on the trigger.

2. Faces were presented for five seconds in the center of the screen (x,y: (512,382)).

3. Faces were presented in blocks of four trials (each trial consisted of the presentation of a trigger followed by the presentation of a face picture). The blocks contained pictures of either only positive (happy and neutral) or only negative (angry and sad) facial expressions.
The order of the presentation was randomized within the block, but every block contained both corresponding emotional expressions (either positive or negative) of the familiar and the unfamiliar person.

After the end of each block, the dog received a food reward. Depending on the dog's motivation and concentration, a short pause (up to 1 min) was made before the dog was asked to reposition itself in the chin rest device. During this pause the dog was allowed to rest or to move freely in the room.

4. Within one session, four blocks of pictures were presented. Two blocks contained positive and two blocks contained negative facial expressions. Each of the eight pictures of a dog's stimulus set was shown twice in each session.

Every dog completed five sessions (one session per week), with the exception of three dogs for which we discontinued data acquisition due to a lack of motivation (for example, if the dog did not want to go back to the chin rest anymore or did not stay in the chin rest during picture presentation). For this reason, data acquisition with one dog was stopped after two sessions and for two other dogs after four sessions. The available data of these dogs were nevertheless included in the analyses (see data preparation).

Data analysis

To be included in the analyses, data had to fulfil the following criteria: trials were only included in the analysis if the dog fixated at least once into the face. Additionally, a session was only retained if the dog had fixated at least once on a face of both emotional conditions (positive and negative). The data of a dog were discarded completely if they did not include at least two valid sessions (for further details see Table 1).

Each face was divided into five areas of interest (AoI): eyes, forehead, mouth, face rest (nose, cheeks and chin) and picture rest (hair and background, see Fig 3). Size and positioning of the AoIs was based on the following rules:

1. AoI forehead: The forehead area was a rectangular area ranging upwards from the top of the eyebrows to the highest point of the hairline, bounded by the face contours to the left and the right.

2. AoI eyes: The eye area ranged from the middle of the pupil to the eyebrow. This area was mirrored downwards and bounded by the face contours to the left and the right.

3. AoI mouth: The mouth region ranged from the middle of the mouth to the base of the nose. This area was mirrored downwards and bounded by the face contours to the left and the right.

4. Face rest: included all parts of the face area not belonging to AoI (1)-(3).

5. Picture rest: included all remaining parts of the picture that did not belong to AoI (1)-(4) (e.g. blank parts, hair, shoulders etc.).

Fig 3. Scheme showing the areas of interest (AoI) for forehead, eyes, face rest, mouth and rest of the picture.
doi:10.1371/journal.pone.0152393.g003

Fixations into the AoIs (1)-(4) are summarized as "in-face" fixations. Raw eye-movement data were analysed using DataViewer 2.1.1 (SR Research, Ontario, Canada). Eye-movement events were identified by the EyeLink tracker's on-line parser. As there is to date no validated literature on the definition of eye-movement events in dogs, we worked with raw data without any thresholds for fixation duration. From the raw data we extracted fixation durations, numbers of fixations and latencies to the first fixation (fixation start). Note that fixation durations were not trimmed, and durations could therefore exceed the presentation time of a trial if a dog was holding a fixation longer than the picture presentation. On the basis of these data we calculated mean values for fixation durations and numbers of fixations over each trial. As we were interested in face processing (rather than picture processing), only fixations into the face region were included in the analyses unless stated otherwise. For this reason, we extracted the fixation duration of the first fixation directed into one of the four in-face AoIs, the number of fixations directed into in-face AoIs, the identity of the AoI of the first in-face fixation, the vector (angle) between the first and second fixation (gaze direction) and the latency of the first fixation into one of the in-face AoIs.
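A minimal sketch of how such per-trial measures could be derived from a list of fixation events; the `(start_ms, duration_ms, aoi)` record layout and the function name are our own illustration, not the authors' actual DataViewer pipeline:

```python
# In-face AoIs per the rules above; "picture_rest" fixations are excluded.
IN_FACE = {"eyes", "forehead", "mouth", "face_rest"}

def trial_measures(fixations):
    """Per-trial measures from fixation records of the form (start_ms, duration_ms, aoi).
    Returns None for trials without any in-face fixation (excluded from analysis)."""
    in_face = [f for f in fixations if f[2] in IN_FACE]
    if not in_face:
        return None
    first = min(in_face, key=lambda f: f[0])  # earliest in-face fixation
    return {
        "n_in_face": len(in_face),
        "mean_duration": sum(f[1] for f in in_face) / len(in_face),
        "first_aoi": first[2],
        "first_latency": first[0],
        "first_duration": first[1],
    }

trial = [(120, 300, "picture_rest"), (450, 800, "eyes"), (1300, 400, "mouth")]
m = trial_measures(trial)
print(m["n_in_face"], m["first_aoi"], m["first_latency"])  # 2 eyes 450
```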
Statistical analyses were conducted with SPSS 22 (IBM Corp., Armonk, New York, United States) and R 3.1.1 [70]. Linear mixed-effect models (LMM) were used to analyse mean fixation duration, fixation duration of the first fixation, overall fixation count per trial and the latency to the first fixation (time). As fixation counts into the four different AoIs did not fulfil the assumptions of a Poisson distribution (in particular, zeros were strongly overrepresented), we used zero-inflated negative binomial models to analyze these data (number of fixations into the different AoIs), using the R package glmmADMB 0.8.0 [71]. Likewise, the fixation number of the first fixation into one of the four in-face AoIs was analysed (latency of first fixation into in-face AoI; note that the first fixation was thereby labelled as zero). Generalized linear mixed models (GLMMs) were used to analyse whether the first fixation was directed into the face region (SPSS, binomial GLMM) and which AoI was fixated first (first AoI, SPSS, multinomial GLMM). Using the R package lme4 1.1.7 [72], we analysed whether the first fixation was directed to the left or right half of the face (binomial GLMM). Additionally, we calculated the proportion of the fixation time dwelled on the left half of the face ((average duration left × 100)/(average duration left + average duration right); negative binomial GLMM).
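The left-dwell proportion can be sketched as follows. The printed formula is garbled in this copy, so we read it as a percentage of the summed left and right mean durations (the only reading that yields a proportion); the function name and inputs are illustrative:

```python
def left_gaze_percentage(left_durations_ms, right_durations_ms):
    """Percentage of dwell time on the left half of the face:
    (mean left * 100) / (mean left + mean right)."""
    mean_left = sum(left_durations_ms) / len(left_durations_ms) if left_durations_ms else 0.0
    mean_right = sum(right_durations_ms) / len(right_durations_ms) if right_durations_ms else 0.0
    total = mean_left + mean_right
    return 100.0 * mean_left / total if total else float("nan")

# Equal dwell on both halves gives 50%; a left gaze bias pushes the value above 50.
print(left_gaze_percentage([600, 400], [250, 250]))  # ~66.7
```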
We could not find any difference between the emotions angry and sad, nor between happy and neutral. For this reason, and to increase statistical power, the "positive" emotional expressions, happy and neutral, and the "negative" emotional expressions, angry and sad, were pooled. All linear models included emotion type (positive vs. negative), familiarity (familiar vs. unfamiliar), dog type (pet vs. lab dog) and, where applicable, AoI (eyes, forehead, mouth, face rest), as well as all 2-way interactions between them, as fixed effects. Where 2-way interactions turned out significant, the data set was split to explore the nature of the interaction. Dog and session number nested within dog were included as random factors in all models. Additionally, we tested whether trial number (nested within session) and the age of the dog could explain a significant proportion of the variation in the models by adding these variables as random factors. Neither variable had a significant influence on the data (decision on the basis of Akaike's information criterion (AIC)). We also tested for an effect of breed on the data. For this purpose, we combined data of dogs that are known to belong to the group of herding (P1-5, P7-9, P14, P19), hunting (P11, P12, P18, P15, L1-8) or protection dogs (P5, P10, P13; categorization on the basis of their use as working dogs). We compared the group of laboratory beagles (hunting dogs) to the three breed groups of the pet dogs. We did not find a significant effect of breed group; this factor was discarded from further analysis. Model reduction was done backwards on the basis of Akaike's information criterion (AIC). Predictors were considered significant at α ≤ 0.05.

Results

The final data set included data from 17 pet dogs (11 male and 6 female) with a total of 66 sessions and 847 trials (percentage of trials with positive emotion 48.4%; mean 50 trials/dog, ranging from 14 to 76) and 8 lab dogs with a total of 38 sessions and 517 trials (percentage of trials with positive emotion 50.7%; mean 65 trials/dog, ranging from 51 to 74).

Fixation Duration

Total fixation time per trial.
There was a significant difference between pet and lab dogs concerning the total time they spent fixating the picture (LMM: F(1,19.6) = 7.27, p = 0.014). The lab dogs spent more time fixating the picture (5545 ± 98 ms) than the pet dogs (4701 ± 72 ms; see Fig 4). Also, there was a significant interaction between dog type and emotion type