ORIGINAL RESEARCH ARTICLE
published: 19 December 2014
doi: 10.3389/fpsyg.2014.01498

Cortical response of the ventral attention network to unattended angry facial expressions: an EEG source analysis study

Alberto Inuggi 1, Federica Sassi 2, Alejandro Castillo 2, Guillermo Campoy 2, Letizia Leocani 3, José M. García Santos 4 and Luis J. Fuentes 2*

1 Basque Center on Cognition, Brain and Language, San Sebastián, Spain
2 Departamento de Psicología Básica y Metodología, University of Murcia, Murcia, Spain
3 Institute of Experimental Neurology, L'Istituto di Ricovero e Cura a Carattere Scientifico San Raffaele, Milan, Italy
4 Servicio de Radiología, Hospital Morales Meseguer, Murcia, Spain

Edited by: Alan J. Pegna, Geneva University Hospitals, Switzerland
Reviewed by: Marzia Del Zotto, University of Geneva, Switzerland; Angela Gosling, Bournemouth University, UK
*Correspondence: Luis J. Fuentes, Departamento de Psicología Básica y Metodología, University of Murcia, Campus Espinardo, 30100 Murcia, Spain. e-mail: [email protected]

Introduction: We used an affective prime task composed of emotional (happy, angry, and neutral) prime faces and target words with either positive or negative valence. By asking subjects to attend either to the faces' emotional expression or to the glasses' shape, we assessed whether angry facial expressions were processed when they were unattended and task-irrelevant.

Methods: We conducted a distributed source analysis on the corresponding event-related potentials, focused on the early activity of face processing and attention-network-related areas. We also evaluated the magnitude of the affective priming effect.

Results: We observed a reduction of occipito-temporal (BA37) activation to unattended compared to attended faces and a modulation of the lateralization of primary visual areas' activity. The latter was more right lateralized for attended than for unattended faces, and emotional faces were more right lateralized than neutral ones only in the former condition. Affective priming disappeared when the emotional expressions of prime faces were ignored. Moreover, an increased activation in the right temporo–parietal junction (TPJ), but not in the intraparietal sulcus, was observed only for unattended angry facial expressions at ∼170 ms after face presentation.

Conclusion: We suggest that attentional resources affect the early processing in visual and occipito-temporal areas, irrespective of the faces' threatening content. The disappearance of the affective priming effect suggests that when subjects were asked to focus on the glasses' shape, attentional resources were not available to process the facial emotional expression, even though emotion-relevant and emotion-irrelevant features of the face were presented in the same position. On the other hand, unattended angry faces evoked a pre-attentive TPJ activity, which most likely represents a bottom–up trigger that signals their high behavioral relevance, although it is unrelated to task demands.

Keywords: ventral attentional network, temporo–parietal junction, EEG source analysis, threatening facial expressions, attention modulation

INTRODUCTION
Emotional events play a crucial role in how humans interact with one another and how they adapt to changing environments. To foster survival, it is essential that threatening stimuli originating from other people or from the environment can be processed in a rapid and efficient manner. Many pieces of evidence show that threatening information can be processed automatically and independently of attention or attentional resources (Stenberg et al., 1995; Vuilleumier et al., 2001; for reviews, see Compton, 2003; Vuilleumier, 2005). Moreover, this processing can occur even without conscious perception (for a recent review, see Tamietto and de Gelder, 2010).

One common stimulus used to demonstrate how threatening information can be prioritized and processed efficiently is the fearful facial expression. Several studies using different paradigms have shown that even though the emotional content of the stimulus is task-irrelevant, it captures attention and interferes with the relevant task (Okon-Singer et al., 2007; Hart et al., 2010), delays disengagement of attention (Georgiou et al., 2005), is detected more easily than a neutral stimulus (Hansen and Hansen, 1988; Anderson, 2005; Calvo et al., 2006), and is better detected as a T2 in the attentional blink paradigm compared with a neutral T2 (Anderson, 2005). Further evidence for the automatic processing of emotional expressions comes from studies that explicitly manipulated the focus of attention by asking subjects to either attend to or ignore facial stimuli [e.g., Vuilleumier et al., 2001; Anderson et al., 2003; Eimer et al., 2003; see Eimer and Holmes, 2007, for a review of event-related potential (ERP) studies]. For instance, Vuilleumier et al. (2001) presented two faces and two houses arranged parafoveally along the vertical or horizontal axis. Subjects had to compare the faces (faces attended, houses unattended) or to compare the houses (faces unattended, houses attended). Fearful faces were compared with neutral faces. Activation in the amygdala, the hallmark of emotional processing, was higher with fearful than with neutral faces. Notably, amygdala activation did not differ whether the participants paid attention to the faces or to the houses.
Recent studies, however, have challenged the idea that the processing of emotional information can occur without requiring a sufficient amount of attentional resources (Pessoa et al., 2002, 2005; Holmes et al., 2003; Ochsner and Gross, 2005; Okon-Singer et al., 2007; Silvert et al., 2007; Sassi et al., 2014). For instance, Pessoa et al. (2002; see also Pessoa et al., 2005) found emotion-related brain activity only when the subjects had to respond to the gender of the faces (easy task), but not when they had to discriminate the orientation of two peripheral bars (difficult task). Holmes et al. (2003) compared ERPs between fearful and neutral facial expressions when the subjects had to compare two faces (faces attended) versus two houses (faces unattended), with both faces and houses presented simultaneously at different spatial locations. Differences between the two emotional expressions were observed only when faces were attended. In a recent behavioral study, Sassi et al. (2014) used an affective priming task in which a prime face showing either an emotional (positive or negative) or a neutral expression was followed by an emotionally laden target word (positive or negative). In the critical trials the target word could be preceded by a face prime that belonged to the same affective category as the target (congruent condition) or to a different affective category (incongruent condition). Affective priming was measured through congruency effects, that is, the difference in performance between the congruent and the incongruent condition. Sassi et al. (2014) observed affective priming when the subjects' attention was allocated to the emotional information (emotion task), and also, albeit of a smaller size, when the emotional expression was made task-irrelevant by asking subjects to determine whether the face wore glasses (the glasses task). However, when the subjects were asked to determine whether the glasses were rounded or squared (the shape task), the affective priming effect vanished. This finding was probably a consequence of the fact that the shape task (difficult task) required more attentional monitoring than the glasses task (easy task), and therefore there were not sufficient attentional resources left to process the emotional expression of the face prime (see also Okon-Singer et al., 2007, for similar evidence using a cognitive load paradigm). A common feature of the studies that report attentional modulation of emotional processing is that the non-emotional task usually involves a high attentional load; therefore, sufficient attentional resources are not available to process the emotional content of the stimuli (Lavie, 1995; Pessoa et al., 2002, 2005; Okon-Singer et al., 2007; Palermo and Rhodes, 2007).

The present study is a follow-up of Sassi et al.'s (2014) study, although only two tasks were used: the emotion task, in which subjects attended to the emotional expression of the face, and the shape task, in which subjects attended to the shape of the glasses, so that the emotional facial expression was task-irrelevant. In addition, whereas many studies have investigated the processing of threatening stimuli using fearful faces, we were interested in extending our affective priming studies to other negative emotional expressions. Thus, angry faces were selected for the present study. Anger is exhibited in daily life as frequently as other negative expressions such as fear and sadness, but few studies have used this emotional expression in paradigms involving attentional manipulations.

On the basis of our previous results, we expected an affective priming effect with the emotion task, but not with the shape task. However, as Okon-Singer et al. (2007) pointed out, it is necessary to dissociate attention-dependent processing from automatic processing (at least under the "weak" notion of automaticity; Tzelgov, 1997; Pessoa, 2005). Despite the lack of behavioral priming effects, which might depend on the availability of attentional resources, it is still possible that processing of the negative facial expression in the shape task occurs in a "strong" automatic way, independently of both attentional resources and task relevance. Negative facial emotional expressions may be related to threat and therefore they may be behaviorally relevant stimuli that require a fast automatic reaction to foster survival. If that were the case, we would be able to detect emotion-related brain activation even when subjects' top–down attention is allocated to an emotion-irrelevant feature of the face prime that requires fine-grained discrimination (the shape task). The rationale for that hypothesis is the existence of a neural circuitry, comprising both subcortical and cortical areas, that is involved in the rapid and automatic detection of threatening salient stimuli and that may play a crucial role for survival (Vuilleumier, 2005).

In the present study, we carried out distributed source analyses (Fuchs et al., 1999) over the ERPs generated by the face. Unlike dipole analysis (Scherg and Von Cramon, 1985), which uses very few sources and needs strong a priori hypotheses about their characteristics, the source analysis technique represents cortical brain activity through the intensity of a large number of cortical generators, providing a more realistic simulation of brain functioning. Among the several approaches available to solve the inverse problem of reconstructing the cortical sources that generated the recorded scalp potentials, we opted for a well-established post-processing method (Inuggi et al., 2010, 2011a,b; Gonzalez-Rosa et al., 2013). It employs the sLORETA-weighted accurate minimum norm (SWARM) algorithm (Wagner et al., 2007), which combines the low reconstruction error of sLORETA (Pascual-Marqui, 2002) with the output of a current density vector field that can later be post-processed.
We then focused on the cortical areas that are involved in the processing of fine-grained facial features. Briefly, the process of recognizing the static (identity, gender, familiarity) and the dynamic (emotional expressions and gaze direction) characteristics of the observed face is thought to rely mainly on a cortical stream (Haxby et al., 2000; Palermo and Rhodes, 2007) embracing both the classical ventral stream (Ishai et al., 1999) and the superior temporal sulcus (STS). The ventral stream originates in the occipital areas and propagates through the occipital face area (OFA) and the fusiform face area (FFA). The FFA is specialized in decoding fine-grained static facial characteristics (Kanwisher et al., 1997; Halgren et al., 2000; Holmes et al., 2003; Bayle and Taylor, 2010), while the STS, especially its posterior part (pSTS), is involved in the processing of dynamic facial features, such as eye gaze, and in decoding the emotional information conveyed by facial features (Puce et al., 1998; Allison et al., 2000; Hoffman and Haxby, 2000; Said et al., 2010). Previous studies have observed that the FFA activates more with facial than with non-facial objects (see Haxby et al., 2000, for a review), and therefore we expected reduced activation of that area in the shape task (focused on a non-face feature) compared to the emotion task (focused on a facial feature).

To model the activation of these areas, we performed both source and sensor analyses in correspondence with the main ERP components. Besides modeling the two most investigated early components, the posterior P1 and the lateral occipito-temporal N170, we also modeled the anterior N1 (Luo et al., 2010) and a later positive component, peaking around 230–250 ms, whose name and temporal location vary greatly across studies (e.g., VPP in Luo et al., 2010; P270 in Liu et al., 2012). Both the P1 and N1 components have been associated with a first stage of automatic processing that differentiates negative facial expressions from positive or neutral ones (Pourtois et al., 2004; Luo et al., 2010), reflecting an early negativity bias (Smith et al., 2003). The N170 component has been implicated in the distinction between face and non-face stimuli (Bentin et al., 1996; Rossion et al., 2003; Itier and Taylor, 2004; Luo et al., 2010). As the aforementioned components have been shown to be affected by affective processing in an early phase of perception and attention processing (Carretié et al., 2004; Eimer and Holmes, 2007; Luo et al., 2010), they constitute the main focus of our analysis of the first 300 ms after face prime onset.

Source analyses were also employed to assess whether angry and non-angry (happy and neutral) expressions were processed differently when attention was directed to emotion-irrelevant facial features. Specifically, because negative emotional expressions are behaviorally relevant stimuli, we expected activation in the ventral attention network (VAN), which is supposed to detect behaviorally relevant but task-irrelevant stimuli and to exert a bottom–up modulation over the dorsal attention network (DAN; Corbetta and Shulman, 2002; Corbetta et al., 2008). However, because emotion-relevant and emotion-irrelevant features were foveally presented, we did not expect any reorienting process by the DAN, which is responsible for top–down control as it contains, specifically in the frontal eye field region, the proper circuitry to move the eyes to the selected target. Thus, we may be able to test the hypothesis that the VAN can activate independently from the DAN by assessing brain activity in both the temporo–parietal junction (TPJ; VAN) and the intraparietal sulcus (IPS; DAN). These networks are considered supramodal (Macaluso et al., 2002; Green et al., 2011) and not directly related to face processing. Because their involvement in bottom–up and top–down control is derived mainly from functional magnetic resonance imaging (fMRI) studies, whose temporal resolution is not sufficient to be coupled with electroencephalography (EEG) activation findings, their activation time course was investigated here in the temporal proximity of the classical ERP peaks, where the face feature decoding process is expected to occur.
MATERIALS AND METHODS
SUBJECTS
Twenty-eight healthy, young (mean age 22.1 ± 2.3 years, range 19–30) subjects with no history of neurologic or neuropsychiatric disorders were recruited to participate in this study. Fourteen subjects (11 females and 3 males) participated in each task condition (emotion and shape). All subjects were right-handed according to their self-report and gave their written informed consent for participation in the study.

TASK
Subjects were tested individually in a sound-attenuated room. A computer program generated with E-Prime 2 (Schneider et al., 2002) controlled the experiment. The stimuli were presented on a 17″ TFT monitor (screen resolution: 1024 by 768 pixels; background color: silver, RGB: 200, 200, 200) and participants responded via the keyboard. We used three grayscale pictures (4.5 cm wide by 7.7 cm high) of human faces as prime stimuli, one for each facial expression (happy, angry, and neutral). These stimuli were taken from the NimStim Set of Facial Expressions (Tottenham et al., 2009; the reference codes of the selected faces are 20_M_HA_O, 20_M_NE_C, and 20_M_AN_O). Using photo-editing software, we created two versions of each picture, one wearing rounded glasses and the other wearing squared glasses. As target stimuli, we used 36 Spanish words divided into two sets, one comprising 18 positive words, the other containing 18 negative words. Mean valence ratings for the words of the two sets ranged from 1.7 to 2.8 (M = 2.3) for positive words and from –0.9 to –1.8 (M = –2.3) for negative words, according to a preliminary study (N = 124; scale ranging from –3 to +3; see Sassi et al., 2014). Positive and negative words were matched for word frequency, familiarity, and word length using the LEXESP database (Sebastián-Gallés et al., 2000).

Each trial consisted of the following sequence (the trial scheme is summarized in Figure 1). First, a 1000-ms fixation point (a plus sign) appeared in the center of the screen, followed by one of the three prime faces, which was presented for 200 ms. Then, after an interval of 100 ms (stimulus onset asynchrony, SOA = 300 ms), a target word was shown (in capital letters and black font) and subjects indicated whether the word was positive or negative by pressing the "n" or "m" key on the computer keyboard as quickly and accurately as possible (this first response is referred to as R1). Both prime faces and target words were presented centered. The specific response-key mapping was counterbalanced across participants. Immediately following R1, a double-choice question appeared on the screen, and subjects were prompted to press, with no time limit, the key ("z" or "x") that corresponded to the correct answer (hereafter, R2). In the emotion condition, subjects were asked whether the prime face was neutral or emotional (the emotion task), whereas in the glasses' shape condition they were asked whether the face wore rounded or squared glasses (the shape task). The whole experiment included 72 congruent trials, 72 incongruent trials, and 144 neutral trials. In congruent trials, the prime face and the target word belonged to the same affective valence, either positive, as in happy-positive trials (N = 36), or negative, as in angry-negative trials (N = 36). In incongruent trials, a prime face of a different valence preceded the target word, as in happy-negative (N = 36) and angry-positive (N = 36) trials. In neutral trials, a neutral prime face preceded the target word, as in neutral-positive (N = 72) and neutral-negative (N = 72) trials. Target words were drawn from each set at random, with the constraint that each word appeared in two congruent trials, in two incongruent trials, and in four neutral trials. A short practice block of 18 trials preceded the experimental trials.

FIGURE 1 | Sequence of events and time duration in the experiment.
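To make the counterbalancing constraints above concrete, the following is a minimal sketch of how such a trial list could be assembled. It is not the authors' E-Prime script; the word placeholders, variable names, and shuffling routine are ours, and only the trial counts and the word-repetition constraint come from the text.

```python
import random

# Placeholders for the 18 positive and 18 negative Spanish target words
positive_words = [f"POS_{i:02d}" for i in range(1, 19)]
negative_words = [f"NEG_{i:02d}" for i in range(1, 19)]

def build_trial_list(seed=1):
    """Build the 288 experimental trials: 72 congruent, 72 incongruent, 144 neutral.
    Each word appears in two congruent, two incongruent, and four neutral trials."""
    trials = []
    for words, valence in ((positive_words, "positive"), (negative_words, "negative")):
        congruent_prime = "happy" if valence == "positive" else "angry"
        incongruent_prime = "angry" if valence == "positive" else "happy"
        for word in words:
            trials += [{"prime": congruent_prime, "word": word, "type": "congruent"}] * 2
            trials += [{"prime": incongruent_prime, "word": word, "type": "incongruent"}] * 2
            trials += [{"prime": "neutral", "word": word, "type": "neutral"}] * 4
    random.Random(seed).shuffle(trials)
    return trials

trials = build_trial_list()
assert len(trials) == 288
assert sum(t["type"] == "congruent" for t in trials) == 72
assert sum(t["type"] == "neutral" for t in trials) == 144
```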
EEG RECORDINGS AND PREPROCESSING
Electroencephalography was recorded using 59 scalp channels mounted onto an elastic cap (ActiveCap, Brain Products GmbH), arranged according to the 10–20 international system, with the reference located close to the vertex. The EEG signal was amplified (BrainAmp, Brain Products GmbH), digitized (1000 Hz sampling frequency), and filtered (0.1 to 40 Hz). The electrode impedance was kept below 5 kΩ. Four additional electrodes were placed to monitor the left/right and horizontal/vertical ocular activity. Eye-movement artifacts were corrected with an independent component analysis (ICA) Ocular Artifact Reduction algorithm (Vision Analyzer, Brain Products GmbH). The ERPs were obtained by averaging the EEG epochs from –250 to +300 ms with respect to face onset, using the first 200 ms for baseline correction. Data were finally re-referenced using a common average reference approach.
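The epoching, baseline-correction, and re-referencing steps described above can be expressed compactly as array operations. The sketch below is only a schematic NumPy equivalent under assumed array shapes; the actual pipeline used Vision Analyzer, and the function and variable names here are illustrative.

```python
import numpy as np

FS = 1000                 # sampling rate (Hz)
PRE, POST = 0.250, 0.300  # epoch borders relative to face onset (s)
BASELINE = 0.200          # first 200 ms of each epoch used as baseline

def face_locked_erp(eeg, face_onsets):
    """eeg: (n_channels, n_samples) continuous recording; face_onsets: onset sample indices.
    Returns the trial-averaged epoch from -250 to +300 ms, baseline-corrected and
    re-referenced to the common average."""
    n_pre, n_post = int(PRE * FS), int(POST * FS)
    epochs = np.stack([eeg[:, s - n_pre:s + n_post] for s in face_onsets])  # (trials, channels, times)
    erp = epochs.mean(axis=0)                                               # average across trials
    baseline = erp[:, :int(BASELINE * FS)].mean(axis=1, keepdims=True)      # mean of first 200 ms
    erp = erp - baseline                                                    # baseline correction
    return erp - erp.mean(axis=0, keepdims=True)                            # common average reference
```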
ERP COMPONENTS DEFINITION
According to previous studies, we focused on the P1 and N170 components and also on the N1, which peaks in frontal regions at ∼100 ms. Additionally, our data revealed a late positive deflection, peaking at ∼240 ms, that was also investigated. Four pairs of sensor clusters, whose amplitude was calculated as the mean amplitude of their constituent sensors, were defined to model the ERP components. In each subject and for each experimental condition, the amplitudes of the components' peaks were calculated as the maximum positive/negative deflections within the time windows specified in Table 1. To better compare ERP results with source analysis results, a further cluster, conventionally not investigated in previous studies, was defined for the N170 period that covered the temporo–parietal region. These eight cluster measures were subjected to statistical analyses. In a further analysis, the two occipital clusters were merged into a single cluster, and its activation was expressed in terms of the lateralization of its medial–lateral center of gravity, calculated with the following formula:

COGx = (a∗PO8 + b∗PO4 + c∗O2 − a∗PO7 − b∗PO3 − c∗O1) / 2∗(a + b + c)

where a, b, and c represent the medial–lateral coordinates of those electrodes in a 10–20 extended system.

Table 1 | Event-related potential components investigated, electrodes contained in the eight clusters used, and the window of interest used to define the component's peak.

Component | Cluster name | Electrodes in cluster | Window of interest (ms)
N1 | L/R Frontal | F3/F4, FC3/FC4 | 80–130
P1 | L/R Occipital | PO7/PO8, PO3/PO4, O1/O2, Oz, POz | 80–130
N170 | L/R Occipito-temporal | PO7/PO8, PO3/PO4, P7/P8 | 130–190
N170 | L/R Temporo–parietal | P5/P6, CP5/CP6 | 130–190
P240 | All the previously defined clusters | – | 220–260

SOURCE ANALYSIS
A preliminary ICA (Hyvarinen, 1999) was performed on the ERP data, which allowed the signal to be decomposed into noise-normalized independent components (ICs). Only those ICs that showed an SNR below 1 across all intervals of interest (from –250 to 300 ms with respect to face onset) were removed from the ERP data (Inuggi et al., 2011a,b). The source activity was reconstructed using the cortical current density (CCD) model, with a conductor volume defined by a 3-compartment boundary element method (BEM) with conductivity values of 0.33–0.0042–0.33 S/m (Fuchs et al., 2002), derived from the FSL MNI template (www.fmrib.ox.ac.uk/fsl), with dimensions of 91 × 109 × 91 and a voxel size of 2 × 2 × 2 mm. The source number (6899) and positions were obtained by sampling the cortex (5 mm spacing), with their orientations fixed perpendicular to the cortical patch they originated from, and their intensities were calculated using the SWARM algorithm (Wagner et al., 2007). The CCD was reconstructed with the Curry V6 software (Neuroscan Inc., Herndon, VA, USA).
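SWARM itself is implemented inside Curry and is not reproduced here. To illustrate the family of linear inverse operators to which such sLORETA-based minimum-norm methods belong, the following is a generic sketch of a weighted minimum-norm estimate; the lead field, source weights, and regularization choice are assumptions for illustration only, not the actual SWARM computation.

```python
import numpy as np

def weighted_minimum_norm(G, V, weights, snr=3.0):
    """Generic weighted minimum-norm current density estimate (not the SWARM implementation).

    G       : (n_sensors, n_sources) lead field for fixed-orientation cortical sources
    V       : (n_sensors, n_times) average-referenced scalp potentials
    weights : (n_sources,) source weighting (e.g., depth- or sLORETA-derived weights)
    Returns J : (n_sources, n_times) estimated current densities."""
    W = np.diag(weights ** 2)                          # prior source covariance
    GWGt = G @ W @ G.T
    lam2 = np.trace(GWGt) / (G.shape[0] * snr ** 2)    # regularization from an assumed SNR
    K = W @ G.T @ np.linalg.inv(GWGt + lam2 * np.eye(G.shape[0]))  # linear inverse operator
    return K @ V
```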
Its analysisofvariance(ANOVA).TheKolmogorov–Smirnovtestwas resultingcenterofgravitywillclarifymorespecificallytheanatom- usedtoexaminethenormaldistributionof thedata,and,when ical localization of this activation. To take into account possible appropriate,theGreenhouse–Geissercorrectionwasapplied.The between-subjects electrodes’slight montage misallocation, ROIs significancelevelofthemaineffects(tasktype,emotionalexpres- were enlarged (5 mm wide) and then smoothed (2 mm wide). sion, and hemisphere) and their interactions were corrected for ROIsareillustratedinFigure2. multiplecomparisons(14ROI×3periods)usingafalsediscovery rate(FDR)approach,butusingamoreconservativeversion(Ben- ROIactivity jaminiandYekutieli,2001)comparedtostandardFDR.According Threeperiodswereinvestigated:L100,whereN1andP1areactive, toitsformula(α/(cid:3)i=1..k(1/i),wherei=42isthenumberofmul- L170,whichcorrespondstotheN170peak,andL240,whichcor- tiple comparisons and α = 0.05 is the predetermined p-value), respondstoourlatepeak.Withintheseperiods,themeancortical wereportonlythesignificantp-valuesbelow0.0112.Becausethe activationofeachROIwasseparatelycalculatedusingthefollow- numberofmultiplecomparisonswaslowerintheERPanalysis(8 ingprocedure: (i)withineachlatency, theintensityof allof the cluster×3periods),thecorrectedthresholdwas0.0132.Thesize activesourcescontainedintheROIweresummed;(ii)thelatency effectswerereportedthroughtheη2 value.Posthoccomparisons p withthehighestvaluewasdefinedasthepeaklatency(PL);and(iii) ofwithin-subjects(facialexpression)andbetween-subjects(task a40ms-lengthtemporalwindow,centeredonthatpeak,wasused type)factorswereperformedwithpairedandunpairedt-tests.The tocalculatetheareatotalactivity(TA)withineachperiodofeach multiplepairwisecomparisonsoffacialexpressionswereadjusted area,aswaspreviouslydescribed(Inuggietal.,2011a;Gonzalez- withtheBonferronicorrection. Rosa etal., 2013). This procedure was performed separately for To provide the ERP equivalent of our source analysis results, eachROI,thusallowingustotakeintoaccounttheonsetdiffer- a mixed ANOVA, analyzing the effects of task type and face encesofnearlysimultaneouscomponents(e.g.,P1andN1)andto expression, was also performed over the ERP electrode clusters createperiodsofthesametemporallengthtoensurepropercom- that overlay the ROI of the sources significantly affected by our parisons. Thelengthof thetimewindowwasselectedaccording experimentalfactors. toapreviousstudy(Gonzalez-Rosaetal.,2013). Theactivationsofcenterofgravity,decomposedinthemedial– RESULTS lateral (CX), anterior-posterior (CY), and ventral-dorsal (CZ) BEHAVIORALDATA positions,werecalculatedusingthefollowingformula(e.g.,CX): Trialswithincorrectresponsestothetargetword(R1;1.8and1.5% fortheemotiontaskandtheshapetask,respectively),andtrials CX = ((cid:3) s ∗X )/(cid:3) s , withincorrectresponsestotheto-be-attendedfacialfeature(R2; ij ij ij ij ij 3.1and5.0%fortheemotiontaskandtheshapetask,respectively) wereexcludedfromanalysis.Inaddition,weexcludedtrialswith wheres istheintensityofthei-thsourceattimepointjandX is ij ij RTs below 200 ms (anticipations) or more than three standard themedial–lateralpositionofthei-thsourceattimepointj. deviation(omissions)fromthesubject’smeanforeachcondition (1,90%). 
STATISTICAL ANALYSIS
The effects of the between-subjects factor task type (emotion task, shape task) and the within-subjects factors facial expression (angry, happy, or neutral) and hemisphere (left and right) on the TA within each area and period were analyzed with a mixed analysis of variance (ANOVA). The Kolmogorov–Smirnov test was used to examine the normal distribution of the data, and, when appropriate, the Greenhouse–Geisser correction was applied. The significance level of the main effects (task type, emotional expression, and hemisphere) and their interactions was corrected for multiple comparisons (14 ROIs × 3 periods) using a false discovery rate (FDR) approach, in a more conservative version (Benjamini and Yekutieli, 2001) than standard FDR. According to its formula, α / Σ_{i=1..k} (1/i), where k = 42 is the number of multiple comparisons and α = 0.05 is the predetermined p-value, we report only the significant p-values below 0.0112. Because the number of multiple comparisons was lower in the ERP analysis (8 clusters × 3 periods), the corrected threshold there was 0.0132. Effect sizes are reported through the ηp² value. Post hoc comparisons of the within-subjects (facial expression) and between-subjects (task type) factors were performed with paired and unpaired t-tests. The multiple pairwise comparisons of facial expressions were adjusted with the Bonferroni correction.

To provide the ERP equivalent of our source analysis results, a mixed ANOVA analyzing the effects of task type and facial expression was also performed over the ERP electrode clusters that overlay the ROIs of the sources significantly affected by our experimental factors.
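The corrected thresholds quoted above follow from dividing α by the harmonic-sum penalty of the Benjamini–Yekutieli procedure. A minimal check of that arithmetic is shown below; it returns values of roughly 0.012 and 0.013 for the two comparison counts, in the range of the thresholds reported in the text.

```python
def corrected_alpha(alpha: float, k: int) -> float:
    """Per-comparison threshold: alpha divided by the harmonic sum 1 + 1/2 + ... + 1/k,
    the penalty term used by the Benjamini-Yekutieli (2001) procedure."""
    harmonic = sum(1.0 / i for i in range(1, k + 1))
    return alpha / harmonic

print(corrected_alpha(0.05, 14 * 3))  # 14 ROIs x 3 periods (source analysis)
print(corrected_alpha(0.05, 8 * 3))   # 8 clusters x 3 periods (ERP analysis)
```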
RESULTS
BEHAVIORAL DATA
Trials with incorrect responses to the target word (R1; 1.8 and 1.5% for the emotion task and the shape task, respectively) and trials with incorrect responses to the to-be-attended facial feature (R2; 3.1 and 5.0% for the emotion task and the shape task, respectively) were excluded from the analysis. In addition, we excluded trials with RTs below 200 ms (anticipations) or more than three standard deviations from the subject's mean for each condition (omissions; 1.90% of trials).

The mean RT for R1 in the emotion task was 790 ms (SD = 144) for congruent trials (happy face/positive word and angry face/negative word trials) and 825 ms (SD = 164) for incongruent trials (angry face/positive word and happy face/negative word trials). In the shape task, the mean RT was 767 ms (SD = 173) for congruent trials and 768 ms (SD = 155) for incongruent trials. These means were submitted to a mixed ANOVA with congruency (congruent, incongruent) and task type (emotion, shape) as factors. There was a main effect of congruency, F(1,26) = 9.75, MSE = 464, p = 0.004, ηp² = 0.27, revealing that responses were faster for congruent than for incongruent trials (this difference represents the affective priming effect, M = 18 ms). However, this effect was qualified by a congruency by task type interaction, F(1,26) = 9.10, MSE = 464, p = 0.006, ηp² = 0.26. Post hoc Fisher's least significant difference (LSD) tests (MSE = 25411, df = 26,479) revealed a significant congruency effect for the emotion task (priming effect = 35 ms, p < 0.001) but no effect at all for the shape task (priming effect = 0.6 ms, p = 0.941). These results thus replicate those obtained in our previous behavioral study (Sassi et al., 2014). To confirm the adequacy of our protocol, we verified that with the neutral expression neither the main effects of word valence, F(1,26) = 1.56, p = 0.222, and task type, F < 1, nor their interaction, F(1,26) = 1.30, p = 0.26, were statistically significant. Analysis of the error rates (R1) revealed no statistically significant effects.

SOURCE ANALYSIS DATA
The group averages of the evoked potentials elicited by the two tasks, merged across the three emotional faces, are displayed in Figure 3. Table 2 summarizes the center-of-gravity coordinates of the activations and the PL values of the ROIs where a significant effect of either task type or facial emotion was observed.

FIGURE 3 | Group averages of ERPs in the emotion (solid line) and shape (dotted line) tasks in the first 300 ms after facial stimulus presentation. For all the electrodes, the vertical scale boundary is set at +10 μV.

Table 2 | Talairach coordinates of the activations' center-of-gravity in the right IPL+TPJ ROI at L170.

Area | Task | Neutral (X, Y, Z) | Happy (X, Y, Z) | Angry (X, Y, Z)
R TPJ 170 | Emotion | 49, –51, 24 | 48, –49, 23 | 49, –49, 22
R TPJ 170 | Shape | 49, –51, 29 | 48, –53, 28 | 49, –50, 23

Effect of task type
During L170, an effect of task type was observed in lateral BA37 activity [F(1,26) = 7.93, p = 0.011, ηp² = 0.26], which was less intense (Figure 4) in the shape task (M = 2.28, SD = 0.6 μA/mm²) than in the emotion task (M = 5.4, SD = 0.5 μA/mm²).

Interaction between task type and facial expressions
A significant task type × facial expression × hemisphere interaction was observed in IPL+TPJ during L170 [F(1.518,39.45) = 6.41, p = 0.010, ηp² = 0.23]. Post hoc analyses revealed that the task type × facial expression interaction was significant only for the right side [F(1.81,37.06) = 5.35, p = 0.010, ηp² = 0.218]. Additionally, while facial expressions did not differ from each other in the emotion task, an effect of facial expression was observed in the shape task, when facial expressions had to be ignored [F(1.58,20.56) = 12.06, p = 0.001, ηp² = 0.48], with higher activation to angry facial expressions (M = 8.1, SD = 1.1 μA/mm²) compared to both happy (M = 5.8, SD = 0.9 μA/mm², p = 0.002) and neutral (M = 6.1, SD = 0.8 μA/mm², p = 0.002) ones (Figure 4, right; Figure 5). The center-of-gravity position of the cortical activation in the IPL+TPJ ROI, reported in Table 2, was located in close proximity to the TPJ defined by Mort et al. (2003), as shown in Figure 5. We will thus refer to it as TPJ activation. No modulation over the IPS, pSTS, or middle and inferior frontal areas was observed at any latency.

FIGURE 4 | Effect of emotional expression and task type at L170. (Left) Task type effects on right lateral BA37. No significant differences were observed for facial emotion. (Right) Right TPJ sensitivity to the angry facial expression in the shape task only. On the y-axis, the mean activity of each ROI in the L170 time window is expressed in μA/mm².

FIGURE 5 | The shape task: increased activation in response to the angry facial expression (right) compared to happy (center) and neutral (left) expressions in the TPJ within the IPL+TPJ ROI (voxels enclosed within the yellow borders) at L170.

Lateralization of visual area activity
During the P100 component, the medial–lateral center of gravity (CX) of the visual areas was more lateralized to the right hemisphere in the emotion (M = –5, SD = 0.9 mm) task compared to the shape (M = 8.7, SD = 2.1 mm) task [F(1,26) = 8.21, p = 0.010, ηp² = 0.281; Figure 6]. A significant task type × facial expression interaction was also observed in the visual areas [F(1.53,38.21) = 6.55, p = 0.010, ηp² = 0.18]. The effect of facial emotion on the lateralization of the activation was observed only in the emotion task, with the angry (M = 12, SD = 2.5 mm, p = 0.011) and happy (M = 10, SD = 2.2 mm, p = 0.010) faces more lateralized to the right hemisphere with respect to the neutral faces (M = 4.7, SD = 2 mm). No significant differences emerged in the L240 interval.

ERP DATA
During the P100 component, the medial–lateral center of gravity of the cluster obtained by merging the right and left occipital clusters was modulated by task type [F(1,26) = 5.45, p = 0.011, ηp² = 0.25], being more right-lateralized in the emotion task (M = 11.3, SD = 4.5 mm) than in the shape task (M = –0.9, SD = 3.8 mm). At ∼170 ms, the occipito-temporal cluster that overlays the lateral BA37 was not affected by task type. In the right occipito-temporal cluster, which should provide the ERP equivalent of the right TPJ activation, a significant interaction was found between task type and facial expression [F(1.52,21.13) = 5.20, p = 0.012, ηp² = 0.24]. Nevertheless, we found only a trend (p = 0.065) toward a more negative peak to angry faces compared to neutral ones in the shape task (Figure 7). No differences emerged within the parieto-temporal cluster.

FIGURE 6 | Visual area lateralization around L100. (Left) The effect of facial expression and task type on the medial–lateral position of the activation's center-of-gravity (COG). The cortical current density (CCD) results of the emotional facial expression (center) compared to the neutral facial expression (right) in the emotion task. On the y-axis, the mean activity of the ROI in the L100 time window is expressed in μA/mm².

FIGURE 7 | ERP results: (upper row) effect of task type over the occipito-parietal cluster; (lower row) effect of facial expression over the occipito-parietal cluster in the emotion (left) and shape (right) tasks.
DISCUSSION
In this study, the effect of a fine-grained, emotion-irrelevant discrimination task on the early processing of emotional faces was investigated by reconstructing the cortical generators of the scalp-recorded potentials. Our main objective was to evaluate whether angry expressions were processed differently from non-angry (neutral and positive) expressions when attention was diverted to another task. We opted to engage subjects in a fine discrimination of the shape of the glasses worn by the face stimuli, a task that was supposed to deplete attentional resources according to our previous behavioral study (see Sassi et al., 2014). Several previous studies assessed the interaction of attention and emotion when emotion-relevant and emotion-irrelevant stimuli did not share the same geometrical space (Vuilleumier et al., 2001; Holmes et al., 2003). Because the redirection of the subject's attention to another position may represent a potential confound, we opted to place both the emotion-relevant and emotion-irrelevant features in the same foveal position, removing any obstacle to the automatic processing of emotional faces when subjects were asked to ignore them.

In addition to investigating the processing of ignored angry faces, we were also interested in giving a neurophysiological explanation for the loss of the affective priming effect observed in our behavioral results (Sassi et al., 2014; current study) when subjects were involved in an emotion-irrelevant task. We concentrated our analysis on the cortical areas involved in the processing of fine-grained facial features, which are supposed to be highly modulated in a top–down manner by the observer's attention, making their processing not pre-attentive but strictly related to the availability of attentional resources. Moreover, considering the high priority of aversive facial expressions in capturing attentional resources, we also focused on the parietal areas that belong to both the ventral (TPJ) and the dorsal (IPS) attention networks and on the partially overlapping frontal areas of the two networks, the inferior (IFG) and middle (MFG) frontal gyri (Fox et al., 2006).
THE EFFECT OF ATTENTION ON THE VENTRAL STREAM
In the present study, we confirm that the ventral stream is highly modulated by the observer's attention. The activity of the occipital areas at ∼100 ms was more right lateralized in the emotion task than in the shape task and, more notably, when subjects attended to the facial expression, the activation produced by emotional facial expressions was more right lateralized than the activation produced by neutral faces. Such selectivity disappeared when subjects attended to the glasses' shape.

Considering that the assessment of FFA activity through scalp recordings is widely questioned, as the area lies within the inferior part of the temporal cortex, we created the lateral BA37 ROI because previous neuroimaging studies showed a correlation between the N170 EEG component, calculated from electrodes overlaying it, and fMRI-derived FFA activity (Horovitz et al., 2004; Sadeh et al., 2010), which suggests that surface electrodes may capture at least part of FFA activity. Additionally, electrocorticography studies have revealed that lateral BA37 is also involved in face processing (Rossion et al., 2003; Tsuchiya et al., 2008). At ∼170 ms, lateral BA37 activation was reduced in the shape task compared with the emotion task, which suggests that when subjects were asked to ignore the facial expression and just concentrate on the glasses' shape, the detailed face features might not have been very distinctive. This result agrees with previous findings of larger FFA activity for faces compared to non-face objects (Haxby et al., 2000; Rossion et al., 2003). Taken together, our behavioral and neurophysiological results strongly suggest that our shape task succeeded in guiding subjects' attention away from any face feature, preventing any conscious monitoring of the emotional content of the face. In the long debate over the pre-attentive automaticity of emotional processing, our results suggest that an appropriate level of attention is needed to process emotional expressions. Although the face was presented within the same visual focus, the reduced BA37 activity and the loss of emotional selectivity of primary visual areas in the shape task suggest that subjects presumably focused their attention just on the glasses' shape and ignored the underlying emotional expression.

The lateralization of the activations found in the present study deserves further comment. The lateralization of emotional processing is still an open issue because the two main theories, supporting either the right-hemisphere hypothesis (RHH; Borod et al., 1998; Bourne, 2010) or the valence-specific hypothesis (VSH; Mandal et al., 1991; Adolphs et al., 2001), have been questioned by more recent fMRI meta-analyses (Fusar-Poli et al., 2009; Sabatinelli et al., 2011). The bulk of the evidence shows bilateral activation for emotional face processing in most emotion-related areas, although lateralization might be modulated by gender (see, for example, Wager et al., 2003). In the present study most (22 out of 28) of the subjects were women, and our data are consistent with a previous EEG report that specifically investigated the gender effect on emotional face processing. Proverbio et al. (2006) in fact found maximal P1 amplitude over the right occipital cortex in both genders, consistent with our results showing that in the emotion task occipital activity around 100 ms was right lateralized. The lack of a right lateralization in our data during the N170 may appear inconsistent with the widely accepted right predominance of the FFA in face processing (Kanwisher and Yovel, 2006). However, this again agrees with Proverbio et al.'s (2006) finding of a right lateralization of the N170 only in men, whereas women exhibited a bilateral pattern. These results can help foster a better understanding of the inconsistencies in the literature on the right-hemisphere advantage in the occipito-temporal cortices when processing faces and confirm the relevance of incorporating gender information.
ANGRY FACIAL EXPRESSION PROCESSING
Although both static and emotional features appeared under-processed by the canonical face processing cortical areas, unattended angry expressions were able to activate the TPJ, a cortical expanse implicated in a wide spectrum of high-order cognitive functions ranging from social cognition (Saxe and Kanwisher, 2003) to attention selection (Corbetta and Shulman, 2002). The latter branch of investigation showed that the TPJ is part of the VAN, a fronto-parietal network that, during focused activities, is formally involved in re-orienting (shifting) attention to stimuli relevant to the immediate goal. Nevertheless, because the attentional focus covered a similar area in both tasks, no reorienting process was expected, as our IPS activity also indicates. The latter is in fact part of the DAN, which contains the proper circuitry to implement the reorienting of the attentional focus, and it was not modulated by our experimental conditions. The absence of any modulation over frontal areas might be interpreted accordingly: the integration between the ventral and dorsal attention networks, needed for attention re-orienting, occurs in those frontal areas where the two networks highly overlap (Fox et al., 2006).

Thus, the present findings support the proposal that VAN activation, at least in its parietal areas, might not be involved exclusively in attentional reorienting. This is consistent with more recent reports suggesting that TPJ activity might be triggered by both external sensory stimuli and internal memory-based information, thus providing bottom–up signals to other systems about relevant stimuli for further inspection (Cabeza et al., 2012). In agreement with the present results, VAN activity has also been observed when behaviorally relevant, rather than salient, stimuli are presented while the individual is engaged in another task (Corbetta et al., 2008). Accordingly, the activation of the TPJ only when the unattended face was shown with an angry expression suggests that negative emotions can pre-attentively evoke bottom–up cortical signals, according to their behavioral relevance, even when attention is focused on emotion-irrelevant features in a task that we assumed exhausted the attentional resources needed to process the emotional content of faces. Because the ventral stream and STS were not modulated by the degree of unattended emotional content, and the VAN is considered a supramodal network (Macaluso et al., 2002; Green et al., 2011) not able to decode the threatening pattern from the facial expression, we suggest that TPJ activation might be triggered from other brain regions. Several neuroimaging studies have suggested that, in parallel with the cortical stream (Palermo and Rhodes, 2007), a subcortical pathway, which reaches the amygdala through fast and coarse subcortical inputs originating in the superior colliculus and finally projects onto fronto-parietal areas, implements a brain circuitry specialized in emotional attention (Vuilleumier, 2005). This circuitry, likely partly modulated by the attentional focus (Pourtois et al., 2013), is involved in the rapid and automatic detection of negative facial expressions (for a review, see Vuilleumier and Pourtois, 2007), and it seems to play a crucial role in directing attention and information processing to threatening stimuli (Ohman and Mineka, 2001). Because reconstructing amygdala activity with EEG presents several accuracy limitations, as will be discussed later, further studies that integrate EEG with neuroimaging techniques are surely needed, but our data are consistent with such a model. A previous MEG study in fact showed that the amygdala activates as early as 100 ms after stimulus presentation (Streit et al., 2003), a latency early enough to trigger TPJ activation at ∼150–170 ms. The present TPJ activation at ∼170 ms is consistent with a recent ERP study that investigated the threat detection advantage (Feldmann-Wüstefeld et al., 2011) and revealed that angry and happy expression processing starts to differ at ∼160 ms. This suggests that angry faces may trigger a fear module that enables their rapid processing and recruits additional attentional resources, possibly by means of the TPJ, as hypothesized here.

In conclusion, within the VAN, TPJ activation at this early latency primarily signals the behavioral relevance of a task-irrelevant aversive stimulus, irrespective of whether that stimulus requires a physical shift of attention (involving the dorsal network). The fact that such a trigger was not followed by an actual over-processing of face features is likely due to the task demands, which, immediately after face offset (∼200 ms), required that subjects focus on word onset and the corresponding response related to its emotional valence.

DIFFERENCES BETWEEN SOURCES AND SENSORS ANALYSIS
In the present paper, we aimed to provide an ERP equivalent of the activations produced by the source analysis. We thus focused this analysis only on the time windows and clusters surrounding the cortical areas affected by our experimental conditions. The ERP analysis found that attended emotions, compared to ignored emotions, have their occipital P1 peak more right lateralized, but it was unable to assess the selectivity toward attended emotional faces, which disappeared in the shape task. In a similar manner, the ERP analysis could detect the interaction between task and emotions at ∼170 ms in the right occipito-temporal cluster, but it did not find a significant difference between angry and non-angry ignored faces. Of course, the current ERP approach is only one of many possible approaches, and we are not concluding that another ERP analysis would have been unable to locate the same effects found with the source analysis. However, even if such an effect had been encountered in a cluster or in a channel (e.g., CP4 or CP6), it would have been impossible to clearly attribute it to one of the areas beneath and close to the sensor cluster. Ideally, both the pSTS and BA37 would have been valid candidates, and we could have argued that, because they are part of the cortical stream supposedly responsible for extracting face features, they would presumably have shown such functioning also in the attended condition; but that doubt would have persisted, and the involvement of the TPJ would have been just one of the possible hypotheses. Instead, the source analysis, when calculating the center of gravity of the large ROI covering the temporal and parietal lobes, indicated the TPJ involvement.
METHODOLOGICAL CONSIDERATIONS AND LIMITS OF THE PRESENT INVESTIGATION
The main limits of EEG source analysis are its high sensitivity to artifacts, the low signal-to-noise ratio, and the limited spatial resolution. To properly address these limits, we employed a consolidated methodological approach (Inuggi et al., 2010, 2011a,b; Gonzalez-Rosa et al., 2013), which has consistently proved to obtain results in line with the neuroimaging literature. We used a seed-based analysis instead of a voxel-wise one because this approach is often used in both EEG and neuroimaging analyses when a strong hypothesis about the involved brain areas is possible. In fact, although the experimental task, seen as a whole, is brand new, the areas involved in the investigated interval have been accurately described in the past, producing a consistent picture that guided and supported our ROI selection. We adopted a conservative approach, selecting ROIs in areas on the outer surface of the brain, where the spatial resolution of EEG source analysis is maximal, and avoiding the investigation of deep brain areas such as the proper FFA, the orbitofrontal and para-hippocampal cortices, and the amygdala. These areas were reported in several neuroimaging studies, but their reconstruction through EEG presents several methodological issues. EEG source analysis accuracy is in fact highly corrupted by the huge anisotropy and inhomogeneity of the brain, which blur the emerging signal when it is not modeled by a proper volume conductor model. Deep sources are of course more buried within the brain as the ideal lines separating