BRIEF REPORTS

Processing emotional category congruency between emotional facial expressions and emotional words

Pages 369-379 | Received 08 Jan 2010, Accepted 07 Apr 2010, Published online: 21 Oct 2010

Abstract

Facial expressions are critical for effective social communication, and as such may be processed by the visual system even when it might be advantageous to ignore them. Previous research has shown that categorising emotional words was impaired when faces of a conflicting valence were simultaneously presented. In the present study, we examined whether emotional word categorisation would also be impaired when faces of the same (negative) valence but different emotional category (either angry, sad or fearful) were simultaneously presented. Behavioural results provided evidence for involuntary processing of basic emotional facial expression category, with slower word categorisation when the face and word categories were incongruent (e.g., angry word and sad face) than congruent (e.g., angry word and angry face). Event-related potentials (ERPs) time-locked to the presentation of the word–face pairs also revealed that emotional category congruency effects were evident from approximately 170 ms after stimulus onset.

Acknowledgements

For details regarding the NimStim facial expressions see http://macbrain.org/resources.htm. Development of the MacBrain Face Stimulus Set was overseen by Nim Tottenham and supported by the John D. and Catherine T. MacArthur Foundation Research Network on Early Experience and Brain Development. Please contact Nim Tottenham at [email protected] for more information concerning the stimulus set.

The authors would like to thank Prof. Gill Rhodes and Prof. Colin MacLeod for their assistance throughout the ERP component of the project, which was completed at the University of Western Australia, and Megan Willis for her involvement in the preparation of the stimuli and in the processing of the ERP data. We are also grateful to Dr Genevieve McArthur, Dr Matthew Finkbeiner, Prof. Gerhard Stemmler and two anonymous reviewers for helpful comments on earlier versions of this manuscript.

Notes

1. The VPP appears to be an analogue of the N170, though the presence of effects seems to depend upon the referencing method used. The VPP occurs within the same time window as the N170 but at central midline rather than occipito-temporal sites, and also shows "face-sensitive" brain responses (Jeffreys, 1989) and emotional amplitude modulation (e.g., Sewell, Palermo, Atkinson, & McArthur, 2008).
