Original Articles

Facilitated detection of angry faces: Initial orienting and processing efficiency

Pages 785-811 | Received 05 Dec 2003, Published online: 03 Feb 2007

Abstract

In a visual search task, displays of four schematic faces (angry, sad, happy, or neutral) were presented. Participants decided whether the faces were all the same or whether one was different. A discrepant angry face in a context of three neutral faces was detected faster than other faces. This occurred in the absence of a higher probability of first fixation on the angry face. Moreover, angry faces were more accurately detected even when presented parafoveally. These results are not consistent with the hypothesis that angry faces are detected faster because they are looked at earlier. In contrast, angry faces were looked at less than other faces during the search process and were more accurately detected than other faces when the display duration was reduced to 150 ms. These results support the processing efficiency hypothesis, according to which fewer attentional resources are needed to identify angry faces, which would account for their speeded detection time. In addition, parafoveal analysis of the angry faces may start preattentively, which would account for the fewer and shorter fixations that they need later, when they come under the focus of attention.

Acknowledgments

This research was supported by Grant BSO2001-3753, from the DGI, Spanish Ministry of Science and Technology. We are grateful to Francisco Esteves and Margaret Dowens for their helpful comments on an earlier version of this article.

Notes

1. We used schematic faces rather than photographs of real faces as stimuli. Schematic faces or face-like drawings have also been used in other studies (Bentin et al., 2002; Eastwood et al., 2001, 2003; Eger et al., 2003; Fenske & Eastwood, 2003; Fox et al., 2000; Lundqvist et al., 1999; Nothdurft, 1993; Öhman et al., 2001; Tipples et al., 2002; White, 1995), on the assumption that schematic faces represent prototypes of facial expressions and that relevant findings about the processing of real emotional faces can be obtained with them (see the General Discussion). As the aim of this study was to account for the visual search advantage in detecting angry faces, an effect that has been found in studies using schematic faces (except for Hansen & Hansen, 1988, who used real faces), we also used schematic faces, to ensure comparability.

2. Nevertheless, the viewers seemed to scan the array automatically and very rapidly, with very short fixations (between 160 ms and 200 ms), which did not discriminate between stimuli, and so we dropped this measure and replaced it with a new approach in Experiment 3.

3. It might be argued that first fixation is an insensitive measure of attentional orienting and that the latency of the first fixation would be a better measure for assessing the attentional orienting hypothesis. Against this assumption, prior research using high-precision eye-trackers has shown that the latency of first fixation is affected neither by the emotional expression of a face (including neutral, angry, sad, and happy faces; Bradley et al., 2000; Mogg et al., 2000) nor by the emotional content of photographs of real scenes (Nummenmaa et al., in press). Latency is probably controlled by a purely oculomotor automatic mechanism that is influenced not by stimulus meaning but only by the physical properties of the stimulus (e.g., luminance).
