Original Articles

The capture of attention and gaze in the search for emotional photographic faces

Pages 241-261 | Received 22 Dec 2016, Accepted 15 May 2017, Published online: 16 Jun 2017

ABSTRACT

Can emotional expressions automatically attract attention by virtue of their affective content? Previous studies mostly used emotional faces (e.g., angry or happy faces) in visual search tasks to assess whether affective content can automatically attract attention. However, the evidence in support of affective attentional capture is still contentious, as previous studies either (1) did not render the affective content irrelevant to the task, (2) used affective stimuli that were perceptually similar to the target, (3) did not rule out factors occurring later in the visual search process (e.g., disengagement of attention), or (4) used only schematic emotional faces, which do not clearly convey affective content. The present study remedied these shortcomings by measuring the eye movements of observers while they searched for emotional photographic faces. To examine whether irrelevant emotional faces are selected because of their perceptual similarity to the target (top-down) or because of their emotional expressions, we also assessed the perceptual similarity between the emotional distractors and the target. The results show that happy and angry faces can indeed automatically attract attention and gaze. Perceptual similarity modulated the effect only weakly, indicating that capture was mainly due to bottom-up, stimulus-driven processes. However, post-selectional processes of disengaging attention from the emotional expressions contributed strongly to the overall disruptive effects of emotional expressions. Taken together, these results support a stimulus-driven account of attentional capture by emotional faces, and highlight the need for measures that can distinguish between early and late processes in visual search.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1. We also assessed the dwell times on the emotional targets to test Hodsoll et al.'s idea that processing of negative affect may be more time-consuming, and found no evidence that processing of angry targets takes longer than processing of happy targets. In the set size 6 condition, the average target dwell time was 248 ms on happy faces and 247 ms on angry faces, t < 1. In the set size 3 condition, the gaze dwelt for 260 ms on happy targets and 259 ms on angry targets, t < 1. Thus, there was no support for the hypothesis that negative facial expressions require more processing time. Similarly, additional analyses of the mean number of non-target fixations and dwell times (see Appendix) did not show any effects that selectively applied to negative faces.

2. When the dwell times were computed for all fixations on the distractors, instead of only the first distractor fixations, the results were similar. In addition to the reported effects, the dwell times on the angry distractor (M = 181 ms) were longer than on the happy distractor (M = 173 ms), and this 8 ms difference was just significant, t(16) = 2.1, p = .049, whereas the corresponding difference between dwell times on the male distractor in the search for the angry target (M = 124 ms) and for the happy target (M = 132 ms) failed to reach significance, t(16) = 1.9, p = .071. However, an argument can be made that the first-fixation dwell times provide a better estimate of de-allocation costs, as later fixations can be more strongly modulated by detection of the target.
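The statistics reported above imply a paired-samples t-test across 17 participants (hence t(16)), with one mean dwell time per participant and condition. Below is a minimal sketch in Python with SciPy of that analysis structure; the per-participant dwell times are simulated placeholders centred on the reported means, not the study's data, so only the form of the test, and not the numbers, should be taken from it.

# Minimal sketch of the paired-samples t-test implied by Note 2.
# The dwell times are simulated placeholders, NOT the study's data;
# only the analysis structure (one value per participant, paired test)
# follows from the reported statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
n_participants = 17  # t(16) implies 17 participants

# Hypothetical mean first-fixation dwell times (ms), one per participant
dwell_angry = rng.normal(loc=181, scale=15, size=n_participants)
dwell_happy = rng.normal(loc=173, scale=15, size=n_participants)

t_val, p_val = stats.ttest_rel(dwell_angry, dwell_happy)
print(f"t({n_participants - 1}) = {t_val:.2f}, p = {p_val:.3f}")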

3. The dwell time analyses in Experiment 2 were based on an average of 50 trials per cell (range: 19 to 91 trials), thus providing a much better estimate of the mean dwell times than in Experiment 1.

Additional information

Funding

This research was supported by an Australian Research Council (ARC) Future Fellowship (FT130101282), an ARC Discovery Grant (DP170102559), and a University of Queensland Foundation Research Excellence Award, awarded to Stefanie I. Becker, as well as by a Cluster of Excellence Cognitive Interaction Technology (CITEC) award (EXC 277) funded by the German Research Foundation (DFG) and by DFG grant HO 3248/2-1 to Gernot Horstmann.

