Original Articles

The real-time link between person perception and action: Brain potential evidence for dynamic continuity

Pages 139-155 | Received 16 Feb 2010, Accepted 23 Apr 2010, Published online: 02 Jul 2010

Abstract

Using event-related potentials, we investigated how the brain extracts information from another's face and translates it into relevant action in real time. In Study 1, participants made between-hand sex categorizations of sex-typical and sex-atypical faces. Sex-atypical faces evoked negativity between 250 and 550 ms (N300/N400 effects), reflecting the integration of accumulating sex-category knowledge into a coherent sex-category interpretation. Additionally, the lateralized readiness potential revealed that the motor cortex began preparing for a correct hand response while social category knowledge was still gradually evolving in parallel. In Study 2, participants made between-hand eye-color categorizations as part of go/no-go trials that were contingent on a target's sex. On no-go trials, although the hand did not actually move, information about eye color partially prepared the motor cortex to move the hand before perception of sex had finalized. Together, these findings demonstrate the dynamic continuity between person perception and action, such that ongoing results from face processing are immediately and continuously cascaded into the motor system over time. The preparation of action begins based on tentative perceptions of another's face before perceivers have finished interpreting what they just saw.

Acknowledgments

This research was funded by research grants NSF BCS-0435547 to NA and NIH-NICHD HD25889 to PJH. This research was part of a master's thesis submitted by JBF to the Department of Psychology at Tufts University. We thank Janelle LaMarche, Jeffrey Brooks, Rebecca Sylvetsky, and Jordana Jacobs for their help in data collection.

Notes

1. A face's sex category, as outlined in Bruce and Young's (Citation1986) classic model of face perception, is a type of “visually-derived semantic information.” Other types of semantic representation triggered by faces include race category and emotion category, among others.

2. Although we propose that an ongoing cascade of information continuously passes between the neural subsystems for perception, cognition, and action, some shifts in mental processing can manifest so rapidly and nonlinearly that, at coarser levels of analysis, they begin to approximate discrete-like transitions (e.g., “phase transitions”; beim Graben, Citation2004; Dale & Spivey, Citation2005; Spivey, Anderson, & Dale, Citation2009). A genuine discontinuity in the system, however, would be extremely unlikely. Because such rapid nonlinearities can approximate discreteness at some levels of analysis, approaches emphasizing serial processing and discrete–symbolic representations may be valuable in some contexts; moreover, such approaches are not necessarily irreconcilable with dynamic approaches emphasizing parallel–continuous processing and distributed representations (Dale, Dietrich, & Chemero, Citation2009).

3. Analysis of the data from Study 1 targeting different components (the N170 and P1) at earlier epochs (between 100 and 190 ms) is reported elsewhere (Freeman, Ambady, & Holcomb, Citation2010a). That study focused on basic visual encoding of sex-category facial content.
