Experimental Aging Research
An International Journal Devoted to the Scientific Study of the Aging Process
Volume 30, 2004 - Issue 3

Attention Resources and Visible Speech Encoding in Older and Younger Adults

Pages 241-252 | Received 01 Jun 2003, Accepted 01 Oct 2003, Published online: 17 Aug 2010

Abstract

Two experiments investigated adult age differences in the distribution of attention across a speaker's face during auditory-visual language processing. Dots were superimposed on the faces of speakers for 17-ms presentations, and participants reported the spatial locations of the dots. In Experiment 1, older adults showed better detection at the mouth area relative to the eye area than did younger adults. In Experiment 2, in the absence of audible language, neither age group focused differentially on the mouth area. The results are interpreted in light of Massaro's (1998, Perceiving talking faces: From speech perception to a behavioral principle. Cambridge, MA: MIT Press) theoretical framework for understanding auditory-visual speech perception. It is claimed that older adults' greater reliance on visible speech is due to a reallocation of resources away from the eyes and toward the mouth area of the face.

This research was supported by the NIH MBRS RISE program (no. 6M612222-02). The authors wish to thank Al Moreno and Denise Welsh for technical assistance and Jeanne Malmberg for her valuable input.

Notes

1. The result, in both experiments, of better detection of dots on the speaker's left side is in line with several recent studies. Speechreading studies using free fixation of static photographs (Burt & Perrett, 1997) and laterality studies using bimodal syllable presentations (Smeele, Massaro, Cohen, & Sittig, 1998) have also shown a right visual field advantage. These findings may be the result of left-hemisphere dominance for language processing or left-hemisphere dominance for dynamic facial movements (Smeele et al., 1998). The fact that participants showed better detection on this side even in the absence of speech is also consistent with a neuropsychological investigation showing activation of auditory cortex when lip reading without auditory stimulation (Calvert et al., 1997). However, the connection to laterality research must be made cautiously because no attempt was made to control for eye movements in the present study.

2. When the data published in Thompson (1995, Experiment 1) were fit to the Fuzzy Logical Model of Perception (FLMP), the FLMP's predictions fit the data better than the predictions of any other model tested, including the model that assumed selective attention to visible speech, and the parameter values representing the quality of the visible speech information processed were more discriminable in the older adult group. (See Thompson and Lee [1996] for mathematical descriptions of the models.) For the FLMP, the Root Mean Squared Deviation (RMSD) values between observed and predicted data points were as follows: .052 for young adults and .041 for older adults. RMSD values for the Selective Attention model were: .110 for young adults and .123 for older adults. The FLMP parameter values for the visual dimension were: .284, .382, .614, .625, .749 (younger adults) and .133, .292, .456, .559, .645 (older adults).
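For readers unfamiliar with the model fitting described above, the FLMP's integration rule and the RMSD fit statistic can be stated compactly. The sketch below (Python with NumPy) is illustrative only: it uses the visual-dimension parameter values reported in this note, but the auditory parameter values are hypothetical placeholders, since they are not reported here.

import numpy as np

def flmp_prediction(a, v):
    """FLMP integration (Massaro, 1998): multiplicative combination of
    auditory support a and visual support v for a response alternative,
    normalized by the support for all alternatives (here, two)."""
    return (a * v) / (a * v + (1 - a) * (1 - v))

def rmsd(observed, predicted):
    """Root Mean Squared Deviation between observed and predicted
    response proportions, the fit statistic cited in this note."""
    observed = np.asarray(observed)
    predicted = np.asarray(predicted)
    return np.sqrt(np.mean((observed - predicted) ** 2))

# Visual-dimension parameter values reported above.
v_young = np.array([0.284, 0.382, 0.614, 0.625, 0.749])
v_older = np.array([0.133, 0.292, 0.456, 0.559, 0.645])

# Hypothetical auditory parameter values, for illustration only.
a_levels = np.array([0.1, 0.3, 0.5, 0.7, 0.9])

# Predicted identification probabilities for each auditory x visual cell.
pred_young = flmp_prediction(a_levels[:, None], v_young[None, :])
print(pred_young.round(3))

In an actual fit, both the auditory and visual parameter values would be estimated by minimizing the RMSD between the predicted matrix and the observed identification proportions.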
