Original Articles

Visual Sonority Modulates Infants’ Attraction to Sign Language

Pages 130-148 | Received 16 Jan 2017, Accepted 06 Nov 2017, Published online: 13 Dec 2017
 

ABSTRACT

The infant brain may be predisposed to identify perceptually salient cues that are common to both signed and spoken languages. Recent theory based on spoken languages has advanced sonority as one of these potential language acquisition cues. Using a preferential looking paradigm with an infrared eye tracker, we explored the visual attention of hearing 6- and 12-month-olds with no sign language experience as they watched fingerspelling stimuli that conformed to either high-sonority (well-formed) or low-sonority (ill-formed) values, which are relevant to syllabic structure in signed language. Younger babies showed highly significant looking preferences for well-formed, high-sonority fingerspelling, while older babies showed no preference for either fingerspelling variant, despite showing a strong preference in a control condition. The present findings suggest that babies possess a sensitivity to specific sonority-based contrastive cues at the core of human language structure that is subject to perceptual narrowing, irrespective of language modality (visual or auditory), shedding new light on universals of early language learning.

Acknowledgments

Data collection for the present study was conducted in the UCSD Mind, Experience, & Perception Lab (Dr. Rain Bosworth) while Stone was conducting his Ph.D. in Educational Neuroscience summer lab rotation in cognitive neuroscience; there, Stone was also the recipient of a UCSD Elizabeth Bates Graduate Research Award. We are grateful to the Petitto BL2 student and faculty research team at Gallaudet University and the student research team at the UCSD Mind, Experience, & Perception Lab. We extend our sincerest thanks to Felicia Williams, our sign model, and to the babies and families in San Diego, California, who participated in this study.

Disclosure statement

The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Notes

1 There is debate about whether sonority is a phonological or phonetic construct. Here, we adopt the same position as Gómez et al. (Citation2014): “Our results do not speak to this debate, because we have no basis to determine whether responses of infants reflect phonetic or phonological preferences” (p. 5837). However, other studies (Berent et al., Citation2011, Citation2013) suggest sonority is at least partially phonological.

2 Some may contend that testing newborns, rather than 6-month-olds, for sensitivities to sonority constraints would offer stronger support for the Biologically-Governed Hypothesis. However, the present study with 6-month-old and 12-month-old infants remains a strong test of either hypothesis. First, it was confirmed that none of the infants had been systematically exposed to any visual signed language at any point in their lives. Second, the 6-month-old age criterion is significant for infants’ emerging perceptual capabilities. Newborns do have sufficient hearing and can be exposed to speech in utero. However, they have very poor sight at birth, seeing at best an extremely blurry image at arm’s length. By six months, their acuity and contrast sensitivity have sharpened substantially (Teller, Citation1997). Hence, this is the best age to test: infants’ vision has just markedly improved, so they are able to see the fine details of the sign language stimuli well, and, for the first time in their lives, they do so during this experiment. This reasoning should not be construed to mean that deaf infants do not need exposure to a visual signed language until they are six months old, because their coarse but rapidly improving vision is sufficient to see faces and moving hands and arms at close distances.

3 Or more accurately, “lexicalized-like,” given that these fingerspelling forms were generated specifically for the present study and were not part of the ASL lexicon at that time.

Additional information

Funding

This work was supported by the NSF Directorate for Social, Behavioral, and Economic Sciences [SBE-1041725]; Eunice Kennedy Shriver National Institute of Child Health and Human Development [F31HD087085]; and National Eye Institute [R01EY024623]. Dr. Adam Stone acknowledges with gratitude a Graduate Fellowship from the NSF Science of Learning Center Grant (SBE-1041725), a Graduate Assistantship from the Ph.D. in Educational Neuroscience (PEN) program at Gallaudet University, an NIH NRSA Fellowship (F31HD087085) to conduct his graduate training with Petitto in her Brain and Language Laboratory for Neuroimaging (BL2) at Gallaudet University, and an NIH postdoctoral supplemental award (R01EY024623, Bosworth & Dobkins, PIs). Dr. Laura-Ann Petitto gratefully acknowledges funding for this project from the National Science Foundation, Science of Learning Center Grant (SBE-1041725), specifically involving Petitto (PI)’s NSF-SLC project funding of her study entitled, “The impact of early visual language experience on visual attention and visual sign phonological processing in young deaf emergent readers using early reading apps: A combined eye-tracking and fNIRS brain imaging investigation.” Dr. Rain Bosworth gratefully acknowledges funding for this project from the NSF Science of Learning Center Grant (SBE-1041725), specifically involving Bosworth’s (PI) NSF-SLC project funding of her study entitled, “The temporal and spatial dynamics of visual language perception and visual sign phonology: Eye-tracking in infants and children in a perceptual discrimination experiment of signs vs gestures,” and an NIH R01 award (R01EY024623, Bosworth & Dobkins, PIs) entitled, “Impact of deafness and language experience on visual development.”

