Abstract
We used functional magnetic resonance imaging to identify the neural regions that support comprehension of fingerspelled words, printed words, and American Sign Language (ASL) signs in deaf ASL–English bilinguals. Participants made semantic judgements (concrete or abstract?) for each lexical type, and hearing non-signers served as controls. All three lexical types engaged a left frontotemporal circuit associated with lexical semantic processing. Both printed and fingerspelled words activated the visual word form area for deaf signers only. Fingerspelled words were more left lateralised than signs, paralleling the difference between reading and listening for spoken language. Greater activation in the left supramarginal gyrus was observed for signs compared to fingerspelled words, supporting its role in processing sign-specific phonology. Fingerspelling ability was negatively correlated with activation in left occipital cortex, while ASL ability was negatively correlated with activation in the right angular gyrus. Overall, the results reveal both overlapping and distinct neural regions for the comprehension of signs, text, and fingerspelling.
Acknowledgements
We would like to thank all of our participants and Allison Bassett, Lucinda O'Grady, and Jen Petrich for their assistance with the study.
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes
1. By convention, fingerspelled words are written with hyphenated capital letters (e.g. W-O-R-D). ASL signs are denoted by their English translation written in all capital letters without hyphens (e.g. WORD).