References
- Arnold, P., & Hill, F. (2001). Bisensory augmentation: A speechreading advantage when speech is clearly audible and intact. British Journal of Psychology, 92(2), 339–355. https://doi.org/10.1348/000712601162220
- Barenholtz, E., Mavica, L., & Lewkowicz, D. J. (2016). Language familiarity modulates relative attention to the eyes and mouth of a talker. Cognition, 147, 100–105. https://doi.org/10.1016/j.cognition.2015.11.013
- Bernstein, L. E., Auer, E. T., Eberhardt, S. P., & Jiang, J. (2013). Auditory perceptual learning for speech perception can be enhanced by audiovisual training. Frontiers in Neuroscience, 7, 1–16. https://doi.org/10.3389/fnins.2013.00034
- Birmingham, E., & Kingstone, A. (2009). Human social attention: A new look at past, present, and future investigations. Annals of the New York Academy of Sciences, 1156, 118–140. https://doi.org/10.1111/j.1749-6632.2009.04468.x
- Birulés, J., Bosch, L., Brieke, R., Pons, F., & Lewkowicz, D. J. (2018). Inside bilingualism: Language background modulates selective attention to a talker’s mouth. Developmental Science. https://doi.org/10.1111/desc.12755
- Borghini, G., & Hazan, V. (2018). Listening effort during sentence processing is increased for non-native listeners: A pupillometry study. Frontiers in Neuroscience, 12, 1–13. https://doi.org/10.3389/fnins.2018.00152
- Bradlow, A. R., & Alexander, J. A. (2007). Semantic and phonetic enhancements for speech-in-noise recognition by native and non-native listeners. The Journal of the Acoustical Society of America, 121(4), 2339–2349. https://doi.org/10.1121/1.2642103
- Buchan, J. N., Paré, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13. https://doi.org/10.1080/17470910601043644
- Buchan, J. N., Paré, M., & Munhall, K. G. (2008). The effect of varying talker identity and listening conditions on gaze behavior during audiovisual speech perception. Brain Research, 1242, 162–171. https://doi.org/10.1016/j.brainres.2008.06.083
- Chandrasekaran, C., Trubanova, A., Stillittano, S., Caplier, A., & Ghazanfar, A. A. (2009). The natural statistics of audiovisual speech. PLoS Computational Biology, 5(7), e1000436. https://doi.org/10.1371/journal.pcbi.1000436
- Cotton, J. C. (1935). Normal “visual hearing”. Science, 82(2138), 592–593. https://doi.org/10.1126/science.82.2138.592
- Cutler, A., Garcia Lecumberri, M. L., & Cooke, M. (2008). Consonant identification in noise by native and non-native listeners: Effects of local context. The Journal of the Acoustical Society of America, 124(2), 1264–1268. https://doi.org/10.1121/1.2946707
- Heikkilä, J., Lonka, E., Meronen, A., Tuovinen, S., Eronen, R., Leppänen, P. H., Richardson, U., Ahonen, T., & Tiippana, K. (2018). The effect of audiovisual speech training on the phonological skills of children with specific language impairment (SLI). Child Language Teaching and Therapy, 34(3). https://doi.org/10.1177/0265659018793697
- Hyltenstam, K., & Abrahamsson, N. (2000). Who can become native-like in a second language? All, some, or none?: On the maturational constraints controversy in second language acquisition. Studia Linguistica, 54(2), 150–166. https://doi.org/10.1111/1467-9582.00056
- Imafuku, M., Kanakogi, Y., Butler, D., & Myowa, M. (2019). Demystifying infant vocal imitation: The roles of mouth looking and speaker’s gaze. Developmental Science, 22(6), e12825. https://doi.org/10.1111/desc.12825
- Iverson, P., Kuhl, P. K., Akahane-Yamada, R., Diesch, E., Tohkura, Y., Kettermann, A., & Siebert, C. (2003). A perceptual interference account of acquisition difficulties for non-native phonemes. Cognition, 38, 361–363. https://doi.org/10.1016/S0
- Lansing, C. R., & McConkie, G. W. (2003). Word identification and eye fixation locations in visual and visual-plus-auditory presentations of spoken sentences. Perception & Psychophysics, 65(4), 536–552. https://doi.org/10.3758/BF03194581
- Lecumberri, M. L. G., Cooke, M., & Cutler, A. (2010). Non-native speech perception in adverse conditions: A review. Speech Communication, 52(11–12), 864–886. https://doi.org/10.1016/j.specom.2010.08.014
- Lewkowicz, D. J., & Hansen-Tift, A. M. (2012). Infants deploy selective attention to the mouth of a talking face when learning speech. Proceedings of the National Academy of Sciences, 109(5), 1431–1436. https://doi.org/10.1073/pnas.1114783109
- Lusk, L. G., & Mitchel, A. D. (2016). Differential gaze patterns on eyes and mouth during audiovisual speech segmentation. Frontiers in Psychology, 7, 52. https://doi.org/10.3389/fpsyg.2016.00052
- MacLeod, A., & Summerfield, Q. (1987). Quantifying the contribution of vision to speech perception in noise. British Journal of Audiology, 21(2), 131–141. https://doi.org/10.3109/03005368709077786
- Mattys, S. L., Carroll, L. M., Li, C. K. W., & Chan, S. L. Y. (2010). Effects of energetic and informational masking on speech segmentation by native and non-native speakers. Speech Communication, 52(11–12), 887–899. https://doi.org/10.1016/j.specom.2010.01.005
- Maurer, D., & Werker, J. F. (2014). Perceptual narrowing during infancy: A comparison of language and faces. Developmental Psychobiology, 56(2), 154–178. https://doi.org/10.1002/dev.21177
- McClelland, J. L., Fiez, J. A., & McCandliss, B. D. (2002). Teaching the /r/-/l/ discrimination to Japanese adults: Behavioral and neural aspects. Physiology and Behavior, 77(4–5), 657–662. https://doi.org/10.1016/S0031-9384(02)00916-2
- McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264(5588), 746–748. https://doi.org/10.1038/264746a0
- Meredith, M. A., & Stein, B. E. (1986). Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. Journal of Neurophysiology, 56(3), 640–662. https://doi.org/10.1152/jn.1986.56.3.640
- Navarra, J., & Soto-Faraco, S. (2007). Hearing lips in a second language: Visual articulatory information enables the perception of second language sounds. Psychological Research, 71(1), 4–12. https://doi.org/10.1007/s00426-005-0031-5
- Pallier, C., Bosch, L., & Sebastián-Gallés, N. (1997). A limit on behavioral plasticity in speech perception. Cognition, 64(3), B9–B17. https://doi.org/10.1016/S0010-0277(97)00030-9
- Pons, F., Bosch, L., & Lewkowicz, D. J. (2015). Bilingualism modulates infants’ selective attention to the mouth of a talking face. Psychological Science, 26(4), 490–498. https://doi.org/10.1177/0956797614568320
- Pons, F., Sanz-Torrent, M., Ferinu, L., Birulés, J., & Andreu, L. (2018). Children with SLI can exhibit reduced attention to a talker’s mouth. Language Learning, 68, 180–192. https://doi.org/10.1111/lang.12276
- Reisberg, D. (1978). Looking where you listen: Visual cues and auditory attention. Acta Psychologica, 42(4), 331–341. https://doi.org/10.1016/0001-6918(78)90007-0
- Reisberg, D., McLean, J., & Goldfield, A. (1987). Easy to hear but hard to understand: A lip-reading advantage with intact auditory stimuli. In B. Dodd & R. Campbell (Eds.), Hearing by eye: The psychology of lip-reading (pp. 97–113). Lawrence Erlbaum Associates, Inc.
- Risberg, A., & Lubker, J. (1978). Prosody and speechreading. Quarterly Progress and Status Report, 4, 1–16. http://www.speech.kth.se/prod/publications/files/qpsr/1978/1978_19_4_001-016.pdf
- Sanders, D. A., & Goodrich, S. J. (1971). The relative contribution of visual and auditory components of speech to speech intelligibility under varying conditions of frequency distortion. Journal of Speech and Hearing Research, 14(1), 154–159. https://doi.org/10.1044/jshr.1401.154
- Sumby, W. H., & Pollack, I. (1954). Visual contribution to speech intelligibility in noise. The Journal of the Acoustical Society of America, 26(2), 212–215. https://doi.org/10.1121/1.1907309
- Summerfield, Q. (1979). Use of visual information for phonetic perception. Phonetica, 36(4–5), 314–331. https://doi.org/10.1159/000259969
- Tenenbaum, E. J., Sobel, D. M., Sheinkopf, S. J., Shah, R. J., Malle, B. F., & Morgan, J. L. (2015). Attention to the mouth and gaze following in infancy predict language development. Journal of Child Language, 42(6), 1173–1190. https://doi.org/10.1017/S0305000914000725
- Thompson, L. A., & Malloy, D. (2004). Attention resources and visible speech encoding in older and younger adults. Experimental Aging Research, 30(3), 241–252. https://doi.org/10.1080/03610730490447877
- Tsang, T., Atagi, N., & Johnson, S. P. (2018). Selective attention to the mouth is associated with expressive language skills in monolingual and bilingual infants. Journal of Experimental Child Psychology, 169, 93–109. https://doi.org/10.1016/j.jecp.2018.01.002
- Vatikiotis-Bateson, E., Eigsti, I. M., Yano, S., & Munhall, K. G. (1998). Eye movement of perceivers during audiovisual speech perception. Perception & Psychophysics, 60(6), 926–940. https://doi.org/10.3758/BF03211929
- Võ, M. L.-H., Smith, T. J., Mital, P. K., & Henderson, J. M. (2012). Do the eyes really have it? Dynamic allocation of attention when viewing moving faces. Journal of Vision, 12(13), 3. https://doi.org/10.1167/12.13.3
- Yarbus, A. L. (1967). Eye movements and vision. Plenum Press. (Original work published in Russian, 1965)
- Yehia, H., Rubin, P., & Vatikiotis-Bateson, E. (1998). Quantitative association of vocal-tract and facial behavior. Speech Communication, 26(1–2), 23–43. https://doi.org/10.1016/S0167-6393(98)00048-X
- Young, G. S., Merin, N., Rogers, S. J., & Ozonoff, S. (2009). Gaze behaviour and affect at 6 months: Predicting clinical outcomes and language development in typically developing infants and infants at-risk for autism. Developmental Science, 12(5), 798–814. https://doi.org/10.1111/j.1467-7687.2009.00833.x