REGULAR ARTICLES

The role of semantically related gestures in the language comprehension of simultaneous interpreters in noise

Pages 584-608 | Received 03 Apr 2023, Accepted 16 Apr 2024, Published online: 29 Apr 2024

References

  • AIIC. (2007). Checklist for simultaneous interpretation. https://aiic.org/document/4395/Checklist%20for%20simultaneous%20interpretation%20equipment%20-%20ENG.pdf
  • AIIC. (2019a). The AIIC A-B-C. https://aiic.org/site/world/about/profession/abc
  • AIIC. (2019b). Inside AIIC. https://aiic.org/site/world/about/inside
  • AIIC. (2020). AIIC Covid-19 Distance Interpreting Recommendations for Institutions and DI Hubs. https://aiic.org/document/4839/AIIC%20Recommendations%20for%20Institutions_27.03.2020.pdf
  • Amos, R. (2020). Prediction in interpreting [Doctoral dissertation, University of Geneva]. Archive ouverte. https://archive-ouverte.unige.ch/unige:148890
  • Anderson, L. (1994). Simultaneous interpretation: Contextual and translation aspects. In S. Lambert & B. Moser-Mercer (Eds.), Bridging the gap: Empirical research in simultaneous interpretation (pp. 101–120). John Benjamins.
  • Arbona, E., Seeber, K., & Gullberg, M. (2023). Semantically related gestures facilitate language comprehension during simultaneous interpreting. Bilingualism: Language and Cognition, 26(2), 425–439. https://doi.org/10.1017/S136672892200058X
  • Bacigalupe, L. A. (1999). Visual contact in simultaneous interpretation: Results of an experimental study. In A. Alvarez Lugris & A. Fernandez Ocampo (Eds.), Anovar/anosar. Estudios de traduccion e interpretacion (Vol. 1, pp. 123–137). Universidade de Vigo.
  • Balzani, M. (1990). Le contact visuel en interprétation simultanée: résultats d’une expérience (Français–Italien). In L. Gran & C. Taylor (Eds.), Aspects of applied and experimental research on conference interpretation (pp. 93–100). Campanotto.
  • Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67(1), 1–48. https://doi.org/10.18637/jss.v067.i01
  • Beattie, G., Webster, K., & Ross, J. (2010). The fixation and processing of the iconic gestures that accompany talk. Journal of Language and Social Psychology, 29(2), 194–213. https://doi.org/10.1177/0261927X09359589
  • Bühler, H. (1985). Conference interpreting: A multichannel communication phenomenon. Meta: Journal des Traducteurs, 30(1), 49–54. https://doi.org/10.7202/002176ar
  • Caniato, A. (2021). RSI sound myth buster: Ten misconceptions that result in RSI sounding terrible. https://aiic.org/site/blog/RSI-sound-myth-buster?language=fr_FR&
  • CAPE. (2021). CAPE survey confirms continued Parliamentary interpreters’ health and safety risks a year into the pandemic. https://www.acep-cape.ca/en/news/cape-survey-confirms-continued-parliamentary-interpreters-health-and-safety-risks-year
  • Church, R. B., Garber, P., & Rogalski, K. (2007). The role of gesture in memory and social communication. Gesture, 7(2), 137–158. https://doi.org/10.1075/gest.7.2.02bre
  • Dahl, T. I., & Ludvigsen, S. (2014). How I see what you're saying: The role of gestures in native and foreign language listening comprehension. The Modern Language Journal, 98(3), 813–833. https://doi.org/10.1111/modl.12124
  • Davies, M. (2008). The corpus of contemporary American English (COCA): One billion words, 1990–2019. http://corpus.byu.edu/coca/
  • Debras, C. (2017). The shrug: Forms and meanings of a compound enactment. Gesture, 16(1), 1–34. https://doi.org/10.1075/gest.16.1.01deb
  • Dreschler, W. A., Verschuure, H., Ludvigsen, C., & Westermann, S. (2001). ICRA noises: Artificial noise signals with speech-like spectral and temporal properties for hearing instrument assessment. Audiology, 40(3), 148–157. https://doi.org/10.3109/00206090109073110
  • Drijvers, L., & Özyürek, A. (2017). Visual context enhanced: The joint contribution of iconic gestures and visible speech to degraded speech comprehension. Journal of Speech, Language, and Hearing Research, 60(1), 212–222. https://doi.org/10.1044/2016_JSLHR-H-16-0101
  • Drijvers, L., & Özyürek, A. (2020). Non-native listeners benefit less from gestures and visible speech than native listeners during degraded speech comprehension. Language and Speech, 63(2), 209–220. https://doi.org/10.1177/0023830919831311
  • Drijvers, L., Vaitonytė, J., & Özyürek, A. (2019). Degree of language experience modulates visual attention to visible speech and iconic gestures during clear and degraded speech comprehension. Cognitive Science, 43(10), e12789. https://doi.org/10.1111/cogs.12789
  • Emmorey, K., & Özyürek, A. (2014). Language in our hands: Neural underpinnings of sign language and co-speech gesture. In M. S. Gazzaniga & G. R. Mangun (Eds.), The cognitive neurosciences (5th ed., pp. 657–666). MIT Press.
  • Emmorey, K., Thompson, R., & Colvin, R. (2008). Eye gaze during comprehension of American sign language by native and beginning signers. The Journal of Deaf Studies and Deaf Education, 14(2), 237–243. https://doi.org/10.1093/deafed/enn037
  • Galvão, E. Z. (2013). Hand gestures and speech production in the booth: Do simultaneous interpreters imitate the speaker? In C. Carapinha & I. A. Santos (Eds.), Estudos de linguística (Vol. II, pp. 115–130). Imprensa da Universidade de Coimbra.
  • Garone, A. (2021). Perceived impact of remote simultaneous interpreting on auditory health public data set. Mendeley Data, V1. https://doi.org/10.17632/jvmf9gzpw8.1
  • Gerver, D. (1974). Simultaneous listening and speaking and retention of prose. Quarterly Journal of Experimental Psychology, 26(3), 337–341. https://doi.org/10.1080/14640747408400422
  • Gieshoff, A.-C. (2018). The impact of audio-visual speech input on work-load in simultaneous interpreting [Doctoral dissertation, Johannes Gutenberg-Universität Mainz]. Gutenberg Open Science. https://openscience.ub.uni-mainz.de/bitstream/20.500.12030/2182/1/100002183.pdf
  • Gullberg, M., & Holmqvist, K. (1999). Keeping an eye on gestures: Visual perception of gestures in face-to-face communication. Pragmatics & Cognition, 7(1), 35–63. https://doi.org/10.1075/pc.7.1.04gul
  • Gullberg, M., & Holmqvist, K. (2006). What speakers do and what addressees look at: Visual attention to gestures in human interaction live and on video. Pragmatics & Cognition, 14(1), 53–82. https://doi.org/10.1075/pc.14.1.05gul
  • Gullberg, M., & Kita, S. (2009). Attention to speech-accompanying gestures: Eye movements and information uptake. Journal of Nonverbal Behavior, 33(4), 251–277. https://doi.org/10.1007/s10919-009-0073-2
  • Hervais-Adelman, A., Moser-Mercer, B., & Golestani, N. (2015). Brain functional plasticity associated with the emergence of expertise in extreme language control. Neuroimage, 114, 264–274. https://doi.org/10.1016/j.neuroimage.2015.03.072
  • Holle, H., & Gunter, T. C. (2007). The role of iconic gestures in speech disambiguation: ERP evidence. Journal of Cognitive Neuroscience, 19(7), 1175–1192. https://doi.org/10.1162/jocn.2007.19.7.1175
  • Holle, H., Obleser, J., Rueschemeyer, S. A., & Gunter, T. C. (2010). Integration of iconic gestures and speech in left superior temporal areas boosts speech comprehension under adverse listening conditions. Neuroimage, 49(1), 875–884. https://doi.org/10.1016/j.neuroimage.2009.08.058
  • Hostetter, A. B. (2011). When do gestures communicate? A meta-analysis. Psychological Bulletin, 137(2), 297–315. https://doi.org/10.1037/a0022128
  • Hyönä, J., Tommola, J., & Alaja, A.-M. (1995). Pupil dilation as a measure of processing load in simultaneous interpretation and other language tasks. The Quarterly Journal of Experimental Psychology, 48(3), 598–612. https://doi.org/10.1080/14640749508401407
  • Ibáñez, A., Manes, F., Escobar, J., Trujillo, N., Andreucci, P., & Hurtado, E. (2010). Gesture influences the processing of figurative language in non-native speakers: ERP evidence. Neuroscience Letters, 471(1), 48–52. https://doi.org/10.1016/j.neulet.2010.01.009
  • International Organization for Standardization. (2016a). Simultaneous interpreting – Mobile booths – Requirements (ISO Standard no. 4043:2016). https://www.iso.org/standard/67066.html
  • International Organization for Standardization. (2016b). Simultaneous interpreting – Permanent booths – Requirements (ISO Standard no. 2603:2016). https://www.iso.org/standard/67065.html
  • International Organization for Standardization. (2017). Simultaneous interpreting – Quality and transmission of sound and image input – Requirements (ISO Standard no. 20108:2017). https://www.iso.org/standard/67062.html
  • Isham, W. P. (1994). Memory for sentence form after simultaneous interpretation: Evidence both for and against deverbalization. In S. Lambert & B. Moser-Mercer (Eds.), Bridging the gap: Empirical research in simultaneous interpretation (pp. 191–211). John Benjamins.
  • Jesse, A., Vrignaud, N., Cohen, M. M., & Massaro, D. W. (2000). The processing of information from multiple sources in simultaneous interpreting. Interpreting: International Journal of Research and Practice in Interpreting, 5(2), 95–115. https://doi.org/10.1075/intp.5.2.04jes
  • Kang, S., Hallman, G. L., Son, L. K., & Black, J. B. (2013). The different benefits from different gestures in understanding a concept. Journal of Science Education and Technology, 22(6), 825–837. https://doi.org/10.1007/s10956-012-9433-5
  • Kelly, S. D. (2017). Exploring the boundaries of gesture-speech integration during language comprehension. In R. B. Church, M. W. Alibali, & S. D. Kelly (Eds.), Why gesture? How the hands function in speaking, thinking and communicating (pp. 243–265). John Benjamins. https://doi.org/10.1075/gs.7.12kel
  • Kelly, S. D., Creigh, P., & Bartolotti, J. (2010a). Integrating speech and iconic gestures in a Stroop-like task: Evidence for automatic processing. Journal of Cognitive Neuroscience, 22(4), 683–694. https://doi.org/10.1162/jocn.2009.21254
  • Kelly, S. D., Healey, M., Özyürek, A., & Holler, J. (2015). The processing of speech, gesture, and action during language comprehension. Psychonomic Bulletin & Review, 22(2), 517–523. https://doi.org/10.3758/s13423-014-0681-7
  • Kelly, S. D., Özyürek, A., & Maris, E. (2010b). Two sides of the same coin: Speech and gesture mutually interact to enhance comprehension. Psychological Science, 21(2), 260–267. https://doi.org/10.1177/0956797609357327
  • Kendon, A. (1995). Gestures as illocutionary and discourse structure markers in Southern Italian conversation. Journal of Pragmatics, 23(3), 247–279. https://doi.org/10.1016/0378-2166(94)00037-F
  • Kendon, A. (2004). Gesture: Visible action as utterance. Cambridge University Press.
  • Król, M. E. (2018). Auditory noise increases the allocation of attention to the mouth, and the eyes pay the price: An eye-tracking study. PLoS One, 13(3), e0194491. https://doi.org/10.1371/journal.pone.0194491
  • Marian, V., Blumenfeld, H. K., & Kaushanskaya, M. (2007). The Language Experience and Proficiency Questionnaire (LEAP-Q): Assessing language profiles in bilinguals and multilinguals. Journal of Speech, Language, and Hearing Research, 50(4), 940–967. https://doi.org/10.1044/1092-4388(2007/067)
  • McNeill, D. (1985). So you think gestures are nonverbal? Psychological Review, 92(3), 350–371. https://doi.org/10.1037/0033-295X.92.3.350
  • McNeill, D. (1992). Hand and mind: What gestures reveal about thought. University of Chicago Press.
  • McNeill, D. (2005). Gesture and thought. University of Chicago Press. https://doi.org/10.7208/chicago/9780226514642.001.0001
  • Moser-Mercer, B., Frauenfelder, U., Casado, B., & Künzli, A. (2000). Searching to define expertise in interpreting. In B. Englund Dimitrova & K. Hyltenstam (Eds.), Language processing and simultaneous interpreting: Interdisciplinary perspectives (pp. 107–131). John Benjamins.
  • Nash, J. C. (2014). On best practice optimization methods in R. Journal of Statistical Software, 60(2), 1–14. https://doi.org/10.18637/jss.v060.i02
  • Nour, S., Struys, E., & Stengers, H. (2019). Attention network in interpreters: The role of training and experience. Behavioral Sciences, 9(4), 43. https://doi.org/10.3390/bs9040043
  • Obermeier, C., Dolk, T., & Gunter, T. C. (2012). The benefit of gestures during communication: Evidence from hearing and hearing-impaired individuals. Cortex, 48(7), 857–870. https://doi.org/10.1016/j.cortex.2011.02.007
  • Özer, D., & Göksun, T. (2020). Visual-spatial and verbal abilities differentially affect processing of gestural vs. spoken expressions. Language, Cognition and Neuroscience, 35(7), 896–914. https://doi.org/10.1080/23273798.2019.1703016
  • Özer, D., Karadöller, D. Z., Özyürek, A., & Göksun, T. (2023). Gestures cued by demonstratives in speech guide listeners’ visual attention during spatial language comprehension. Journal of Experimental Psychology: General, 152(9), 2623–2635. https://doi.org/10.1037/xge0001402
  • Özyürek, A. (2014). Hearing and seeing meaning in speech and gesture: Insights from brain and behaviour. Philosophical Transactions of the Royal Society B: Biological Sciences, 369(1651), 20130296. https://doi.org/10.1098/rstb.2013.0296
  • Özyürek, A. (2018). Role of gesture in language processing: Toward a unified account for production and comprehension. In S.-A. Rueschemeyer & M. G. Gaskell (Eds.), Oxford handbook of psycholinguistics (2nd ed., pp. 592–607). Oxford University Press.
  • Özyürek, A., Willems, R. M., Kita, S., & Hagoort, P. (2007). On-line integration of semantic information from speech and gesture: Insights from event-related brain potentials. Journal of Cognitive Neuroscience, 19(4), 605–616. https://doi.org/10.1162/jocn.2007.19.4.605
  • R Core Team. (2013). R: A language and environment for statistical computing. http://www.R-project.org/
  • Rennert, S. (2008). Visual input in simultaneous interpreting. Meta, 53(1), 204–217. https://doi.org/10.7202/017983ar
  • Rennig, J., Wegner-Clemens, K., & Beauchamp, M. S. (2020). Face viewing behavior predicts multisensory gain during speech perception. Psychonomic Bulletin & Review, 27(1), 70–77. https://doi.org/10.3758/s13423-019-01665-y
  • Riccardi, A. (1996). Language-specific strategies in simultaneous interpreting. In C. Dollerup & V. Appel (Eds.), New horizons – Teaching translation and interpreting (pp. 213–222). John Benjamins.
  • Rimé, B., Boulanger, B., & d'Ydewalle, G. (1988, August 28–September 2). Visual attention to the communicator's nonverbal behavior as a function of the intelligibility of the message [Paper presentation]. 24th International Congress of Psychology, Sydney, Australia.
  • Rogers, W. T. (1978). The contribution of kinesic illustrators toward the comprehension of verbal behavior within utterances. Human Communication Research, 5(1), 54–62. https://doi.org/10.1111/j.1468-2958.1978.tb00622.x
  • Seeber, K. G. (2011, May 12–14). Multimodal input in simultaneous interpreting: An eye-tracking experiment [Paper presentation]. 1st International Conference TRANSLATA, Translation & Interpreting Research: Yesterday – Today – Tomorrow, Innsbruck, Austria.
  • Seeber, K. G. (2017). Multimodal processing in simultaneous interpreting. In J. W. Schwieter & A. Ferreira (Eds.), The handbook of translation and cognition (pp. 461–475). Wiley Blackwell. https://doi.org/10.1002/9781119241485.ch25
  • Seeber, K. G., & Fox, B. (2021). Distance conference interpreting. In M. Albl-Mikasa & E. Tiselius (Eds.), The Routledge handbook of conference interpreting (pp. 491–507). Routledge. https://doi.org/10.4324/9780429297878
  • Snodgrass, J. G., & Vanderwart, M. (1980). A standardized set of 260 pictures: Norms for name agreement, image agreement, familiarity, and visual complexity. Journal of Experimental Psychology: Human Learning and Memory, 6(2), 174–215. https://doi.org/10.1037/0278-7393.6.2.174
  • Spinolo, N., & Chmiel, A. (2021). Inside the virtual booth: The impact of remote interpreting settings on interpreter experience and performance. https://www.aiic.org.br/site/blog/the-virtual-booth
  • Stachowiak-Szymczak, K. (2019). Eye movements and gestures in simultaneous and consecutive interpreting. Springer.
  • Streeck, J. (2008). Gesture in political communication: A case study of the democratic presidential candidates during the 2004 primary campaign. Research on Language and Social Interaction, 41(2), 154–186. https://doi.org/10.1080/08351810802028662
  • Strobach, T., Becker, M., Schubert, T., & Kühn, S. (2015). Better dual-task processing in simultaneous interpreters. Frontiers in Psychology, 6, Article 1590. https://doi.org/10.3389/fpsyg.2015.01590
  • Sueyoshi, A., & Hardison, D. M. (2005). The role of gestures and facial cues in second language listening comprehension. Language Learning, 55(4), 661–699. https://doi.org/10.1111/j.0023-8333.2005.00320.x
  • Szekely, A., Jacobsen, T., D'Amico, S., Devescovi, A., Andonova, E., Herron, D., Lu, C. C., Pechmann, T., Pléh, C., Wicha, N., Federmeier, K., Gerdjikova, I., Gutierrez, G., Hung, D., Hsu, J., Iyer, G., Kohnert, K., Mehotcheva, T., Orozco-Figueroa, A., … Bates, E. (2004). A new on-line resource for psycholinguistic studies. Journal of Memory and Language, 51(2), 247–250. https://doi.org/10.1016/j.jml.2004.03.002
  • Tommola, J., & Lindholm, J. (1995). Experimental research on interpreting: Which dependent variable? In J. Tommola (Ed.), Topics in interpreting research (pp. 121–133). University of Turku.
  • Vatikiotis-Bateson, E., Eigsti, I. M., Yano, S., & Munhall, K. G. (1998). Eye movement of perceivers during audiovisual speech perception. Perception & Psychophysics, 60(6), 926–940. https://doi.org/10.3758/BF03211929
  • Yudes, C., Macizo, P., & Bajo, T. (2011). The influence of expertise in simultaneous interpreting on non-verbal executive processes. Frontiers in Psychology, 2, Article 309. https://doi.org/10.3389/fpsyg.2011.00309