Language-induced visual and semantic biases in visual search are subject to task requirements

Floor de Groot, Falk Huettig & Christian N. L. Olivers
Pages 225-240 | Received 19 Dec 2016, Accepted 23 Apr 2017, Published online: 25 May 2017
