Forthcoming Special Issue on: Visual Search and Selective Attention

Distractor rejection in parallel search tasks takes time but does not benefit from context repetition

Ng, G. J. P., Buetti, S., Dolcos, S., Dolcos, F., & Lleras, A.
Pages 609-625 | Received 18 Mar 2019, Accepted 27 Sep 2019, Published online: 10 Oct 2019

