ABSTRACT
Real-world visual search targets are frequently imperfect perceptual matches to our internal templates. For example, a friend will wear different clothes, hairstyles, and accessories on different occasions, but some of these features may vary more than others. The ability to deal with template-to-target variability is important for visual search in natural environments, but we know relatively little about how the attentional system handles it. Here, we test the hypothesis that top-down attentional biases are sensitive to the variance of target features and prioritize less-variable dimensions. Subjects were shown target cues composed of coloured dots moving in a specific direction, followed by either a working memory probe (30% of trials) or a visual search display (70% of trials). Critically, the target features in the visual search display differed from the cue: one feature was drawn from a narrow distribution (low-variance dimension), and the other was sampled from a broader distribution (high-variance dimension). The results demonstrate that subjects used knowledge of the likely cue-to-target variance to set template precision and bias attentional selection. Our results suggest that observers are sensitive to the variance of feature dimensions within a target and use this information to weight mechanisms of attentional selection.
Acknowledgements
Support for this work was provided by T-32 EY015387 to P.W. and R01MH113855-01 to J.J.G. We would like to thank Connor Allen, Henry Moore, April Lou, and Megnha Advani for assistance in data collection.
Data availability statement
These data are available on Open Science Framework (OSF) at this location: https://osf.io/ep9sa/.
Disclosure statement
No potential conflict of interest was reported by the authors.