
Efficient tuning of attention to narrow and broad ranges of task-relevant feature values

Pages 63-84 | Received 23 Sep 2022, Accepted 15 Mar 2023, Published online: 28 Mar 2023
 

ABSTRACT

Feature-based attention is the ability to select relevant information based on visual features, such as a particular colour or motion direction. In contrast to spatial attention, where the attentional focus has been shown to be flexibly adjustable to select small or large regions in space, it is unclear whether feature-based attention can be efficiently tuned to different feature ranges. Here, we establish that the focus of feature-based attention can be adjusted more broadly or narrowly to select currently relevant features. Participants attended to a set of target-coloured dots among distractor dots to detect brief decreases in luminance (Experiments 1a, 1b, 2) or bursts of coherent motion (Experiments 3a, 3b, 4), while the range of colours that the target dots spanned varied across trials. We found that participants’ performance decreased with larger feature ranges to select, but remained at a relatively high level even at the largest colour range. Our findings suggest that broadening the focus of feature-based attention comes only at a small cost and that selecting large swaths of feature space is surprisingly efficient. These results are consistent with accounts that propose a flexible and generalized set of attentional mechanisms that act across both spatial and feature-based domains.

Acknowledgements

The authors thank Audrey Barszcz, Youngjin Choi, Lora Hsu, and Ashley Williams for assistance with data collection. Both authors contributed to the study concept and design. Data collection and analysis was conducted by AFC under supervision of VSS. AFC drafted the manuscript and VSS provided critical revisions. Both authors approved the final version of the manuscript for submission.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

Data and analysis scripts are available on OSF at https://osf.io/uf4k6/

Additional information

Funding

This research was supported by a grant from the National Science Foundation (BCS-1850738).
