
Nonblurred regions show priority for gaze direction over spatial blur

Pages 927-945 | Received 26 Mar 2012, Published online: 04 Oct 2012
Abstract

The human eye continuously forms images of our 3D environment using a finite and dynamically changing depth of focus. Since different objects in our environment reside at different depth planes, the resulting retinal images consist of both focused and spatially blurred objects concurrently. Here, we wanted to measure what effect such a mixed visual diet may have on the pattern of eye movements. For that, we constructed composite stimuli, each containing an intact photograph and several progressively blurred versions of it, all arranged in a 3 × 3 square array and presented simultaneously as a single image. We measured eye movements for 7 such composite stimuli as well as for their corresponding root mean square (RMS) contrast-equated versions, to control for any potential contrast variations resulting from the blurring. We found that when observers are presented with such arrays of blurred and nonblurred images, they fixate significantly more frequently on the stimulus regions that have little or no blur (p < .001). A similar pattern of fixations was found for the RMS contrast-equated versions of the stimuli, indicating that the observed distribution of fixations is not simply the result of variations in image contrast due to spatial blurring. Further analysis revealed that, during each 5-second presentation, the image regions containing little or no spatial blur were fixated first, while other regions with larger amounts of blur were fixated later, if at all. The results add to the growing list of stimulus parameters that affect patterns of eye movements during scene perception.
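For readers who want a concrete sense of the stimulus construction described above, the following is a minimal sketch (not the authors' code) of how one might assemble a 3 × 3 composite from a single image, with each tile blurred by a different amount and an optional RMS-contrast-equated variant. The blur levels, the tile layout, and the use of a grayscale image scaled to [0, 1] are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch: build a 3x3 composite of one intact tile plus
# progressively Gaussian-blurred copies, optionally equating RMS contrast.
# Assumes a grayscale image with values in [0, 1]; all parameters are hypothetical.
import numpy as np
from scipy.ndimage import gaussian_filter

def rms_contrast(tile):
    """RMS contrast taken here as the standard deviation of pixel intensities."""
    return tile.std()

def equate_rms(tile, target_rms):
    """Rescale a tile about its mean so its RMS contrast matches target_rms."""
    mean, current = tile.mean(), tile.std()
    if current == 0:
        return tile
    out = (tile - mean) * (target_rms / current) + mean
    return np.clip(out, 0.0, 1.0)

def make_composite(image, sigmas, equate=False):
    """
    Build a 3x3 composite: each tile is the same image blurred with a
    different Gaussian sigma (sigma = 0 leaves the tile intact).
    If equate=True, every tile is rescaled to the intact tile's RMS contrast.
    """
    assert len(sigmas) == 9, "need one blur level per tile of the 3x3 array"
    tiles = [gaussian_filter(image, s) if s > 0 else image.copy() for s in sigmas]
    if equate:
        target = rms_contrast(image)
        tiles = [equate_rms(t, target) for t in tiles]
    rows = [np.hstack(tiles[r * 3:(r + 1) * 3]) for r in range(3)]
    return np.vstack(rows)

# Example usage with a random stand-in for a natural photograph and a
# hypothetical layout: intact tile in the centre, blur increasing outward.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
sigmas = [8, 4, 8, 4, 0, 4, 8, 4, 8]  # illustrative blur levels in pixels
composite = make_composite(img, sigmas, equate=True)
```

Equating each tile to a common RMS contrast, as sketched here, is one way to separate the effect of spatial blur itself from the contrast reduction that blurring produces, which is the role the contrast-equated control stimuli play in the study.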

Acknowledgments

This research was supported by a Biotechnology and Biological Sciences Research Council (BBSRC) quota studentship (W.S.S.) and a Wellcome Trust JIF Grant 058711 (Y.T.). Sincere thanks to Robert Martin for developing the software for processing the eye movements. We thank the reviewers for their very helpful comments and suggestions.
