Original Article

Neural model of visual stereomatching: slant, transparency and clouds

Pages 635-669 | Received 03 Sep 1996, Published online: 09 Jul 2009

Abstract

Stereomatching of oblique and transparent surfaces is described using a model of cortical binocular ‘tuned’ neurons selective for disparities of individual visual features and neurons selective for the position, depth and 3D orientation of local surface patches. The model is based on a simple set of learning rules. In the model, monocular neurons project excitatory connection pathways to binocular neurons at appropriate disparities. Binocular neurons project excitatory connection pathways to appropriately tuned ‘surface patch’ neurons. The surface patch neurons project reciprocal excitatory connection pathways to the binocular neurons. Anisotropic intralayer inhibitory connection pathways project between neurons with overlapping receptive fields. The model's responses to simulated stereo image pairs depicting a variety of oblique surfaces and transparently overlaid surfaces are presented. For all the surfaces, the model (i) assigns disparity matches and surface patch representations based on global surface coherence and uniqueness, (ii) permits coactivation of neurons representing multiple disparities within the same image location, (iii) represents oblique slanted and tilted surfaces directly, rather than approximating them with a series of frontoparallel steps, (iv) assigns disparities to a cloud of points at random depths, like human observers and unlike Prazdny's (1985) method, and (v) causes globally consistent matches to override greedy local matches. The model represents transparency, unlike the model of Marr and Poggio (1976), and it assigns unique disparities, unlike the model of Prazdny.
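The abstract describes a three-layer cooperative circuit: feedforward excitation from monocular to binocular disparity-tuned units, reciprocal excitation between binocular units and slant-tuned surface-patch units, and anisotropic inhibition within layers. The NumPy sketch below illustrates one way such a relaxation could be organised; the 1-D stimulus, the pooling radius, the mixing constants and all variable names are illustrative assumptions, not the authors' published learning rules or equations.

```python
import numpy as np

rng = np.random.default_rng(0)
W, D, S, R = 40, 7, 5, 3   # 1-D image width, disparity range, slant channels, pool radius

# Feedforward drive: monocular features project to binocular 'tuned' units.
# A toy 1-D random pattern with a single surface at disparity 3 stands in
# for the random-dot stereograms used in the paper.
scene = rng.random(W + D)
left, right = scene, scene[3:3 + W]
b = np.array([[1.0 - abs(left[x + d] - right[x]) for d in range(D)]
              for x in range(W)])               # b[x, d]: binocular activity

slopes = np.linspace(-1.0, 1.0, S)              # disparity gradients preferred by patch units

for _ in range(25):
    # Surface-patch units pool binocular activity along their preferred
    # disparity gradient, so oblique surfaces receive direct support instead
    # of being approximated by a series of frontoparallel steps.
    p = np.zeros((W, D, S))
    for s, k in enumerate(slopes):
        for u in range(-R, R + 1):
            xs = np.clip(np.arange(W) + u, 0, W - 1)
            for d in range(D):
                dd = int(np.clip(round(d + k * u), 0, D - 1))
                p[:, d, s] += b[xs, dd]
    p /= 2 * R + 1

    # Reciprocal excitation: each binocular unit is supported by its best
    # surface patch. Subtracting part of the column mean acts as a soft
    # uniqueness constraint that can still leave two strong peaks at one
    # location, as transparency requires.
    b = 0.5 * b + 0.5 * p.max(axis=2)
    b = np.maximum(b - 0.3 * b.mean(axis=1, keepdims=True), 0.0)
    b /= b.max() + 1e-9

print("winning disparity per column:", b.argmax(axis=1))   # should settle near 3
```

On this toy input, feedback from the slant-tuned pool lets a globally coherent surface suppress locally greedy matches, which is the behaviour points (i) and (v) of the abstract describe.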
