Perception and Disjunctive Belief: A New Problem for Ambitious Predictive Processing

Pages 449-464 | Received 24 Aug 2021, Accepted 20 Nov 2022, Published online: 07 Sep 2023

ABSTRACT

Perception can’t have disjunctive content. Whereas you can think that a box is blue or red, you can’t see a box as being blue or red. Based on this fact, I develop a new problem for the ambitious predictive processing theory, on which the brain is a prediction-error-minimizing machine that approximately implements Bayesian inference. I describe a simple case of updating a disjunctive belief given perceptual experience of one of the disjuncts, in which Bayesian inference and predictive coding pull in opposite directions: the former implies that one’s confidence in the belief should increase, while the latter implies that it should decrease. Thus, predictive coding fails to approximately implement Bayesian inference across the interface between belief and perception.

Acknowledgements

I would like to thank Dan Ryder for many lengthy, illuminating, and insightful discussions of the ideas developed in this paper, as well as for encouragement and support. I’m also thankful to Ben Henke for very helpful discussions at crucial stages of this project. This work benefited from detailed and perceptive comments from Daniel Burnston, Arnon Cahen, Julia Haas, Arnon Keren, Nicolas Porot, Jonna Vance, and Petra Vetter.

I have presented earlier versions of this paper at the Centre for Philosophical Psychology at The University of Antwerp, at the Department of Cognitive Science at the University of Haifa, at the 10th meeting of the European Society for Analytic Philosophy (Utrecht), at the Department of Philosophy at the Hebrew University of Jerusalem, and at the Philosophy Colloquium of Tel-Hai Academic College.

I would also like to thank Yochai Ataria, Dan Baras, Jonathan Berg, Adam Bradley, Peter Brössel, Baruch Eitam, David Enoch, Hagit Hel-Or, Uri Hertz, Aviv Keren, Assaf Kron, Jason Leddington, Arnon Levy, Oded Na’aman, Bence Nanay, Ittay Nissan-Rozen, Hadas Okon-Singer, Eli Pitcovski, Gal Richter-Levin, Eva Schmidt, Oron Shagrir, Jerry Viera, Preston Werner, and Yaffa Yeshurun for helpful comments.

Disclosure Statement

No potential conflict of interest was reported by the author(s).

Notes

1 I am assuming that assigning probabilities to hypotheses in the PP framework is equivalent to assigning degrees of confidence to beliefs. But it is also possible to formulate the argument directly in terms of hypotheses rather than beliefs; see Section 4.6.

2 Since A and B are mutually independent, one can use the restricted conjunction rule: $P(A \land B) = P(A) \times P(B)$. Thus, $P(A \lor B) = P(A) + P(B) - P(A \land B) = P(A) + P(B) - P(A) \times P(B) = 0.8 + 0.6 - 0.8 \times 0.6 = 0.92$.
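A quick way to verify the arithmetic is to run it. Here is a minimal Python sketch of the note's inclusion-exclusion computation (the variable names are mine, not the paper's):

```python
# Minimal check of note 2, assuming propositions A and B are independent.
p_a = 0.8  # P(A)
p_b = 0.6  # P(B)

# Restricted conjunction rule for independent propositions: P(A & B) = P(A) * P(B)
p_a_and_b = p_a * p_b

# General disjunction rule: P(A v B) = P(A) + P(B) - P(A & B)
p_a_or_b = p_a + p_b - p_a_and_b
print(p_a_or_b)  # 0.92 (up to float rounding)
```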

3 Assuming that the posterior probability of the proposition that there is a circle ahead drops to, say, 0.001 when looking and seeing only a triangle, $P(A \lor B) = 0.001 + 0.9999 - 0.001 \times 0.9999 = 0.9999001$.
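The same disjunction rule checks out numerically for this note too (a sketch; the 0.9999 figure for the other disjunct is taken from the note):

```python
# Minimal check of note 3: the disjunction after the circle posterior drops.
p_circle = 0.001  # posterior that a circle is ahead, after seeing only a triangle
p_other = 0.9999  # posterior for the other disjunct (the note's figure)
p_disjunction = p_circle + p_other - p_circle * p_other
print(p_disjunction)  # ≈ 0.9999001
```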

4 To be more precise, a (circle-related) prediction error signal would still be formed, because Rawa believes that there is a circle ahead, and this leads to prediction error. The mere formation of a circle-related prediction error signal is not by itself a problem. The problem, as explained in the rest of the section, is that a (circle-related) prediction error signal leads to lowering the confidence in the disjunctive belief that produced it. This highlights the fact that my argument presupposes that disjunctive beliefs are tested (via PC) against perceptual evidence, in addition to the fact that logically simpler, non-disjunctive beliefs (such as the belief that there is a circle ahead) are tested against perceptual evidence. I defend this claim in Section 4.7.

5 Here is an illustration: Assume that the prior probability that C is ahead is 0.3. The probability of having an experience as of C (‘$E_C$’) given that C is ahead is very high, say 0.999. The probability of having $E_C$ when there is no C ahead (i.e., an illusion as of C) is very low, since the visual system is functioning normally in normal conditions; say 0.00001. So the prior probability $P(E_C)$ is $0.999 \times 0.3 + 0.00001 \times 0.7 \approx 0.3$. Plugged into Bayes’s rule: $P(A \lor B \mid E_C) = \frac{P(E_C \mid A \lor B) \times P(A \lor B)}{P(E_C)} = \frac{0.00001 \times 0.6}{0.3} = 0.00002$.
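As a sketch under the note's assumptions (variable names are mine), the total-probability and Bayes steps come out as follows:

```python
# Check of note 5: prior of the experience Ec, then Bayes's rule for the disjunction.
p_c = 0.3                   # prior that C is ahead
p_ec_given_c = 0.999        # P(Ec | C): veridical experience is very likely
p_ec_given_not_c = 0.00001  # P(Ec | not-C): an illusion as of C is very rare

# Total probability: P(Ec) = P(Ec | C) P(C) + P(Ec | not-C) P(not-C)
p_ec = p_ec_given_c * p_c + p_ec_given_not_c * (1 - p_c)
print(p_ec)  # ≈ 0.2997, i.e., roughly the 0.3 used in the note

# Bayes's rule for the disjunction, with P(A v B) = 0.6 and
# P(Ec | A v B) = 0.00001 as in the note (using the rounded P(Ec) = 0.3):
posterior = 0.00001 * 0.6 / 0.3
print(posterior)  # 2e-05, i.e., 0.00002
```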

6 I would like to thank an anonymous reviewer for suggesting this argument.

7 I would like to thank an anonymous reviewer for pointing this out.

8 Roughly (assuming that $P(\text{sqr}) = 0.08$): $P(\text{tri} \lor \text{cir} \mid E_{\text{sqr}}) = \frac{P(E_{\text{sqr}} \mid \text{tri} \lor \text{cir}) \times P(\text{tri} \lor \text{cir})}{P(E_{\text{sqr}})} = \frac{0.00001 \times 0.92}{0.00001 \times 0.92 + 0.999 \times 0.08} \approx 0.000115$.
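And a corresponding sketch of this note's computation, expanding the denominator by total probability (variable names are mine):

```python
# Check of note 8: posterior of tri-or-cir given an experience as of a square.
p_tri_or_cir = 0.92          # prior of the disjunction (note 2)
p_sqr = 0.08                 # prior that a square is ahead (the note's assumption)
p_esqr_given_disj = 0.00001  # an illusory square experience is very rare
p_esqr_given_sqr = 0.999     # a veridical square experience is very likely

# Bayes's rule, with P(Esqr) expanded by total probability:
numerator = p_esqr_given_disj * p_tri_or_cir
posterior = numerator / (numerator + p_esqr_given_sqr * p_sqr)
print(posterior)  # ≈ 0.000115
```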

9 Proposed by an anonymous reviewer.

10 I would like to thank an anonymous reviewer for raising this objection.

11 I would like to thank an anonymous reviewer for raising this objection.

Additional information

Funding

This research was supported by ISF grant no. 715/20 and by a grant from the John Templeton Foundation (Prime Award no. 48365) as part of the Summer Seminars in Neuroscience and Philosophy (SSNAP, subcontract no. 283-2608).
