ABSTRACT
Perception can’t have disjunctive content. Whereas you can think that a box is blue or red, you can’t see a box as being blue or red. Based on this fact, I develop a new problem for the ambitious predictive processing theory, on which the brain is a machine for minimizing prediction error, which approximately implements Bayesian inference. I describe a simple case of updating a disjunctive belief given perceptual experience of one of the disjuncts, in which Bayesian inference and predictive coding pull in opposite directions, with the former implying that one’s confidence in the belief should increase, and the latter implying that it should decrease. Thus, predictive coding fails to approximately implement Bayesian inference across the interface between belief and perception.
Acknowledgements
I would like to thank Dan Ryder for many lengthy, illuminating, and insightful discussions of the ideas developed in this paper, as well as for encouragement and support. I’m also thankful to Ben Henke for very helpful discussions at crucial stages of this project. This work benefited from detailed and perceptive comments from Daniel Burnston, Arnon Cahen, Julia Haas, Arnon Keren, Nicolas Porot, Jonna Vance, and Petra Vetter.
I have presented earlier versions of this paper at the Centre for Philosophical Psychology at The University of Antwerp, at the Department of Cognitive Science at the University of Haifa, at the 10th meeting of the European Society for Analytic Philosophy (Utrecht), at the Department of Philosophy at the Hebrew University of Jerusalem, and at the Philosophy Colloquium of Tel-Hai Academic College.
I would also like to thank Yochai Ataria, Dan Baras, Jonathan Berg, Adam Bradley, Peter Brössel, Baruch Eitam, David Enoch, Hagit Hel-Or, Uri Hertz, Aviv Keren, Assaf Kron, Jason Leddington, Arnon Levy, Oded Na’aman, Bence Nanay, Ittay Nissan-Rozen, Hadas Okon-Singer, Eli Pitcovski, Gal Richter-Levin, Eva Schmidt, Oron Shagrir, Jerry Viera, Preston Werner, and Yaffa Yeshurun for helpful comments.
Disclosure Statement
No potential conflict of interest was reported by the author(s).
Notes
1 I am assuming that assigning probabilities to hypotheses in the PP framework is equivalent to assigning degrees of confidence to beliefs. But it is also possible to formulate the argument directly in terms of hypotheses rather than beliefs; see Section 4.6.
2 Since cir and sqr are mutually independent, one can use the restricted conjunction rule: $P(\mathrm{cir} \wedge \mathrm{sqr}) = P(\mathrm{cir}) \times P(\mathrm{sqr})$. Thus, $P(\mathrm{cir} \vee \mathrm{sqr}) = P(\mathrm{cir}) + P(\mathrm{sqr}) - P(\mathrm{cir}) \times P(\mathrm{sqr})$.
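The arithmetic behind the restricted conjunction rule and the resulting prior of the disjunction can be checked with a minimal numerical sketch. The specific prior values below are illustrative assumptions introduced here (only $P(\mathrm{sqr}) = 0.08$ appears in the notes; $P(\mathrm{cir})$ is hypothetical):

```python
# Prior probability of a disjunction of two mutually independent
# propositions, via the restricted conjunction rule P(A & B) = P(A) * P(B)
# and the general disjunction rule P(A v B) = P(A) + P(B) - P(A & B).
# Numeric priors are illustrative assumptions, not values from the paper.
p_cir = 0.05  # hypothetical prior that a circle is ahead
p_sqr = 0.08  # prior that a square is ahead (the value assumed in note 8)

p_conj = p_cir * p_sqr            # restricted conjunction rule
p_disj = p_cir + p_sqr - p_conj   # general disjunction rule
print(round(p_disj, 4))           # ≈ 0.126
```

With these assumed priors, the disjunction is antecedently more probable than either disjunct alone, as the general disjunction rule guarantees.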
3 Assuming that the posterior probability of the proposition that there is a circle ahead drops to a value near zero when looking and seeing only a triangle, the posterior probability of the disjunction drops accordingly.
4 To be more precise, a (circle-related) prediction error signal would still be formed, because Rawa believes that there is a circle ahead, and this leads to prediction error. The mere formation of a circle-related prediction error signal is not by itself a problem. The problem, as explained in the rest of the section, is that a (circle-related) prediction error signal leads to lowering the confidence in the disjunctive belief that produced it. This highlights the fact that my argument presupposes that disjunctive beliefs are tested (via PC) against perceptual evidence, in addition to the fact that logically simpler, non-disjunctive beliefs (such as the belief that there is a circle ahead) are tested against perceptual evidence. I defend this claim in Section 4.7.
5 Here is an illustration: Assume that the prior probability that an object $o$ is ahead is $P(o)$. The probability of having an experience as of $o$ ('$E_o$') given that $o$ is ahead, $P(E_o \mid o)$, is very high. The probability of having $E_o$ when there is no $o$ ahead (i.e., an illusion as of $o$), $P(E_o \mid \neg o)$, is very low (the visual system is functioning normally in normal conditions). So the prior probability $P(E_o)$ is equal to $P(E_o \mid o) \times P(o) + P(E_o \mid \neg o) \times P(\neg o)$. Plugged into Bayes's rule: $P(o \mid E_o) = \dfrac{P(E_o \mid o) \times P(o)}{P(E_o)}$.
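The Bayesian update described in this note can be verified with a short computation. The numerical values below are illustrative assumptions in the spirit of the note (a very high veridical likelihood and a very low illusion likelihood), since the note's original figures are not recoverable here:

```python
# Posterior probability that object o is ahead, given an experience as of o
# (E_o), computed by Bayes's rule with the total probability P(E_o) in the
# denominator. All numeric values are illustrative assumptions.
p_o = 0.08               # assumed prior that o is ahead
p_e_given_o = 0.9        # assumed likelihood of E_o when o is ahead (very high)
p_e_given_not_o = 0.001  # assumed likelihood of an illusion as of o (very low)

# Total probability: P(E_o) = P(E_o|o)P(o) + P(E_o|not-o)P(not-o)
p_e = p_e_given_o * p_o + p_e_given_not_o * (1 - p_o)

# Bayes's rule: P(o|E_o) = P(E_o|o)P(o) / P(E_o)
posterior = p_e_given_o * p_o / p_e
print(round(posterior, 3))  # 0.987
```

Even from a modest prior, the experience raises the probability that $o$ is ahead to nearly 1, because a veridical experience is vastly more likely than an illusion under these assumptions.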
6 I would like to thank an anonymous reviewer for suggesting this argument.
7 I would like to thank an anonymous reviewer for pointing this out.
8 Roughly (assuming that $P(\mathrm{sqr}) = 0.08$): $P(\mathrm{sqr} \mid E_{\mathrm{sqr}}) = \dfrac{P(E_{\mathrm{sqr}} \mid \mathrm{sqr}) \times 0.08}{P(E_{\mathrm{sqr}} \mid \mathrm{sqr}) \times 0.08 + P(E_{\mathrm{sqr}} \mid \neg\mathrm{sqr}) \times 0.92}$.
9 Proposed by an anonymous reviewer.
10 I would like to thank an anonymous reviewer for raising this objection.
11 I would like to thank an anonymous reviewer for raising this objection.