Commentaries

Self-modeling epistemic spaces and the contraction principle

Pages 197-201 | Received 19 Nov 2019, Accepted 06 Feb 2020, Published online: 09 Mar 2020

1. Strong points: The contribution AST makes

What Graziano and colleagues describe as the “attention schema” really is one special case of what I have called the “phenomenal model of the intentionality relation” (PMIR) since 1993 (Metzinger, 2003, chap. 6; Metzinger, 1993, 2000, 2005, 2006; Metzinger & Gallese, 2003). For example, we also find PMIRs in motor selection and consciously experienced volition (“the self in the act of selecting an action goal”; cf. Metzinger, 2006) as well as in high-level cognitive self-control (“the thinking self in the act of grasping a concept”; cf. Metzinger, 2005). My central point was that the classical intentionality relation (as introduced by Franz Brentano in 1874) can itself form the content of a conscious mental representation, because human brains are intentionality-modelling systems. If you will, we have and constantly use “intentionality schemata” (not only an “attention schema”), and on the level of conscious processing such models generate what we later call the first-person perspective: a representation of the whole organism as directed at a perceptual target, at abstract symbolic content, or at a goal state. In beings like us, there sometimes exists a phenomenal model of the intentionality relation itself. We have, as it were, the capacity to “catch ourselves in the act”: At times we have higher-order conscious representations of ourselves as representing. This can, for example, create a model of the self in the act of attending, and of the ability to control the focus of attention. I have called this special case “attentional agency” elsewhere (e.g., Metzinger, 2003, 2006, 2013, 2017).

Graziano and colleagues importantly point out how such models of subject/object relations form a potential bridge for social cognition and mind reading (Graziano, Guterstam, Bio, & Wilterson, 2019, p. 10). I fully agree with this. Attention schemata can be shared, dynamically coupled, and coordinated among agents, and sometimes this very process can itself be consciously experienced. In Metzinger (2005, figure 4) I have depicted such a configuration; Metzinger (2006, p. 30, figure 2.5b) explains the same principle not for attention but for volition: In other-agent modelling, we can also represent intentions and action goals in our social environment. Interestingly, it is empirically plausible that this process has unconscious functional precursors in the motor system, for example in embedded motor schemata and the coding of effector-target relations (Metzinger & Gallese, 2003). Evidence demonstrates that the brain models movements and action goals in terms of multimodal representations of organism-object relations. One may therefore speculate that, under an evolutionary perspective, there may be common unconscious precursors interestingly linking Graziano’s attention schema and the PMIR to motor control and the embodied simulation of other agents (Gallese & Sinigaglia, 2011).

2. Non-dual awareness as a counterexample

What immediately falsifies AST is the existence of a well-documented class of phenomenal states lacking any form of attentional control or attention schema, so-called “non-dual states of awareness” (NDA). NDA-states are called “non-dual” because they lack subject/object structure. They also lack time-representation, self-location in a spatial frame of reference, attentional focus or control, and an egoic self-model (i.e., any representation of an agent, a “knowing” or “attending” self, or an entity enduring over time). Graziano interestingly discusses something very close to NDA in Chapter 9 of his book Consciousness and the Social Brain, describing it as “awareness free floating, unattached to its subject or its object” (Graziano, 2015). The phenomenon of non-dual awareness has been mostly reported and researched in the context of contemplative practice and is rapidly gaining attention (Dunne, 2011; Josipovic, 2019; Metzinger, 2020), and it also provides clear empirical evidence of awareness without attention (Graziano et al., 2019, p. 6). It can be described as a reflexively self-representing global experience of bare wakefulness or “empty cognizance” (for hypothetical neural correlates, see Josipovic, 2019, section 3; Metzinger, 2019, section 6; Winter et al., 2020).

Two other classes of phenomenal states presenting direct empirical counterexamples to AST are, first, all states of complete “ego-dissolution” (Metzinger & Millière, 2020), as for example caused by the administration of fast-acting serotonergic psychedelic drugs like N,N-dimethyltryptamine (DMT) and 5-methoxy-N,N-dimethyltryptamine (5-MeO-DMT) (Lebedev et al., 2015; Letheby, 2020; Millière, 2017; Millière, Carhart-Harris, Roseman, Trautwein, & Berkovich-Ohana, 2018; Nour & Carhart-Harris, 2017). Second, any phenomenological markers of attention or an attentional schema also seem to be absent in certain episodes of conscious experience which can occur during NREM-sleep (Windt, Nielsen, & Thompson, 2016) and which have provisionally been termed “lucid dreamless sleep” (Metzinger, 2019; Thompson, 2015; Windt, 2015b).

3. Consciousness as a predictive model not of attentional control, but of tonic alertness

AST describes complex configurations of consciousness. A standard model, however, would better be developed from a minimal model explanation (Batterman & Rice, 2014). The best current candidate for a truly minimal form of phenomenal experience (MPE; cf. Metzinger, 2020) is not a predictive model used in attentional control, but a predictive model of tonic alertness. To be conceptually precise, let us clearly distinguish between the physical, the functional, and the phenomenological levels of description.

“Arousal”:

  • A graded physical property of the human brain.

    • → A purely physical, but variable boundary condition determining the depth of cortical information processing available to the organism as a whole.

    • → A vital parameter that must be predictively controlled (e.g., to generate the sleep–wake cycle).

“Tonic alertness”:

  • A graded functional property determining the capacity for sustained attention.

    • → A causal function resulting from the successful control of arousal over longer periods of time, in the absence of an external cue.

    • → A functional property causally enabling cognitive capacities like orientation, executive control, attention, and epistemic agency on the mental level.

“Wakefulness”:

  • A graded phenomenal property which is sometimes introspectively accessible.

    • → A representation of tonic alertness.

    • → The phenomenal character of awareness as such.

    • → The primary dimension of phenomenal state space.

The specific phenomenology of “bare wakefulness” and “pure” consciousness without self-consciousness or an attentional first-person perspective is well documented (Metzinger, 2020). Therefore, MPE or minimal phenomenal experience is better conceptualized as an abstract Bayesian representation of an epistemic space than as a model of an epistemic agent operating in such a space. This would only be what Graziano has termed “m-consciousness” (see next section). What Graziano and colleagues call “i-consciousness” really is an internal model of the integrated epistemic space (an “ESM” hereafter) currently opened by a given system, and the as-yet-unpartitioned nature of this space is what partly explains the abstract and “gauzy” phenomenal character in the relevant experiences of “bare awareness” or “pure” consciousness (Graziano et al., 2019, p. 13). Call this the “ESM”-thesis:

(ESM): Minimal consciousness is an integrated inner representation of a space of possible knowledge states or epistemic policies, defined by a system’s set of epistemic capacities.

According to ESM, consciousness would then appear whenever an information-processing system has (a) created an internal epistemic space, and (b) activated a model of this very epistemic space within it (Metzinger, 2019, August 1, section 8.4.1; Metzinger, 2020). Attention is only one of the epistemic capacities that can be consciously modelled and predictively controlled once an organism has a stable representation of its own epistemic space. ESM is compatible with AST and global workspace theory, but likely not with HOT-approaches, because here we are dealing with a reflexive same-order representation of the global space itself (Graziano et al., 2019).
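Purely as an illustrative aid, and not as part of any formal apparatus in the literature, the logical structure of the two ESM conditions can be written down as a toy predicate over a schematic system description. All names used here (`satisfies_esm`, `epistemic_space`, `self_model`) are hypothetical and chosen only for this sketch; the sketch deliberately abstracts away everything about how such a space or model would be realized.

```python
# Toy sketch (hypothetical names): ESM as the conjunction of two conditions.
# A "system" is described schematically as a dict.

def satisfies_esm(system: dict) -> bool:
    """True iff the system meets both ESM conditions:
    (a) it has created an internal epistemic space (here: a non-empty set
        of possible knowledge states / epistemic policies), and
    (b) it has activated a model of this very epistemic space
        (a reflexive, same-order representation, not a higher-order one).
    """
    space = system.get("epistemic_space")
    model = system.get("self_model")
    has_space = bool(space)  # condition (a)
    models_own_space = (     # condition (b): the model targets the space itself
        model is not None and model.get("target") == "epistemic_space"
    )
    return has_space and models_own_space

# A system with an epistemic space but no model of it fails condition (b):
blind_system = {"epistemic_space": {"attend", "remember", "infer"}}

# A system that also models its own epistemic space satisfies both conditions:
reflexive_system = {
    "epistemic_space": {"attend", "remember", "infer"},
    "self_model": {"target": "epistemic_space"},
}
```

The point of the sketch is only that ESM is a conjunction, and that condition (b) is reflexive: the model's target is the space itself, not some further, higher-order representation of it.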

4. Bringing it all together: The contraction principle

How is the property of phenomenality, of simply “being conscious” as such, represented in different representational media? From a third-person scientific perspective, under empirically grounded theories of consciousness like those of Graziano and colleagues, it clearly is a property of a complex, internal model in the brain: What is conscious simply is a part of an organism’s model of the world, a specific processing layer in its internal model of reality, which typically also includes the organism itself and other agents in the world. “Phenomenality” is a property of an integrated, global state. This is a theory, and we know that it is a theory.

From the organism’s inner perspective, things are very different. The model is transparent; therefore, the representational medium is invisible. It is not experienced as a world-model, but simply as the world. There is full immersion, plus the phenomenology of direct realism. In addition, according to the internal model not experienced as a model, it now is the organism that is conscious, and not the world. There is a conscious self, and it forms the origin of a first-person perspective. Importantly, there will frequently be other self-aware agents that are also conscious, able to control their own attention, their thoughts, and their bodily movements. From the organism’s perspective, there is a social frame of reference, and the property of “being conscious” may be instantiated at multiple locations in this environment.

For example, in the dream state, neurotypical human beings will always experience other dream characters as also being conscious (Revonsuo, 2006). If human beings become lucid (and thereby aware of the fact that they are currently dreaming), then they will even know that neither the dream characters nor the dream environment really exist, but they will never experience the dream world, the content of the overall model currently active in the sleeping brain, as “being conscious”. Interestingly, the illusion that other dream characters are conscious will typically persist even after one has become aware that one is currently dreaming, like a localized “lucidity lapse” (McNamara, McLaren, Kowalczyk, & Pace-Schott, 2007; Windt, 2015a, p. 432). The illusion that dream characters have a conscious life of their own illustrates a robust coding principle of the human brain: For the organism, “phenomenality” is a local property either of itself or of another creature. It is not something global, but a local feature, and it can occur in multiple agents at the same time.

Call this the “contraction principle”:

(CP) “Phenomenality” is a property of certain integrated, global brain states. The brains of neurotypical human beings misrepresent this objectively given property of phenomenality by contracting it into a transparent conscious self-model, which then forms the origin of a first-person perspective.

There are counterexamples to CP, for example the NDA-states mentioned in the second section of this contribution. In such non-dual states, we find conscious experience without contraction, phenomenality without an egoic self-model, and awareness without the perspective created by top-down superimposition of an abstract subject/object prior. Therefore, NDA-states may be a much better entry point for the formulation of a first standard model of consciousness. AST explains only the perspective and the emergence of an attentional agent.

In their work, Graziano and colleagues have made too many valuable contributions to consciousness science to list them all. Here is what I think is the most important one: a better understanding of our need for an internal model to be used in the predictive control of attention and social cognition. The “attention schema”, perhaps better viewed as a computational tool for optimizing precision expectations in the brain and thereby giving rise to a temporally thick epistemic agent model (Friston, 2018; Metzinger, 2017, section 2.5), may help us to understand the contraction principle on a more formal level. Why? It is the subjectively experienced “sense of effort” in attentional agency that lies at the heart of the epistemic agent model, and the attention schema may well be the brain’s representational prototype in constructing the organism’s first-person perspective. This then is what Graziano and colleagues call the “ghost version of i-consciousness” (Graziano et al., 2019).

I want to thank two anonymous reviewers, Tom Clark, Wanja Wiese, Jennifer Windt, Brad Mahon, and Michael Graziano for a set of extremely helpful comments and their constructive criticism.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Batterman, R. W., & Rice, C. C. (2014). Minimal model explanations. Philosophy of Science, 81(3), 349–376. doi: 10.1086/676677
  • Dunne, J. (2011). Toward an understanding of non-dual mindfulness. Contemporary Buddhism, 12(1), 71–88. doi: 10.1080/14639947.2011.564820
  • Friston, K. (2018). Am I self-conscious? (Or does self-organization entail self-consciousness?). Frontiers in Psychology, 9, 579. doi: 10.3389/fpsyg.2018.00579
  • Gallese, V., & Sinigaglia, C. (2011). What is so special about embodied simulation? Trends in Cognitive Sciences, 15(11), 512–519. doi: 10.1016/j.tics.2011.09.003
  • Graziano, M. S. A. (2015). Consciousness and the social brain (Oxford University Press paperback). New York: Oxford University Press.
  • Graziano, M. S. A., Guterstam, A., Bio, B. J., & Wilterson, A. I. (2019). Toward a standard model of consciousness: Reconciling the attention schema, global workspace, higher-order thought, and illusionist theories. Cognitive Neuropsychology. doi: 10.1080/02643294.2019.1670630
  • Josipovic, Z. (2019). Nondual awareness: Consciousness-as-such as non-representational reflexivity. Progress in Brain Research, 244, 273–298. doi: 10.1016/bs.pbr.2018.10.021
  • Lebedev, A. V., Lövdén, M., Rosenthal, G., Feilding, A., Nutt, D. J., & Carhart-Harris, R. L. (2015). Finding the self by losing the self: Neural correlates of ego-dissolution under psilocybin. Human Brain Mapping, 36(8), 3137–3153. doi: 10.1002/hbm.22833
  • Letheby, C. (2020). Being for no-one: Psychedelic experience and minimal subjectivity. Philosophy and the Mind Sciences, 1(1), 5. doi: 10.33735/phimisci.2020.I.47
  • McNamara, P., McLaren, D., Kowalczyk, S., & Pace-Schott, E. F. (2007). “Theory of mind” in REM and NREM dreams. In D. Barrett & P. McNamara (Eds.), The new science of dreaming: Volume 1. Biological aspects (pp. 201–220). Westport, CT: Praeger Publishers/Greenwood Publishing Group.
  • Metzinger, T. (1993). Subjekt und Selbstmodell. Paderborn: Schöningh.
  • Metzinger, T. (2000). The subjectivity of subjective experience: A representationalist analysis of the first-person perspective. In T. Metzinger (Ed.), Neural correlates of consciousness: Empirical and conceptual questions (pp. 285–306). Cambridge, MA: MIT Press.
  • Metzinger, T. (2003). Being no one: The self-model theory of subjectivity. Cambridge, MA: MIT Press.
  • Metzinger, T. (2005). Précis: Being no one. PSYCHE, 11 (June). Retrieved from http://journalpsyche.org/files/0xaad5.pdf
  • Metzinger, T. (2006). Conscious volition and mental representation: Toward a more fine-grained analysis. In N. Sebanz & W. Prinz (Eds.), Disorders of volition (pp. 19–48). Cambridge, MA: MIT Press.
  • Metzinger, T. (2013). The myth of cognitive agency: Subpersonal thinking as a cyclically recurring loss of mental autonomy. Frontiers in Psychology, 4. doi: 10.3389/fpsyg.2013.00931
  • Metzinger, T. (2017). The problem of mental action. In T. Metzinger & W. Wiese (Eds.), Philosophy and predictive processing. Frankfurt am Main: MIND Group. doi: 10.15502/9783958573208
  • Metzinger, T. (2019, August 1). Minimal phenomenal experience: The ARAS-model theory. Steps toward a minimal model of conscious experience as such. Retrieved from osf.io/kcgd5
  • Metzinger, T. (2020). Minimal phenomenal experience: Meditation, tonic alertness, and the phenomenology of “pure” consciousness. Philosophy and the Mind Sciences, 1(1), 7. doi: 10.33735/phimisci.2020.I.46
  • Metzinger, T., & Gallese, V. (2003). The emergence of a shared action ontology: Building blocks for a theory. Consciousness and Cognition, 12(4), 549–571. doi: 10.1016/S1053-8100(03)00072-2
  • Metzinger, T., & Millière, R. (2020). Special topic: Radical disruptions of self-consciousness [Special issue]. Philosophy and the Mind Sciences, 1(1). doi: 10.33735/phimisci.2020.I
  • Millière, R. (2017). Looking for the self: Phenomenology, neurophysiology and philosophical significance of drug-induced ego dissolution. Frontiers in Human Neuroscience, 11, 245. doi: 10.3389/fnhum.2017.00245
  • Millière, R., Carhart-Harris, R. L., Roseman, L., Trautwein, F.-M., & Berkovich-Ohana, A. (2018). Psychedelics, meditation, and self-consciousness. Frontiers in Psychology, 9, 1475. doi: 10.3389/fpsyg.2018.01475
  • Nour, M. M., & Carhart-Harris, R. L. (2017). Psychedelics and the science of self-experience. The British Journal of Psychiatry: the Journal of Mental Science, 210(3), 177–179. doi: 10.1192/bjp.bp.116.194738
  • Revonsuo, A. (2006). Inner presence: Consciousness as a biological phenomenon. Cambridge, MA: MIT Press.
  • Thompson, E. (2015). Dreamless sleep, the embodied mind, and consciousness. In T. K. Metzinger & J. M. Windt (Eds.), Open mind. doi: 10.15502/9783958570351
  • Windt, J. M. (2015a). Dreaming: A conceptual framework for philosophy of mind and empirical research. Cambridge, MA: MIT Press.
  • Windt, J. M. (2015b). Just in time—dreamless sleep experience as pure subjective temporality. In T. K. Metzinger & J. M. Windt (Eds.), Open mind. doi: 10.15502/9783958571174
  • Windt, J. M., Nielsen, T., & Thompson, E. (2016). Does consciousness disappear in dreamless sleep? Trends in Cognitive Sciences, 20(12), 871–882. doi: 10.1016/j.tics.2016.09.006
  • Winter, U., LeVan, P., Borghardt, T. L., Akin, B., Wittmann, M., Leyens, Y., & Schmidt, S. (2020). Content-free awareness: EEG-fcMRI correlates of consciousness as such in an expert meditator. Frontiers in Psychology, 10. doi: 10.3389/fpsyg.2019.03064