Open Peer Commentaries

Responsibility, Authenticity and the Self in the Case of Symbiotic Technology

This article refers to:
Mapping the Dimensions of Agency

The target article (Schönau et al. 2021) recognizes four key ethical dimensions, or values, that are affected by neurotechnology, and proposes that the notion of agency can provide a unifying conceptual framework to map and navigate them. We praise the proposal and the range of insights that it enables. In the following, we aim to show two things. First, that one of the ethical dimensions, “responsibility,” should be further refined to make sense of the cases presented by the authors. Second, that such refinement can use conceptual components that are already in the framework, i.e., the dimension of “authenticity,” potentially indicating how the four ethical dimensions might be further intertwined with each other. This, we believe, opens up avenues for further research and development of the proposed framework.

The authors of the target article claim that “the dimension of responsibility is linked to the agential competency of exercising control.” They propose to “understand the issue of responsibility ascription primarily as a problem of intentional control.” “The more an agent,” they continue, “is able to perform intentional action within [a BCI system’s] control-loop, the more responsible he is for the outcome of that action. Conversely, if a malfunction results in the agent falling out of his control loop, responsibility ascription is gradually lost” (Schönau et al. 2021, 174). Later in the text, while discussing the case of a BCI user, the authors note a possible concern regarding the confabulatory appropriation of neurotechnology-mediated actions that were not originally intended, and for which reduced responsibility should therefore be attributed (Schönau et al. 2021, 174).

The case the authors present might partially reveal some limits of the account of intentional action and responsibility adopted in the framework. In fact, the case seems to suggest that, despite the presence of full intentional endorsement of an action (be it confabulatory or not), an agent can be (partially) discharged of their responsibility, even and especially when the BCI functions within its normal operational design domain. This contrasts with the account proposed by the authors, which seems to rely on a simple direct connection between intentional action and responsibility attribution. We think that such a notion of intentional action cannot fully, or most elegantly, account for scenarios like the one the authors present. We also believe that the case presented is only a mild, not fully representative, example of what certain BCIs can do. In fact, as we will briefly explain, certain BCIs might elicit more than post-hoc intentional endorsement (what the authors call “agential confabulations”), and actually radically create propter-hoc intentions. This has deeper consequences for moral responsibility, as it questions the nature of an agent’s authentic self and its self-generated, intentional actions.

Krol, Haselager, and Zander (2020; see also Haselager, Mecacci, and Wolkenstein 2021) show that symbiotic technology, and in particular passive BCI, can stress the notion of intentional action and reveal how it might be philosophically and ethically problematic to base responsibility attribution on it. Symbiotic technology collects and utilizes a user’s brain activity without requiring an active effort from them. Zander et al. (2016) presented a compelling example of this technology: based on real-time analysis of brain activity and contextual cues, their system established “a user model from which the participants’ intentions could be derived” (Haselager, Mecacci, and Wolkenstein 2021). Information is extracted and interpreted by the machine to assist the user with information or action, without the subject necessarily being aware of any part of the process. Actions that result from the symbiotic union of user and BCI, hardly discernible by design, might be fully endorsed by the user as “their own” intentional actions. The intentions that underlie those actions, however, rather than being a simple afterthought, seem to be more correctly ascribable to an “emergent self” that is the product of the human-machine symbiosis. For instance, a passive BCI might pick up on different subconscious neuronal processes that are involved in the simultaneous, subconscious preparation of different courses of action. Following Dennett’s (1991) “multiple drafts” model, many different information processing streams take place in parallel, regarding aspects of perception, cognition, action, etc. Only a few of these drafts are destined to produce behavior, while most of them are lost and never surface to an agent’s awareness.
From that perspective, symbiotic technology might be a bit like “eavesdropping” in a government’s planning bureau, picking out elements of information processing that might not have been acted out, were it not for the technology forcing that particular information out into the open (Haselager, Mecacci, and Wolkenstein 2021). Simply put, the technology might lead to the awareness or performance of action plans that would otherwise have been discarded subconsciously. The problem with this is not simply that it might result in a BCI user’s post-hoc confabulatory endorsement of intentions, as also mentioned in the target paper. It is rather, and perhaps more importantly, the propter-hoc creation of a potentially distorted, inauthentic symbiotic self, with its more or less recognizable intentions to act. This has important consequences for responsibility, and especially for the account endorsed by the authors of the target paper. Symbiotic technology users might in fact experience a perfect synergy and feeling of control/agency within a BCI control loop, while still displaying potentially inauthentic intentional control, and therefore might be partially discharged of their moral responsibility. Symbiotic technology shifts the question from “who did what” to “who is what.” In this sense, we see how the dimension of authenticity, from within the authors’ framework, is fundamentally entangled with that of responsibility when making sense of certain BCI scenarios. The extent to which the intentions and moral reasons produced by a certain user can be attributed to their “true self” can become a question of authenticity, in turn affecting the meaningful attribution of moral responsibility. Given the potential of multiple neurotechnologies—not just DBS—to influence the self, it might be worth considering grounding the dimension of responsibility in that of authenticity, rather than juxtaposing them.
This might make the framework more encompassing, because it makes sense of a larger array of cases. It might also make the framework more powerful, because it can provide better insights into the root causes of the responsibility gaps that are generated by the introduction of symbiotic devices, which blur the borders between user and technology.

REFERENCES

  • Dennett, D. C. 1991. Consciousness explained. New York, NY: Little, Brown & Co.
  • Haselager, W. F. G., G. Mecacci, and A. Wolkenstein. 2021. Can BCIs enlighten the concept of agency? A plea for an experimental philosophy of neurotechnology. In Clinical neurotechnology meets artificial intelligence, ed. O. Friedrich, A. Wolkenstein, C. Bublitz, R. J. Jox, and E. Racine, 55–68. Advances in Neuroethics. Cham, Switzerland: Springer.
  • Krol, L. R., W. F. G. Haselager, and T. O. Zander. 2020. Cognitive and affective probing: A tutorial and review of active learning for neuroadaptive technology. Journal of Neural Engineering 17 (1):012001. doi:10.1088/1741-2552/ab5bb5.
  • Schönau, A., I. Dasgupta, T. Brown, E. Versalovic, E. Klein, and S. Goering. 2021. Mapping the dimensions of agency. AJOB Neuroscience 12 (2):172–186.
  • Zander, T. O., L. R. Krol, N. P. Birbaumer, and K. Gramann. 2016. Neuroadaptive technology enables implicit cursor control based on medial prefrontal cortex activity. Proceedings of the National Academy of Sciences of the United States of America 113 (52):14898–903. doi:10.1073/pnas.1605155114.