
Artificial agency and the game of semantic extension

 

ABSTRACT

Artificial agents are commonly described using words that traditionally belong to the semantic field of life. I call this phenomenon the game of semantic extension. However, the semantic extension of words as crucial as ‘autonomous’, ‘intelligent’, ‘creative’, ‘moral’, and so on, is often perceived as unsatisfactory, as is signalled by the extensive use of inverted commas or other syntactic cues. This practice, in turn, has provoked harsh criticism that usually refers back to the literal meaning of the words to show their inappropriateness in describing artificial agents. Hence the question: how can we choose our words appropriately and wisely while making sense of artificial agents? This paper tries to answer it by sketching the main features of the game of semantic extension in relation to artificial agency, reviewing the related opportunities and risks, and advancing some practical suggestions on how to play the game well.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 Many examples can be found, for instance, in the debate on trust and digital technologies: ‘e-trust’ in (Taddeo and Floridi 2011), ‘TRUST’ in (Grodzinsky, Miller, and Wolf 2011), ‘robotrust’ in (Pagallo 2010).

2 See, for example, (Coeckelbergh 2017b, 296): ‘if more people were to speak about what machines do in terms of ‘artistic creations’ and ‘works of art’, then would we really have an objective basis for saying that they are wrong? Even if today we might be opposed to the very idea of machine art, in the course of time, our language might change and let the machines in through the backdoor’.

3 In order to do so, I draw extensively on the book just quoted, which has already proven extremely useful in this inquiry, i.e. Wittgenstein’s Philosophical Investigations (Wittgenstein 1958). Other important observations on what it means to play a game and what this entails in relation to meaning can be found in Hans-Georg Gadamer’s Truth and Method (Gadamer 2004, 102–130).

4 For an example, see (Fossa 2017).

5 For plenty of examples, see the use of proxies for measuring human virtues such as loyalty or dependability, and the connected epistemological and ethical issues raised by (O’Neill 2016). From the point of view of biology, see (Boldt 2018). This issue is also extremely visible in Wiener’s theoretical writings on cybernetics.

6 For an example, see (Fossa 2018).

7 I tried to do this in (Fossa 2019).

Additional information

Notes on contributors

Fabio Fossa

Fabio Fossa (Ph.D., University of Pisa) is a researcher at the Department of Mechanical Engineering of Politecnico di Milano, Italy. His main research areas are applied ethics, philosophy of technology, robot and AI ethics, and the philosophy of Hans Jonas. His current research deals with the philosophy of artificial agency and the ethics of autonomous driving.
