Journal of Media Ethics
Exploring Questions of Media Morality
Volume 33, 2018 - Issue 3
Original Articles

Bots, Social Capital, and the Need for Civility

Pages 120-132 | Published online: 25 Jul 2018

ABSTRACT

Politicians, hate groups, counterpublics of science, and even socially minded critics use bots to pad their numbers, spread information, and engage in social critique. This article pursues the ethics of bots beyond the automated-or-not question that dominates the literature and offers the concept of bot civility. Machinic and social bot strategies are discussed with regard to the manufacture of social capital, a form of bot incivility. The analysis suggests that bots that do not trick persons into thinking they are human are not necessarily morally neutral, and that some bots that emulate persons can actually be ethically sensible.

Notes

1. The phrase other side of the wall is an allusion to Alan Turing’s (Citation1950) famous imitation game. To play the imitation game, a person is placed in a room and told to communicate with two other entities just on the other side of a wall. The person is then told that one entity is a flesh-and-blood human and the other a computer program designed to “speak” like people do. The person is only allowed to communicate with the other two entities via individual text exchanges. According to Turing, if the person cannot distinguish the computer from the human, the computer, for all intents and purposes, can be said to demonstrate thinking. Turing was writing in the mid-1900s, during the nascent stages of computer science, and so his main purpose in posing the scenario was to draw out the potential of artificial intelligence in a practicable sense: what machines can do in contrast to what people actually do. I am not as concerned with the actual abilities of machine learning and artificial intelligence as I am with the use of such technologies to fabricate social capital by faking actual people and transgressing ethical modes of communicative engagement via media. Compelling cases can be made for the moral agency and patiency of automated systems (e.g., see Floridi & Sanders, Citation2004; Siponen, Citation2004; Stahl, Citation2004), which further reveal complications of human (and bot) moral responsibility and duty (e.g., Gunkel, Citation2012, Citation2017). In this article, however, I am interested in the ethics of humans who have commissioned bots to act on their behalf.

2. See Dale (Citation2016) for a helpful overview of current bot technologies.

3. This is for the sake of argument and in spite of the fact that Instagram’s terms of use do not actually allow for this type of automation; see, in particular, numbers five and twenty-two of the general terms of the Platform Policy. Number five: “Do not store or cache Instagram login credentials” (“Platform Policy,” 2018). Number twenty-two: “Ensure your comments are uniquely tailored for each person. Don’t post unauthorized commercial communication or spam on Instagram” (“Platform Policy,” 2018). I am operating under the assumption that the policies of a private company, which generates revenue with its own automated push marketing, might be necessary for that company’s business model but do not equate to being ethical as such.

4. Lee (Citation2010) gives an insightful history of the phenomenon of astroturfing. From her article, one realizes that the practice of disingenuously padding one’s numbers has been around for a very long time, whether through over-representations of numbers in website imagery or pamphlets relentlessly printed and circulated by a few people to imply far greater numbers than they actually had. A botnet simply allows someone to do these sorts of things more easily.
