The Court of Public Opinion

Algorithmic Aversion? Experimental Evidence on the Elasticity of Public Attitudes to “Killer Robots”

 

Abstract

Lethal autonomous weapon systems present a prominent yet controversial military innovation. While previous studies have indicated that the deployment of “killer robots” would face considerable public opposition, our understanding of the elasticity of these attitudes, and of the factors on which they are contingent, remains limited. In this article, we explore the sensitivity of public attitudes to three specific concerns: the accident-prone nature of the technology, the attribution of responsibility for adverse outcomes, and the inherently undignified nature of automated killing. Our survey experiment with a large sample of Americans reveals that public attitudes toward autonomous weapons are significantly contingent on beliefs about their error-proneness relative to human-operated systems. Additionally, we find limited evidence that individuals concerned about violations of human dignity are more likely to oppose “killer robots.” These findings are relevant to current policy debates about the international regulation of autonomous weapons.

Acknowledgements

We would like to express our gratitude to the editorial team, the anonymous reviewers, Anna Nadibaidze, Doreen Horschig, Halvard Buhaug, Lucas Tamayo Ruiz, Neil Renic, the participants of our 2023 ISA Annual Convention panel, as well as the attendees of research seminars at the Peace Research Center Prague and the Institute for Peace Research and Security Policy at the University of Hamburg, for their valuable comments and suggestions on earlier drafts of this manuscript. We also gratefully acknowledge funding from Charles University’s program PRIMUS/22/HUM/005 (Experimental Lab for International Security Studies – ELISS).

Disclosure Statement

No potential conflict of interest was reported by the author(s).

Data Availability Statement

The data and materials that support the findings of this study are available in the Harvard Dataverse at https://doi.org/10.7910/DVN/8PDOGJ.

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Notes

1 For a discussion of the role of emerging technologies for international politics, see Michael C. Horowitz, “Do Emerging Military Technologies Matter for International Politics?” Annual Review of Political Science 23, no. 1 (May 2020): 386. On specific technologies, see, for example, Michael C. Horowitz, “Artificial Intelligence, International Competition, and the Balance of Power,” Texas National Security Review 1, no. 3 (May 2018): 36–57; Kenneth Payne, “Artificial Intelligence: A Revolution in Strategic Affairs?” Survival 60, no. 5 (September 2018): 7–32; Benjamin M. Jensen, Christopher Whyte, and Scott Cuomo, “Algorithms at War: The Promise, Peril, and Limits of Artificial Intelligence,” International Studies Review 22, no. 3 (September 2020): 526–50; Avi Goldfarb and Jon R. Lindsay, “Prediction and Judgment: Why Artificial Intelligence Increases the Importance of Humans in War,” International Security 46, no. 3 (February 2022): 7–50.

2 Michael C. Horowitz, “When Speed Kills: Lethal Autonomous Weapon Systems, Deterrence and Stability,” Journal of Strategic Studies 42, no. 6 (August 2019): 764–88.

3 See, for example, Elvira Rosert and Frank Sauer, “How (not) to Stop the Killer Robots: A Comparative Analysis of Humanitarian Disarmament Campaign Strategies,” Contemporary Security Policy 42, no. 1 (2021): 4–29; Ondrej Rosendorf, “Predictors of Support for a Ban on Killer Robots: Preventive Arms Control as an Anticipatory Response to Military Innovation,” Contemporary Security Policy 42, no. 1 (2021): 30–52.

4 Charli Carpenter, “How Do Americans Feel about Fully Autonomous Weapons?” Duck of Minerva, 19 June 2013, https://www.duckofminerva.com/2013/06/how-do-americans-feel-about-fully-autonomous-weapons.html; Kevin L. Young and Charli Carpenter, “Does Science Fiction Affect Political Fact? Yes and No: A Survey Experiment on ‘Killer Robots’,” International Studies Quarterly 62, no. 3 (August 2018): 562–76; Ipsos, “Global Survey Highlights Continued Opposition to Fully Autonomous Weapons,” 2 February 2021, https://www.ipsos.com/en-us/global-survey-highlights-continued-opposition-fully-autonomous-weapons; Ondrej Rosendorf, Michal Smetana, and Marek Vranka, “Autonomous Weapons and Ethical Judgments: Experimental Evidence on Attitudes toward the Military Use of ‘Killer Robots’,” Peace and Conflict 28, no. 2 (May 2022): 177–83.

5 Michael C. Horowitz, “Public Opinion and the Politics of the Killer Robots Debate,” Research & Politics 3, no. 1 (February 2016): 1–8.

6 See, for example, Ingvild Bode and Hendrik Huelss, “Autonomous Weapons Systems and Changing Norms in International Relations,” Review of International Studies 44, no. 3 (July 2018): 393–413; Rosert and Sauer, “How (not) to Stop the Killer Robots”; Rosendorf, “Predictors of Support for a Ban on Killer Robots”; Anna Nadibaidze, “Great Power Identity in Russia’s Position on Autonomous Weapons Systems,” Contemporary Security Policy 43, no. 3 (May 2022): 407–35.

7 See, for example, Daryl G. Press, Scott D. Sagan, and Benjamin A. Valentino, “Atomic Aversion: Experimental Evidence on Taboos, Traditions, and the Non-Use of Nuclear Weapons,” American Political Science Review 107, no. 1 (February 2013): 188–206; Scott D. Sagan and Benjamin A. Valentino, “Revisiting Hiroshima in Iran: What Americans Really Think about Using Nuclear Weapons and Killing Noncombatants,” International Security 42, no. 1 (July 2017): 41–79; Horowitz, “Public Opinion and the Politics of the Killer Robots Debate”; Janina Dill, Scott D. Sagan, and Benjamin A. Valentino, “Kettles of Hawks: Public Opinion on the Nuclear Taboo and Noncombatant Immunity in the United States, United Kingdom, France, and Israel,” Security Studies 31, no. 1 (February 2022): 1–31.

8 See also Horowitz, “Public Opinion and the Politics of the Killer Robots Debate”; Michael C. Horowitz and Sarah Maxey, “Morally Opposed? A Theory of Public Attitudes and Emerging Military Technologies,” unpublished manuscript, 28 May 2020, https://doi.org/10.2139/ssrn.3589503.

9 Horowitz, “Do Emerging Military Technologies Matter for International Politics?” 386.

10 Payne, “Artificial Intelligence,” 7–11. See also Jensen, Whyte, and Cuomo, “Algorithms at War”; Goldfarb and Lindsay, “Prediction and Judgment”; Antonio Calcara, Andrea Gilli, Mauro Gilli, Raffaele Marchetti, and Ivan Zaccagnini, “Why Drones Have not Revolutionized War: The Enduring Hider-Finder Competition in Air Warfare,” International Security 46, no. 4 (April 2022): 130–71.

11 See, for example, Horowitz, “Artificial Intelligence, International Competition, and the Balance of Power”; Payne, “Artificial Intelligence”; Katarzyna Zysk, “Defence Innovation and the 4th Industrial Revolution in Russia,” Journal of Strategic Studies 44, no. 4 (December 2020): 543–71; Elsa B. Kania, “Artificial Intelligence in China’s Revolution in Military Affairs,” Journal of Strategic Studies 44, no. 4 (May 2021): 515–42.

12 ICRC, Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons (Versoix: ICRC, 2016), 31.

13 Vincent Boulanin and Maaike Verbruggen, Mapping the Development of Autonomy in Weapon Systems (Solna: SIPRI, 2017), 8; Paul Scharre and Michael C. Horowitz, “An Introduction to Autonomy in Weapon Systems,” Center for a New American Security, 13 February 2015, 7, https://www.cnas.org/publications/reports/an-introduction-to-autonomy-in-weapon-systems.

14 Michael C. Horowitz, “Why Words Matter: The Real World Consequences of Defining Autonomous Weapons Systems,” Temple International and Comparative Law Journal 30, no. 1 (Spring 2016): 90–91; Frank Sauer, “Stepping Back from the Brink: Why Multilateral Regulation of Autonomy in Weapons Systems Is Difficult, Yet Imperative and Feasible,” International Review of the Red Cross 102, no. 913 (April 2020): 240–41.

15 Ronald C. Arkin, “The Case for Ethical Autonomy in Unmanned Systems,” Journal of Military Ethics 9, no. 4 (December 2010): 333–34; Jürgen Altmann and Frank Sauer, “Autonomous Weapon Systems and Strategic Stability,” Survival 59, no. 5 (September 2017): 119; Horowitz, “When Speed Kills,” 769–70.

16 Altmann and Sauer, “Autonomous Weapon Systems and Strategic Stability,” 128–32; Horowitz, “When Speed Kills,” 781–83.

17 Noel E. Sharkey, “The Evitability of Autonomous Robot Warfare,” International Review of the Red Cross 94, no. 886 (June 2012): 787–99; ICRC, “ICRC Position on Autonomous Weapon Systems,” background paper, 12 May 2021, https://www.icrc.org/en/document/icrc-position-autonomous-weapon-systems.

18 Peter Asaro, “On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making,” International Review of the Red Cross 94, no. 886 (June 2012): 708–09; Christof Heyns, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns,” Human Rights Council, 9 April 2013, https://digitallibrary.un.org/record/755741.

19 See, for example, Ingvild Bode, “Norm-Making and the Global South: Attempts to Regulate Lethal Autonomous Weapons Systems,” Global Policy 10, no. 3 (June 2019): 359–64; Rosert and Sauer, “How (not) to Stop the Killer Robots,” 18–21; Nadibaidze, “Great Power Identity in Russia’s Position on Autonomous Weapons Systems,” 416–18.

20 Charli Carpenter, “A Better Path to a Treaty Banning ‘Killer Robots’ Has Just Been Cleared,” World Politics Review, 7 January 2022, https://www.worldpoliticsreview.com/a-better-path-to-a-treaty-banning-ai-weapons-killer-robots/; Ousman Noor, “Russia Leads an Assault on Progress at UN Discussions, the CCW Has Failed,” Stop Killer Robots, 4 August 2022, https://www.stopkillerrobots.org/news/russia-leads-an-assault-on-progress-at-un-discussions-the-ccw-has-failed/.

21 Young and Carpenter, “Does Science Fiction Affect Political Fact?” 562; Horowitz and Maxey, “Morally Opposed?” 2; Rosendorf, Smetana, and Vranka, “Autonomous Weapons and Ethical Judgments,” 178.

22 Human Rights Watch & International Human Rights Clinic, “Losing Humanity: The Case against Killer Robots,” report, 19 November 2012, 35, https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots; UNIDIR, “The Weaponization of Increasingly Autonomous Technologies: Considering Ethics and Social Values,” UNIDIR Resources No. 3, 30 March 2015, 6, https://www.unidir.org/publication/weaponization-increasingly-autonomous-technologies-considering-ethics-and-social-values; ICRC, “Ethics and Autonomous Weapon Systems: An Ethical Basis for Human Control?” ICRC report, 3 April 2018, 5–6, https://www.icrc.org/en/document/ethics-and-autonomous-weapon-systems-ethical-basis-human-control.

23 Noah Castelo, Maarten W. Bos, and Donald R. Lehmann, “Task-Dependent Algorithm Aversion,” Journal of Marketing Research 56, no. 5 (July 2019): 809–25; William M. Grove, David H. Zald, Boyd S. Lebow, Beth E. Snitz, and Chad Nelson, “Clinical Versus Mechanical Prediction: A Meta-Analysis,” Psychological Assessment 12, no. 1 (March 2000): 19–30.

24 For a discussion of error, see Berkeley J. Dietvorst, Joseph P. Simmons, and Cade Massey, “Algorithm Aversion: People Erroneously Avoid Algorithms After Seeing Them Err,” Journal of Experimental Psychology: General 144, no. 1 (2015): 114–26. For a discussion of complexity, see Eric Bogert, Aaron Schecter, and Richard T. Watson, “Humans Rely More on Algorithms than Social Influence as a Task Becomes More Difficult,” Scientific Reports 11, no. 1 (April 2021): 1–9. On the type of task, see Castelo, Bos, and Lehmann, “Task-Dependent Algorithm Aversion.”

25 Chiara Longoni, Andrea Bonezzi, and Carey K. Morewedge, “Resistance to Medical Artificial Intelligence,” Journal of Consumer Research 46, no. 4 (December 2019): 629.

26 Carpenter, “How Do Americans Feel about Fully Autonomous Weapons?”

27 Ipsos, “Global Survey Highlights Continued Opposition to Fully Autonomous Weapons.”

28 For insights into the attitudes of researchers, see Baobao Zhang, Markus Anderljung, Lauren Kahn, Noemi Dreksler, Michael C. Horowitz, and Allan Dafoe, “Ethics and Governance of Artificial Intelligence: Evidence from a Survey of Machine Learning Researchers,” Journal of Artificial Intelligence Research 71 (August 2021): 591–666. For insights into the attitudes of local officials, see Michael C. Horowitz and Lauren Kahn, “What Influences Attitudes about Artificial Intelligence Adoption: Evidence from U.S. Local Officials,” PLoS ONE 16, no. 10 (October 2021): 1–20.

29 For a discussion of military effectiveness, see Horowitz, “Public Opinion and the Politics of the Killer Robots Debate.” For a discussion of responsibility, see James I. Walsh, “Political Accountability and Autonomous Weapons,” Research & Politics 2, no. 4 (October 2015): 1–6; Rosendorf, Smetana, and Vranka, “Autonomous Weapons and Ethical Judgments.” On sci-fi literacy, see Young and Carpenter, “Does Science Fiction Affect Political Fact?”

30 Horowitz, “Public Opinion and the Politics of the Killer Robots Debate.”

31 Duncan Purves, Ryan Jenkins, and Bradley J. Strawser, “Autonomous Machines, Moral Judgment, and Acting for the Right Reasons,” Ethical Theory and Moral Practice 18, no. 4 (January 2015): 851–72; Horowitz and Maxey, “Morally Opposed?”

32 Horowitz and Maxey, “Morally Opposed?” 3.

33 Paul Scharre, Army of None: Autonomous Weapons and the Future of War (New York: W. W. Norton & Company, 2018), chapter 9.

34 For a discussion of predictability and reliability, see UNIDIR, “The Weaponization of Increasingly Autonomous Technologies: Concerns, Characteristics and Definitional Approaches,” UNIDIR Resources No. 6, 9 November 2017, https://www.unidir.org/publication/weaponization-increasingly-autonomous-technologies-concerns-characteristics-and; Heather M. Roff and David Danks, “‘Trust but Verify’: The Difficulty of Trusting Autonomous Weapons Systems,” Journal of Military Ethics 17, no. 1 (June 2018): 2–20. On “robocalyptic” imaginaries, see Young and Carpenter, “Does Science Fiction Affect Political Fact?”

35 Ipsos, “Global Survey Highlights Continued Opposition to Fully Autonomous Weapons.”

36 Robert R. Hoffman, Timothy M. Cullen, and John K. Hawley, “The Myths and Costs of Autonomous Weapon Systems,” Bulletin of the Atomic Scientists 72, no. 4 (June 2016): 248–49.

37 Daniele Amoroso and Guglielmo Tamburrini, “Toward a Normative Model of Meaningful Human Control over Weapons Systems,” Ethics & International Affairs 35, no. 2 (Summer 2021): 252–53.

38 Michael C. Horowitz, Lauren Kahn, and Laura Resnick Samotin, “A Force for the Future: A High-Reward, Low-Risk Approach to AI Military Innovation,” Foreign Affairs 101, no. 3 (April 2022): 162; Sauer, “Stepping Back from the Brink,” 249.

39 See, for example, Sharkey, “The Evitability of Autonomous Robot Warfare”; Heyns, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns”; Robert Sparrow, “Twenty Seconds to Comply: Autonomous Weapon Systems and the Recognition of Surrender,” International Law Studies 91 (October 2015): 699–728; Michael C. Horowitz, “The Ethics & Morality of Robotic Warfare: Assessing the Debate over Autonomous Weapons,” Daedalus 145, no. 4 (September 2016): 25–36; Elvira Rosert and Frank Sauer, “Prohibiting Autonomous Weapons: Put Human Dignity First,” Global Policy 10, no. 3 (July 2019): 370–75.

40 Marcus Schulzke, “Robots as Weapons in Just Wars,” Philosophy & Technology 24, no. 3 (April 2011): 300–01; Heyns, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns,” 13.

41 Garry Young, “On the Indignity of Killer Robots,” Ethics and Information Technology 23, no. 3 (April 2021): 473–74.

42 Arkin, “The Case for Ethical Autonomy in Unmanned Systems,” 333–34.

43 Dietvorst, Simmons, and Massey, “Algorithm Aversion.”

44 Janina Dill and Livia I. Schubiger, “Attitudes toward the Use of Force: Instrumental Imperatives, Moral Principles, and International Law,” American Journal of Political Science 65, no. 3 (June 2021): 612–33.

45 See, for example, Andreas Matthias, “The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata,” Ethics and Information Technology 6, no. 3 (September 2004): 175–83; Robert Sparrow, “Killer Robots,” Journal of Applied Philosophy 24, no. 1 (March 2007): 62–77.

46 Asaro, “On Banning Autonomous Weapon Systems,” 693; Daniele Amoroso and Benedetta Giordano, “Who Is to Blame for Autonomous Weapons Systems’ Misdoings?” in Use and Misuse of New Technologies: Contemporary Challenges in International and European Law, eds. Elena Carpanelli and Nicole Lazzerini (Cham: Springer, 2019), 213–15.

47 See, for example, Sparrow, “Killer Robots.”

48 Michael N. Schmitt, “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics,” Harvard National Security Journal 4 (February 2013): 33–34.

49 Lode Lauwaert, “Artificial Intelligence and Responsibility,” AI & Society 36, no. 3 (January 2021): 1004; Isaac Taylor, “Who Is Responsible for Killer Robots? Autonomous Weapons, Group Agency, and the Military-Industrial Complex,” Journal of Applied Philosophy 38, no. 2 (May 2021): 322.

50 Marcus Schulzke, “Autonomous Weapons and Distributed Responsibility,” Philosophy & Technology 26, no. 2 (June 2013): 204.

51 Schulzke, “Autonomous Weapons and Distributed Responsibility,” 204, 211; Michael Robillard, “No Such Thing as Killer Robots,” Journal of Applied Philosophy 35, no. 4 (November 2018): 709.

52 Ipsos, “Global Survey Highlights Continued Opposition to Fully Autonomous Weapons.”

53 See, for example, Heyns, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns”; Christof Heyns, “Human Rights and the Use of Autonomous Weapons Systems (AWS) During Domestic Law Enforcement,” Human Rights Quarterly 38, no. 2 (May 2016): 350–78; Amanda Sharkey, “Autonomous Weapons Systems, Killer Robots and Human Dignity,” Ethics and Information Technology 21, no. 2 (June 2019): 75–87; Rosert and Sauer, “Prohibiting Autonomous Weapons.”

54 Heyns, “Human Rights and the Use of Autonomous Weapons Systems (AWS) During Domestic Law Enforcement,” 11.

55 Rosert and Sauer, “Prohibiting Autonomous Weapons,” 372.

56 ICRC, “ICRC Position on Autonomous Weapon Systems,” 8.

57 Sauer, “Stepping Back from the Brink,” 254–55.

58 Christof Heyns, “Autonomous Weapons in Armed Conflict and the Right to a Dignified Life: An African Perspective,” South African Journal on Human Rights 33, no. 1 (February 2017): 49.

59 Asaro, “On Banning Autonomous Weapon Systems,” 708–09; Heyns, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns,” 17.

60 Sharkey, “Autonomous Weapons Systems, Killer Robots and Human Dignity,” 79–80.

61 Schmitt, “Autonomous Weapon Systems and International Humanitarian Law”; ICRC, “Ethics and Autonomous Weapon Systems,” 11.

62 Dieter Birnbacher, “Are Autonomous Weapons Systems a Threat to Human Dignity?” in Autonomous Weapons Systems: Law, Ethics, Policy, eds. Nehal C. Bhuta, Susanne Beck, Robin Geiß, Hin-Yan Liu, and Claus Kreß (Cambridge: Cambridge University Press, 2016), 120; Horowitz, “The Ethics & Morality of Robotic Warfare,” 32.

63 Deane-Peter Baker, “The Awkwardness of the Dignity Objection to Autonomous Weapons,” Strategy Bridge, 6 December 2018, https://thestrategybridge.org/the-bridge/2018/12/6/the-awkwardness-of-the-dignity-objection-to-autonomous-weapons; Sharkey, “Autonomous Weapons Systems, Killer Robots and Human Dignity,” 79–80.

64 Ipsos, “Global Survey Highlights Continued Opposition to Fully Autonomous Weapons.”

65 See Appendix 1 for a full description of the scenario and all survey items. A detailed discussion of the ethical considerations associated with this study is provided in Appendix 2.

66 We acknowledge that asking about approval of drone strikes could make some respondents believe that supporting remote-controlled strikes over LAWS is the socially desirable answer. However, the type of drone in this question was left unspecified, and participants were informed about the commander’s choice between the remote-controlled and autonomous drone on the same page. Furthermore, evidence from existing meta-studies indicates that online surveys are less susceptible to social desirability bias than in-person or telephone interviews. See Marcella K. Jones, Liviana Calzavara, Dan Allman, Catherine A. Worthington, Mark Tyndall, and James Iveniuk, “A Comparison of Web and Telephone Responses From a National HIV and AIDS Survey,” JMIR Public Health and Surveillance 2, no. 2 (July 2016).

67 ICRC, Autonomous Weapon Systems, 31; Boulanin and Verbruggen, Mapping the Development of Autonomy in Weapon Systems, 8.

68 Bureau of Investigative Journalism, “Drone Warfare,” https://www.thebureauinvestigates.com/projects/drone-war.

69 Our respondents also had the option to choose “neither.” However, participants who selected this option were subsequently forced to choose one of the drone options. Appendix 15 contains the results of an alternative analysis using an ordinal dependent variable with the intermediate “neither” category.

70 Press, Sagan, and Valentino, “Atomic Aversion”; Brian C. Rathbun and Rachel Stein, “Greater Goods: Morality and Attitudes toward the Use of Nuclear Weapons,” Journal of Conflict Resolution 64, no. 5 (May 2020): 787–816; Dill, Sagan, and Valentino, “Kettles of Hawks”; Michal Smetana and Michal Onderco, “From Moscow With a Mushroom Cloud? Russian Public Attitudes to the Use of Nuclear Weapons in a Conflict With NATO,” Journal of Conflict Resolution 67, no. 2–3 (February–March 2023): 183–209.

71 Following Aronow, Baron, and Pinson, we did not exclude the participants who failed the manipulation check. See Peter M. Aronow, Jonathan Baron, and Lauren Pinson, “A Note on Dropping Experimental Subjects who Fail a Manipulation Check,” Political Analysis 27, no. 4 (May 2019): 572–89. The results of the analysis after excluding those participants are in Appendices 7, 9, and 11.

72 See supplemental materials for Horowitz, “Public Opinion and the Politics of the Killer Robots Debate.”

73 Charli Carpenter, Alexander H. Montgomery, and Alexandria Nylen, “Braking Bad? How Survey Experiments Prime Americans for War Crimes,” Perspectives on Politics 19, no. 3 (September 2021): 912–24.

74 On Prolific as a survey tool, see Eyal Peer, Laura Brandimarte, Sonam Samat, and Alessandro Acquisti, “Beyond the Turk: Alternative Platforms for Crowdsourcing Behavioral Research,” Journal of Experimental Social Psychology 70 (May 2017): 153–63; Stefan Palan and Christian Schitter, “Prolific.ac—A Subject Pool for Online Experiments,” Journal of Behavioral and Experimental Finance 17 (March 2018): 22–27.

75 See, for example, Peer, Brandimarte, Samat, and Acquisti, “Beyond the Turk.”

76 In our sample, 63% of participants held a university degree, compared to 48% reported by the United States Census Bureau in February 2022. See United States Census Bureau, “Census Bureau Releases New Educational Attainment Data,” 24 February 2022, https://www.census.gov/newsroom/press-releases/2022/educational-attainment.html. Additionally, the distribution of party affiliation in our sample was 33% Republicans, 34% Democrats, and 33% Independents, compared to the Gallup poll from February 2023, which reported 27% Republicans, 28% Democrats, and 44% Independents. See Gallup, “Party Affiliation,” 1–23 February 2023, https://news.gallup.com/poll/15370/party-affiliation.aspx. The results of the analysis with sampling weights appear in Appendices 7, 9, and 11. Appendix 6 provides the descriptive statistics of our sample.

77 See Appendix 1 for a full description of the analogous scenario and all survey items.

78 See Appendix 3 for all survey items.

79 See Appendix 4 for all survey items. The follow-up survey on perceived differences between remote-controlled and autonomous drones was preregistered using the Open Science Framework (see Appendix 5).

80 In Appendix 7, we show that these results are robust to the inclusion of controls and the use of different estimation techniques, but not to the exclusion of participants who failed the manipulation check or the use of sampling weights.

81 t(299) = –12.8, p < 0.001. See Appendix 8.

82 OR = 1.036, p < 0.001. See Appendix 8.
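For readers unfamiliar with how an odds ratio of this kind is obtained, the minimal Python sketch below fits a binary logistic regression of a drone-preference outcome on a continuous predictor and exponentiates the coefficient. The variable names and simulated data are hypothetical illustrations, not the study’s replication code or data (which are available in the Harvard Dataverse).

```python
# Illustrative sketch only: a binary logistic regression of the kind that
# yields an odds ratio like the one reported above. All variable names and
# data below are hypothetical, not the study's replication code or data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
# Hypothetical 0-100 score for how error-prone a respondent believes the
# autonomous drone to be relative to the remote-controlled one.
error_score = rng.uniform(0, 100, n)
# Hypothetical binary outcome: 1 = prefers the remote-controlled drone.
prefers_remote = rng.binomial(1, 1 / (1 + np.exp(-0.03 * (error_score - 50))))

exog = sm.add_constant(pd.DataFrame({"error_score": error_score}))
fit = sm.Logit(prefers_remote, exog).fit(disp=False)
print(np.exp(fit.params))  # exponentiated coefficients, i.e., odds ratios
```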

83 OR = 1.219, p = 0.484. In Appendix 9, we show that these findings are robust to the inclusion of controls, the exclusion of participants who failed the manipulation check, and the use of alternative estimation techniques and sampling weights. Since the experimental part of our additional survey on the perceived differences included the two conditions from the main experiment, we were able to repeat the analysis with a larger sample. In Appendix 10, we show that the null finding holds.

84 Results of a paired t-test indicate that there was no statistically significant difference between the measurements of legal accountability and moral responsibility (t(1036) = 0.669, p = 0.504).
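As an illustration of the reported test, the sketch below computes a paired t-test on two simulated within-subject ratings standing in for the legal-accountability and moral-responsibility items; it is not the study’s analysis code.

```python
# Illustrative sketch only: a paired t-test comparing two within-subject
# ratings, analogous to the test reported above. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 1037  # n - 1 = 1036 degrees of freedom, as in the reported test
legal_accountability = rng.normal(4.0, 1.5, n)                       # hypothetical ratings
moral_responsibility = legal_accountability + rng.normal(0, 1.0, n)  # hypothetical ratings

result = stats.ttest_rel(legal_accountability, moral_responsibility)
print(f"t({n - 1}) = {result.statistic:.3f}, p = {result.pvalue:.3f}")
```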

85 Only 8 out of 216 respondents in the control group mentioned this type of concern.

86 Since the experimental part of this survey included the “equal risk + responsibility” treatment, we conducted an ordinal logistic regression to analyze the relationship between the “equal risk + responsibility” treatment and the perceived differences in legal accountability and moral responsibility. The results indicate that the responsibility prime had no statistically significant effect on the perceived differences in legal accountability and moral responsibility (see Appendix 10).
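The sketch below shows one common way to estimate such an ordinal logistic regression in Python, using statsmodels’ OrderedModel with a hypothetical binary treatment indicator and an ordered perceived-difference rating; it is illustrative only, not the authors’ code.

```python
# Illustrative sketch only: an ordinal logistic regression of an ordered
# perceived-difference rating on a binary treatment indicator.
# Variable names and data are hypothetical, not the authors' code.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(3)
n = 600
treatment = rng.integers(0, 2, n)          # 1 = "equal risk + responsibility" prime
rating = rng.integers(1, 6, n)             # 1-5 ordered perceived-difference rating

ratings = pd.Series(pd.Categorical(rating, categories=[1, 2, 3, 4, 5], ordered=True))
model = OrderedModel(ratings, pd.DataFrame({"treatment": treatment}), distr="logit")
fit = model.fit(method="bfgs", disp=False)
print(np.exp(fit.params["treatment"]))     # odds ratio for the treatment effect
```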

87 OR = 0.991, p = 0.919 for legal accountability; OR = 0.979, p = 0.806 for moral responsibility. These findings hold even when accounting for other factors. See Appendix 10.

88 Older respondents were slightly more likely to participate in this second wave. See Appendix 17 for an overview of survey attrition.

89 p < 0.05 in Models 1 and 3, and p < 0.01 in Model 2.

90 In Appendix 11, we show that the results are robust to the exclusion of participants who failed the manipulation check, use of alternative estimation techniques, and inclusion of controls for our experimental treatments, but not to the use of sampling weights. This could be attributed to survey attrition, as there was a slight underrepresentation of younger participants.

91 The results, available in Appendix 12, reveal that the “human dignity concern” fails to reach statistical significance in the “unequal risk” and “highly unequal risk” groups.

92 Of particular note, neither the term “dignity” itself nor any references to the treatment of humans as mere objects were mentioned in the write-in responses to the open-ended question.

93 In Appendices 13 and 14, we further show that those who believed that being killed by LAWS is less ethical were more likely to prefer remote-controlled drones, whereas those who believed that being killed by LAWS is more undignified were neither more nor less likely to prefer them.

94 The risk of target misidentification pertains to the probability of mistakenly identifying civilians as targets. Military effectiveness relates to the likelihood of achieving mission objectives. Costs refer to the degree of expense incurred. Force restraint refers to the extent to which decision-makers will feel constrained in using military force in future situations.

95 This analysis focused solely on the respondents from the control group who answered the preference question after expressing their opinion on the perceived differences. To facilitate interpretation, we recoded the risk of target misidentification, costs, and ethicality such that higher values on all eight measures indicate a belief that autonomous drones are better in this aspect.

96 Rosert and Sauer, “Prohibiting Autonomous Weapons,” 372; Rosert and Sauer, “How (not) to Stop the Killer Robots,” 22.

Additional information

Funding

Univerzita Karlova v Praze.

Notes on contributors

Ondřej Rosendorf

Ondřej Rosendorf is a doctoral student at the Faculty of Social Sciences, Charles University, and a Researcher at the Institute for Peace Research and Security Policy at the University of Hamburg.

Michal Smetana

Michal Smetana is an Associate Professor at the Faculty of Social Sciences, Charles University, and the Head of the Peace Research Center Prague.

Marek Vranka

Marek Vranka is an Assistant Professor at the Faculty of Social Sciences, Charles University, and a Researcher at the Peace Research Center Prague.