Reference list
- 2017 Open Letter to the United Nations Convention on Certain Conventional Weapons. (2017). Retrieved from https://www.dropbox.com/s/g4ijcaqq6ivq19d/2017%20Open%20Letter%20to%20the%20United%20Nations%20Convention%20on%20Certain%20Conventional%20Weapons.pdf?dl=0
- Adler, E. (1992). The emergence of cooperation: National epistemic communities and the international evolution of the idea of nuclear arms control. International Organization, 46, 101–145. doi: 10.1017/S0020818300001466
- Allen, G., & Kania, E. (2017, September 8). China is using America’s own plan to dominate the future of artificial intelligence. Foreign Policy. Retrieved from https://foreignpolicy.com/2017/09/08/china-is-using-americas-own-plan-to-dominate-the-future-of-artificial-intelligence/
- Allison, G. (2010, January/February). Nuclear disorder. Foreign Affairs. Retrieved from https://www.foreignaffairs.com/articles/pakistan/2010-01-01/nuclear-disorder
- Amodei, D., & Hernandez, D. (2018, May 16). AI and compute. Retrieved from https://blog.openai.com/ai-and-compute/
- Auslin, M. (2018, October 19). Can the Pentagon win the AI arms race? Foreign Affairs. Retrieved from https://www.foreignaffairs.com/articles/united-states/2018-10-19/can-pentagon-win-ai-arms-race
- Ayoub, K., & Payne, K. (2016). Strategy in the age of artificial intelligence. Journal of Strategic Studies, 39, 793–819. doi: 10.1080/01402390.2015.1088838
- Baratta, J. P. (2004). The politics of world federation. Vol. 1: The United Nations, U.N. reform, atomic control. Westport, CT: Praeger.
- Barnes, J. E., & Chin, J. (2018, March 2). The new arms race in AI. Wall Street Journal. Retrieved from https://www.wsj.com/articles/the-new-arms-race-in-ai-1520009261
- Baum, S., de Neufville, R., & Barrett, A. (2018). A model for the probability of nuclear war (SSRN Scholarly Paper No. ID 3137081). Rochester, NY: Social Science Research Network. Retrieved from https://papers.ssrn.com/abstract=3137081
- Borghard, E. D., & Lonergan, S. W. (2018, January 16). Why are there no cyber arms control agreements? Retrieved from https://www.cfr.org/blog/why-are-there-no-cyber-arms-control-agreements
- Borrie, J. (2014). A limit to safety: Risk, “normal accidents”, and nuclear weapons. ILPI-UNIDIR Vienna Conference Series. Retrieved from http://www.isn.ethz.ch/Digital-Library/Publications/Detail/?ots591=0c54e3b3-1e9c-be1e-2c24-a6a8c7060233&lng=en&id=186094
- Borrie, J. (2016). Safety, unintentional risk and accidents in the weaponization of increasingly autonomous technologies (UNIDIR Resources No. 5). Geneva: UNIDIR. Retrieved from http://www.unidir.org/files/publications/pdfs/safety-unintentional-risk-and-accidents-en-668.pdf
- Boulanin, V., & Verbruggen, M. (2017). Mapping the development of autonomy in weapon systems. Stockholm: Stockholm International Peace Research Institute. Retrieved from https://www.sipri.org/sites/default/files/2017-11/siprireport_mapping_the_development_of_autonomy_in_weapon_systems_1117_1.pdf
- Carranza, M. E. (2018). Deterrence or taboo? Explaining the non-use of nuclear weapons during the Indo-Pakistani post-tests nuclear crises. Contemporary Security Policy, 39, 441–463. doi: 10.1080/13523260.2017.1418725
- Cave, S., & Ó hÉigeartaigh, S. S. (2018). An AI race for strategic advantage: Rhetoric and risks. In AAAI / ACM Conference on Artificial Intelligence, Ethics and Society. Retrieved from http://www.aies-conference.com/wp-content/papers/main/AIES_2018_paper_163.pdf
- China’s State Council. (2017). A next generation artificial intelligence development plan (R. Creemers, G. Webster, P. Triolo, & E. Kania, Trans.). New America Cybersecurity Initiative. Retrieved from https://na-production.s3.amazonaws.com/documents/translation-fulltext-8.1.17.pdf
- Crootof, R. (2015). A meaningful floor for “meaningful human control”. Temple International and Comparative Law Journal, 30, 53–62.
- Dafoe, A. (2018). AI governance: A research agenda. Oxford: Governance of AI Program, Future of Humanity Institute. Retrieved from https://www.fhi.ox.ac.uk/govaiagenda/
- Danzig, R. (2018). Technology roulette: Managing loss of control as many militaries pursue technological superiority. Washington, DC: Center for a New American Security. Retrieved from https://s3.amazonaws.com/files.cnas.org/documents/CNASReport-Technology-Roulette-DoSproof2v2.pdf?mtime=20180628072101
- Davison, N. (2017). A legal perspective: Autonomous weapon systems under international humanitarian law. In Perspectives on lethal autonomous weapon systems (pp. 5–18). New York: UNODA.
- Deep Cuts Commission. (2018, March 19). Urgent steps to avoid a new nuclear arms race and dangerous miscalculation – Statement of the deep cuts commission. Retrieved from https://www.armscontrol.org/sites/default/files/files/documents/DCC_1804018_FINAL.pdf
- Drum, K. (2018, August). Tech world: Welcome to the digital revolution. Foreign Affairs.
- European Parliament. (2018, September 12). P8_TA-PROV(2018)0341: European Parliament resolution of 12 September 2018 on autonomous weapon systems (2018/2752(RSP)). European Parliament. Retrieved from http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML+TA+P8-TA-2018-0341+0+DOC+PDF+V0//EN
- Fatton, L. P. (2016). The impotence of conventional arms control: Why do international regimes fail when they are most needed? Contemporary Security Policy, 37, 200–222. doi: 10.1080/13523260.2016.1187952
- Fischer, S.-C. (2017). The role of the private sector in the governance of autonomous weapon systems: A principal-agent perspective. Presented at the We Robot 2017. Retrieved from http://www.werobot2017.com/wp-content/uploads/2017/03/Fischer-The-role-of-the-private-sector-in-the-governance-of-autonomous-weapon-systems-1.pdf
- Fuhrmann, M., & Lupu, Y. (2016). Do arms control treaties work? Assessing the effectiveness of the nuclear nonproliferation treaty. International Studies Quarterly, 60, 530–539. doi: 10.1093/isq/sqw013
- Garnett, S. W. (2012). The “model” of Ukrainian denuclearization. In J. W. Knopf (Ed.), Security assurances and nuclear nonproliferation (pp. 246–274). Stanford, CA: Stanford University Press.
- Geist, E., & Lohn, A. J. (2018). How might artificial intelligence affect the risk of nuclear war? Santa Monica, CA: RAND Corporation. Retrieved from https://www.rand.org/pubs/perspectives/PE296.html
- Geist, E. M. (2016). It’s already too late to stop the AI arms race—We must manage it instead. Bulletin of the Atomic Scientists, 72, 318–321. doi: 10.1080/00963402.2016.1216672
- Goodfellow, I., Papernot, N., Huang, S., Duan, Y., Abbeel, P., & Clark, J. (2017, February 16). Attacking machine learning with adversarial examples. Retrieved from https://openai.com/blog/adversarial-example-research/
- Guterres, A. (2018, November). Allocution du Secrétaire général au Forum de Paris sur la paix [Address by the Secretary-General at the Paris Peace Forum]. Paris. Retrieved from https://www.un.org/sg/en/content/sg/statement/2018-11-11/allocution-du-secr%C3%A9taire-g%C3%A9n%C3%A9ral-au-forum-de-paris-sur-la-paix
- Haas, P. M. (1992). Introduction: Epistemic communities and international policy coordination. International Organization, 46(1), 1–35. doi: 10.1017/S0020818300001442
- Hogarth, I. (2018, June 13). AI nationalism. Retrieved from https://www.ianhogarth.com/blog/2018/6/13/ai-nationalism
- Horowitz, M. C. (2016). Public opinion and the politics of the killer robots debate. Research & Politics, 3(1). doi: 10.1177/2053168015627183
- Horowitz, M. C. (2018a, May 15). Artificial intelligence, international competition, and the balance of power. Texas National Security Review. Retrieved from https://tnsr.org/2018/05/artificial-intelligence-international-competition-and-the-balance-of-power/
- Horowitz, M. C. (2018b, September 12). The algorithms of August. Foreign Policy. Retrieved from https://foreignpolicy.com/2018/09/12/will-the-united-states-lose-the-artificial-intelligence-arms-race/
- Horowitz, M. C., & Scharre, P. (2015). Meaningful human control in weapon systems: A primer. Center for a New American Security. Retrieved from https://s3.amazonaws.com/files.cnas.org/documents/Ethical_Autonomy_Working_Paper_031315.pdf?mtime=20160906082316
- Human Rights Watch. (2012). Losing humanity: The case against killer robots. Amsterdam/Berlin: Human Rights Watch. Retrieved from https://www.hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf
- Human Rights Watch. (2016, April 11). Killer robots and the concept of meaningful human control. Retrieved from https://www.hrw.org/news/2016/04/11/killer-robots-and-concept-meaningful-human-control
- Hwang, T. (2018). Computational power and the social impact of artificial intelligence (SSRN Scholarly Paper No. ID 3147971). Rochester, NY: Social Science Research Network. Retrieved from https://papers.ssrn.com/abstract=3147971
- IPSOS. (2019). Six in ten (61%) respondents across 26 countries oppose the use of lethal autonomous weapons systems. Retrieved from https://www.ipsos.com/sites/default/files/ct/news/documents/2019-01/human-rights-watch-autonomous-weapons-pr-01-22-2019_0.pdf
- Jo, D.-J., & Gartzke, E. (2007). Determinants of nuclear weapons proliferation. Journal of Conflict Resolution, 51, 167–194. doi: 10.1177/0022002706296158
- Jones, P. F. (1969, October 22). Goldsboro revisited: Account of hydrogen bomb near-disaster over North Carolina – declassified document. Retrieved from http://www.theguardian.com/world/interactive/2013/sep/20/goldsboro-revisited-declassified-document
- Joshi, S. (2019, January 17). Autonomous weapons and the new laws of war. The Economist. Retrieved from https://www.economist.com/briefing/2019/01/19/autonomous-weapons-and-the-new-laws-of-war
- Kahn, H. (1969). The missile defense debate in perspective. In J. J. Holst & W. Schneider (Eds.), Why ABM: Policy issues in the missile defense controversy (pp. 285–294). Pergamon. doi: 10.1016/B978-0-08-015625-5.50017-1
- Kania, E. (2018, April 19). The pursuit of AI is more than an arms race. Defense One. Retrieved from https://www.defenseone.com/ideas/2018/04/pursuit-ai-more-arms-race/147579/
- Kania, E. B. (2017). Battlefield singularity: Artificial intelligence, military revolution, and China’s future military power. Washington, DC: Center for a New American Security. Retrieved from https://s3.amazonaws.com/files.cnas.org/documents/Battlefield-Singularity-November-2017.pdf?mtime=20171129235804
- Knopf, J. W. (2018). After diffusion: Challenges to enforcing nonproliferation and disarmament norms. Contemporary Security Policy, 39, 367–398. doi: 10.1080/13523260.2018.1431446
- Krasner, S. D. (1982). Structural causes and regime consequences: Regimes as intervening variables. International Organization, 36, 185–205. doi: 10.1017/S0020818300018920
- Kroenig, M., & Gopalaswamy, B. (2018, November 12). Will disruptive technology cause nuclear war? Retrieved from https://thebulletin.org/2018/11/will-disruptive-technology-cause-nuclear-war/
- Lantis, J. S. (2018). Nuclear cooperation with non-NPT member states? An elite-driven model of norm contestation. Contemporary Security Policy, 39, 399–418. doi: 10.1080/13523260.2017.1398367
- Lee, K.-F. (2018). AI superpowers: China, Silicon Valley, and the new world order. Boston: Houghton Mifflin Harcourt.
- Letter to Google C.E.O. (2018). Retrieved from https://static01.nyt.com/files/2018/technology/googleletter.pdf
- Lewis, P., Williams, H., Pelopidas, B., & Aghlani, S. (2014). Too close for comfort: Cases of near nuclear use and options for policy. London: Chatham House.
- Lieber, K. A., & Press, D. G. (2017). The new era of counterforce: Technological change and the future of nuclear deterrence. International Security, 41(4), 9–49. doi: 10.1162/ISEC_a_00273
- Maas, M. (2018). Regulating for ‘normal AI accidents’—Operational lessons for the responsible governance of AI deployment. Presented at the AAAI / ACM Conference on Artificial Intelligence, Ethics and Society, New Orleans: Association for the Advancement of Artificial Intelligence. Retrieved from http://www.aies-conference.com/wp-content/papers/main/AIES_2018_paper_118.pdf
- Maas, M., Sweijs, T., & De Spiegeleire, S. (2017). Artificial intelligence and the future of defense: Strategic implications for small- and medium-sized force providers. The Hague, The Netherlands: The Hague Centre for Strategic Studies. Retrieved from http://hcss.nl/report/artificial-intelligence-and-future-defense
- MacKenzie, D., & Spinardi, G. (1995). Tacit knowledge, weapons design, and the uninvention of nuclear weapons. American Journal of Sociology, 101(1), 44–99. doi: 10.1086/230699
- McNamara, R. (1963, February 12). The diffusion of nuclear weapons with and without a test ban agreement [Memorandum for the President]. Washington, DC: Department of Defense. Digital National Security Archive (DNSA), document no. NP00941.
- Müller, H., & Schmidt, A. (2010). The little known story of de-proliferation: Why states give up nuclear weapon activities. In W. C. Potter & G. Mukhatzhanova (Eds.), Forecasting nuclear proliferation in the 21st century: The role of theory (Vol. 1, pp. 124–158). Stanford, CA: Stanford University Press.
- Bhuta, N., Beck, S., Geiss, R., Liu, H.-Y., & Kress, C. (Eds.). (2016). Autonomous weapons systems: Law, ethics, policy. Cambridge: Cambridge University Press.
- Open Roboethics Initiative. (2015). The ethics and governance of lethal autonomous weapons systems: An international public opinion poll. Open Roboethics Initiative. Retrieved from http://www.openroboethics.org/wp-content/uploads/2015/11/ORi_LAWS2015.pdf
- Payne, K. (2018a). Artificial intelligence: A revolution in strategic affairs? Survival, 60(5), 7–32. doi: 10.1080/00396338.2018.1518374
- Payne, K. (2018b). Strategy, evolution, and war: From apes to artificial intelligence. Washington, DC: Georgetown University Press. Retrieved from http://ebookcentral.proquest.com/lib/kbdk/detail.action?docID=5394997
- Pelopidas, B. (2011). The oracles of proliferation: How experts maintain a biased historical reading that limits policy innovation. The Nonproliferation Review, 18, 297–314. doi: 10.1080/10736700.2011.549185
- Perrow, C. (1984). Normal accidents: Living with high risk technologies. Retrieved from http://press.princeton.edu/titles/6596.html
- Picker, C. B. (2001). A view from 40,000 feet: International law and the invisible hand of technology. Cardozo Law Review, 23, 151–219.
- Pinker, S. (2011). The better angels of our nature: Why violence has declined. New York, NY: Viking.
- Putin, V. (2017, September 1). Открытый урок «Россия, устремлённая в будущее» [Open lesson “Russia moving towards the future”]. Retrieved from http://kremlin.ru/events/president/news/55493
- Putin, V. (2018). 2018 presidential address to the federal assembly. President of Russia. Retrieved from http://en.kremlin.ru/events/president/news/56957
- Rickli, J.-M. (2017). Artificial intelligence and the future of warfare. In WEF global risks report 2017 (p. 49). Retrieved from http://www3.weforum.org/docs/GRR17_Report_web.pdf
- Roff, H. M. (2014). The strategic robot problem: Lethal autonomous weapons in war. Journal of Military Ethics, 13, 211–227. doi: 10.1080/15027570.2014.975010
- Roff, H. M. (2017, February 8). What do people around the world think about killer robots? Slate. Retrieved from http://www.slate.com/articles/technology/future_tense/2017/02/what_do_people_around_the_world_think_about_killer_robots.html
- Rublee, M. R., Bertsch, G., & Wiarda, H. (2009). Nonproliferation norms: Why states choose nuclear restraint. Athens, GA: University of Georgia Press.
- Rublee, M. R., & Cohen, A. (2018). Nuclear norms in global governance: A progressive research agenda. Contemporary Security Policy, 39, 317–340. doi: 10.1080/13523260.2018.1451428
- Sagan, S. D. (1993). The limits of safety: Organizations, accidents, and nuclear weapons. Princeton, NJ: Princeton University Press.
- Sagan, S. D. (1996). Why do states build nuclear weapons?: Three models in search of a bomb. International Security, 21(3), 54–86. doi: 10.2307/2539273
- Sagan, S. D. (2011). The causes of nuclear weapons proliferation. Annual Review of Political Science, 14, 225–244. doi: 10.1146/annurev-polisci-052209-131042
- Sanger, D. E., & Broad, W. J. (2018, October 20). U.S. to tell Russia it is leaving landmark I.N.F. treaty. The New York Times. Retrieved from https://www.nytimes.com/2018/10/19/us/politics/russia-nuclear-arms-treaty-trump-administration.html
- Scharre, P. (2016a). Autonomous weapons and operational risk (Ethical Autonomy Project. 20YY Future of Warfare Initiative). Washington, DC: Center for a New American Security. Retrieved from https://s3.amazonaws.com/files.cnas.org/documents/CNAS_Autonomous-weapons-operational-risk.pdf
- Scharre, P. (2016b, April). Flash war – Autonomous weapons and strategic stability. Presented at the Understanding Different Types of Risk, Geneva. Retrieved from http://www.unidir.ch/files/conferences/pdfs/-en-1-1113.pdf
- Scharre, P. (2018a). Army of none: Autonomous weapons and the future of war (1st ed.). New York, NY: W. W. Norton & Company.
- Scharre, P. (2018b, September 12). A million mistakes a second. Foreign Policy. Retrieved from https://foreignpolicy.com/2018/09/12/a-million-mistakes-a-second-future-of-war/
- Scherer, M. U. (2016). Regulating artificial intelligence systems: Risks, challenges, competencies, and strategies. Harvard Journal of Law & Technology, 29(2), 353–400. Retrieved from http://jolt.law.harvard.edu/articles/pdf/v29/29HarvJLTech353.pdf
- Schlosser, E. (2014). Command and control: Nuclear weapons, the Damascus accident, and the illusion of safety (Reprint ed.). New York, NY: Penguin Books.
- Schneider, J. (2016). Digitally-enabled warfare: The capability-vulnerability paradox (p. 15). Washington, DC: Center for a New American Security. Retrieved from https://www.cnas.org/publications/reports/digitally-enabled-warfare-the-capability-vulnerability-paradox
- Shultz, G. P. (1984, November 1). Preventing the proliferation of nuclear weapons. Washington, DC: U.S. Department of State, Bureau of Public Affairs, Office of Public Communication, Editorial Division.
- Solingen, E. (1994). The political economy of nuclear restraint. International Security, 19(2), 126–159. doi: 10.2307/2539198
- Thompson, N., & Bremmer, I. (2018, October 23). The AI cold war that threatens us all. Wired. Retrieved from https://www.wired.com/story/ai-cold-war-china-could-doom-us-all/
- van der Meer, S. (2011). Not that bad: Looking back on 65 years of nuclear non-proliferation efforts. Security and Human Rights, 22(1), 37–47. doi: 10.1163/187502311796365862
- van der Meer, S. (2014). Forgoing the nuclear option: States that could build nuclear weapons but chose not to do so. Medicine, Conflict and Survival, 30(S1), s27–s34. doi: 10.1080/13623699.2014.930238
- Verbruggen, M. (2018, November). Breaking out of the silos: The need for a whole-of-disarmament approach to arms control of AI. Conference talk presented at Beyond Killer Robots: Networked Artificial Intelligence Disrupting the Battlefield, Copenhagen, Denmark.
- West, D. M. (2018, August 29). Brookings survey finds divided views on artificial intelligence for warfare, but support rises if adversaries are developing it. Retrieved from https://www.brookings.edu/blog/techtank/2018/08/29/brookings-survey-finds-divided-views-on-artificial-intelligence-for-warfare-but-support-rises-if-adversaries-are-developing-it/
- Williamson, R. (2003). Hard law, soft law, and non-law in multilateral arms control: Some compliance hypotheses. Chicago Journal of International Law, 4(1). Retrieved from https://chicagounbound.uchicago.edu/cjil/vol4/iss1/7
- Work, B. (2015, December 14). Deputy Secretary of Defense Bob Work’s speech at the CNAS defense forum. Retrieved from http://www.defense.gov/News/Speeches/Speech-View/Article/634214/cnas-defense-forum
- Yusuf, M. (2009). Predicting proliferation: The history of the future of nuclear weapons. Washington, DC: Brookings Institution.
- Zhang, B., & Dafoe, A. (2019). Artificial intelligence: American attitudes and trends. Oxford: Center for the Governance of AI, Future of Humanity Institute, University of Oxford. Retrieved from https://governanceai.github.io/US-Public-Opinion-Report-Jan-2019/
- Zwetsloot, R., Toner, H., & Ding, J. (2018, November 16). Beyond the AI Arms Race: America, China, and the dangers of zero-sum thinking. Foreign Affairs. Retrieved from https://www.foreignaffairs.com/reviews/review-essay/2018-11-16/beyond-ai-arms-race