Amos Perlmutter Prize Essay

A matter of time: On the transitory nature of cyberweapons

 

ABSTRACT

This article examines the transitory nature of cyberweapons. Shedding light on this highly understudied facet is important both for grasping how cyberspace affects international security and for supporting policymakers’ efforts to make sound decisions regarding the deployment of cyberweapons. First, laying out the life cycle of a cyberweapon, I argue that these offensive capabilities differ both in ‘degree’ and in ‘kind’ from other weapons with regard to their temporary ability to cause harm or damage. Second, I develop six propositions indicating that differences in transitoriness between cyberweapons are explained not only by technical features inherent to the different types of cyber capabilities – that is, the type of exploited vulnerability, access and payload – but also by offender and defender characteristics. Finally, drawing out the implications, I reveal that the transitory nature of cyberweapons benefits great powers, changes the incentive structure for offensive cyber cooperation and induces a different funding structure for (military) cyber programs compared with conventional weapon programs. I also note that the time-dependent dynamic underlying cyberweapons potentially explains the limited deployment of cyberweapons compared to espionage capabilities.


Acknowledgements

For written comments on early drafts, the author is indebted to Graham Fairclough, Trey Herr, Lucas Kello, Joseph Nye Jr., Taylor Roberts, James Shires, and an anonymous reviewer. An earlier version of this paper was presented at ISA, Atlanta (2016), and the IR Colloquium at the University of Oxford (2016).

Disclosure statement

No potential conflict of interest was reported by the author.

Notes

1 As cyberweapons contain non-physical elements, it makes more sense to talk about a cyberweapon as a capability rather than a tool or instrument.

2 One should not confuse the concept with the notion that the effects of the weapon are potentially temporary in nature.

3 Indeed, the latest developed nuclear weapons of the United States are about 25 years old.

4 According to Libicki, the transitory nature of cyberweapons leads to less trigger-happy actors: ‘like surprise, it is best saved for when it is most needed’. Krepinevich directly opposes this view, arguing that it creates a ‘use-it-or-lose-it dynamic’ and might encourage a cyber power to launch an attack before its advantage is lost. Axelrod and Iliev reconcile the contrasting views and argue that the degree to which a cyberweapon incentivises a ‘use-it-or-lose-it dynamic’ or a ‘waiting-for-the-right-moment dynamic’ depends on the type of capability and whether the stakes remain constant. See: Martin C. Libicki, Conquest in Cyberspace: National Security and Information Warfare (Cambridge: Cambridge University Press 2007), 87; Andrew Krepinevich, ‘Cyber Warfare: a “nuclear option”?’, Center for Strategic and Budgetary Assessments, 2012, <http://www.csbaonline.org/wp-content/uploads/2012/08/CSBA_Cyber_Warfare_For_Web_1.pdf>; Robert Axelrod and Rumen Iliev, ‘Timing of cyber conflict’, PNAS, 111/4 (2014), 1298–1303.

5 According to Gartzke, it reduces the incentives to invest in ‘cyberwar assets’. Erik Gartzke, ‘The Myth of Cyberwar’, International Security, 38/2 (2013), 41–73, 59–60. Also see: James A. Lewis, ‘Conflict and Negotiation in Cyberspace’, The Technology and Public Policy Program, 2013, <http://csis.org/files/publication/130208_Lewis_ConflictCyberspace_Web.pdf>.

6 For a similar point see: David A. Baldwin, ‘The Concept of Security’, Review of International Studies 23 (1997), 5–26.

7 In statistics, this would be called omitted-variable bias.

8 The work of the National Academy of Sciences is a good example of this trend. It states that cyberweapons have three characteristics that differentiate them from traditional kinetic weapons. First, ‘they are easy to use with high degrees of anonymity and with plausible deniability, making them well suited for covert operations and for instigating conflict between other parties’. Second, they ‘are more uncertain in the outcomes they produce, making it difficult to estimate deliberate and collateral damage’. And, third, they ‘involve a much larger range of options and possible outcomes, and may operate on time scales ranging from tenths of a second to years, and at spatial scales anywhere from “concentrated in a facility next door” to globally dispersed’. The study leaves out any discussion of the notion of transitoriness. See: William A. Owens, Kenneth W. Dam and Herbert S. Lin (eds.), ‘Excerpts from Technology, Policy, Law and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities’, National Research Council, 2009; S-1, Sections 1.4 and 2.1.

9 Lewis, ‘Conflict and Negotiation in Cyberspace’; Krepinevich, ‘Cyber Warfare’.

10 Leyla Bilge and Tudor Dumitras, ‘Before we knew it: an empirical study of zero-day attacks in the real world’, CCS ’12, Oct. 2012; Ratinder Kaur and Maninder Singh, ‘A Survey on Zero-Day Polymorphic Worm Detection Techniques’, IEEE Communications Surveys & Tutorials 16/3 (2014).

11 A more detailed discussion on this issue will be provided below.

12 Gartzke, ‘The Myth of Cyberwar’

13 Andrew Sweeting, ‘Equilibrium Price Dynamics in Perishable Goods Markets: The Case of Secondary Markets for Major League Baseball Tickets’, NBER Working Paper 14505 (2008).

14 As a former U.S. executive at a defence contractor said to a reporter from Reuters: ‘My job was to have 25 zero-days on a USB stick, ready to go’. See: Joseph Menn, ‘Special Report: U.S. cyberwar strategy stokes fear of blowback’, Reuters, May 2013, <http://www.reuters.com/article/2013/05/10/us-usa-cyberweapons-specialreport-idUSBRE9490EL20130510>.

15 Axelrod and Iliev, ‘Timing of Cyber Conflict’; It follows the model developed in: Robert Axelrod, ‘The Rational Timing of Surprise’, World Politics 31/2 (1979), 228–246.

16 Collins English Dictionary (online), ‘transitory’, <http://www.collinsdictionary.com/dictionary/English>.

17 Random House Webster’s Unabridged Dictionary (online), ‘transitory’, <http://dictionary.reference.com/browse/transitory>.

18 John Keegan, A History of Warfare (London: Random House 1994).

19 Philip E. Auerswald, Christian Duttweiler, and John Garofano, Clinton’s Foreign Policy: A Documentary Record (The Hague: Kluwer Law International 2003), 73.

20 The United States Department of Energy, for example, has set up the ‘Stockpile Stewardship and Management Program’ with the aim of maintaining a reliable stockpile, using various simulations and applications from the scientific community to deal with the particular issue of an aging capability.

21 The literature on International Relations and military history contains numerous references to the offensive or defensive balance of military technology. Yet, these discussions have focused on the degree to which the offence has an advantage over the defence – and the strategic implications thereof. Scholars rarely focus on how specific defensive measures catch up with offensive measures. For an overview see: Jack S. Levy, ‘The Offensive/Defensive Balance of Military Technology: A Theoretical and Historical Analysis’, International Studies Quarterly 28 (1984), 219–238.

22 According to Shachtman and Singer, ‘[c]yberspace is a man-made domain of technological commerce and communication, not a geographical chessboard of competing alliances’. Noah Shachtman and Peter W. Singer, ‘The Wrong War: The Insistence on Applying Cold War Metaphors to Cybersecurity Is Misplaced and Counterproductive’, Brookings Institute, Aug. 2011, <http://www.brookings.edu/research/articles/2011/08/15-cybersecurity-singer-shachtman>. For cyber strategies, see for example: Presidency of the Council of Ministers Italy, ‘National Strategic Framework for the Security of Cyberspace’, Dec. 2013, <http://www.sicurezzanazionale.gov.it/sisr.nsf/wp-content/uploads/2014/02/italian-national-strategic-framework-for-cyberspace-security.pdf>.

23 Michael V. Hayden, ‘The Future of Things Cyber’, Strategic Studies Quarterly 5/1 (2011)

24 See Dorothy E. Denning, ‘Rethinking the Cyber Domain and Deterrence’, JFQ 77 (2015).

25 As Robert Bartlett states, ‘[a]nyone who understands how to read and write code is capable of rewriting the instructions that define the possible’. Robert Bartlett, ‘Developments in the Law-The Law of Cyberspace’, Harvard Law Review 112/1574 (1999), 1635.

26 Martin C. Libicki, ‘Cyberspace Is Not a Warfighting Domain’, A Journal of Law and Policy for the Information Society 8/2 (2012), 326; As the discussion below on the window of vulnerability indicates, these ‘corrections’ can take place both before and after a cyberattack.

27 Bruce Schneier, ‘Crypto-Gram’, Sep. 2000, <https://www.schneier.com/crypto-gram/archives/2000/0915.html>.

28 Others have referred to the ‘window of exposure’ as the ‘lifecycle of a vulnerability’. I use these terms interchangeably (including the term the ‘life cycle of a cyberweapon’s effectiveness’ as well). Stefan Frei, Bernhard Tellenbach, and Bernhard Plattner, ‘0-Day Patch: Exposing Vendors (In)security Performance’, BlackHat Europe, 2008, <https://www.blackhat.com/presentations/bh-europe-08/Frei/Whitepaper/bh-eu-08-frei-WP.pdf>; Adrian Pauna and Konstantinos Moulinos, ‘Window of exposure… a real problem for SCADA systems? Recommendations for Europe on SCADA patching’, European Union Agency for Network and Information Security Publication, Dec. 2013.

29 Yet, some ordering is fixed. The introduction of a vulnerability always precedes (or coincides with) its exploitation. And the release of a patch can only occur after the vendor has become aware of the vulnerability. For a more detailed discussion see the recent study by: Antonio Nappa, Richard Johnson, Leyla Bilge, Juan Caballero, and Tudor Dumitras, ‘The Attack of the Clones: A Study of the Impact of Shared Code on Vulnerability Patching’, IEEE Symposium on Security and Privacy, 2015.

30 Robert F. Dacey, ‘Information security progress made, but challenges remain to protect federal systems and the nation’s critical infrastructures’, Government Accountability Office, 2003, <http://world.std.com/~goldberg/daceysecurity.pdf>.

31 Sam Ransbotham, Sabyasachi Mitra, and Jon Ramsey, ‘Are Markets for Vulnerabilities Effective?’ ICIS, 2008, <http://aisel.aisnet.org/cgi/viewcontent.cgi?article=1192&context=icis2008>.

32 Frei, Tellenbach, and Plattner note that the disclosure of a vulnerability has been defined in a number of ways, ranging from ‘made public to wider audience’ and ‘made public through forums or by vendor’ to ‘made public by anyone before vendor releases a patch’. See Frei, Tellenbach, and Plattner, ‘0-Day Patch’.

33 Ashish Arora, Ramayya Krishnan, Anand Nandkumar, Rahul Telang and Yubao Yang, ‘Impact of Vulnerability Disclosure and Patch Availability – An Empirical Analysis’, Workshop on the Economics of Information Security, 2004; Hasan Cavusoglu, Huseyin Cavusoglu, and Srinivasan Raghunathan, ‘Efficiency of Vulnerability Disclosure Mechanisms to Disseminate Vulnerability Knowledge’, IEEE Transactions on Software Engineering 33/3 (2007), 171–185.

34 Frei and Plattner, ‘0-Day Patch’

35 Patches must be tested before being applied to the production environment to make sure that they work properly and do not conflict with other applications in the system. Hasan Cavusoglu, Huseyin Cavusoglu, and Jun Zhang, ‘Security Patch Management: Share the Burden or Share the Damage?’ Management Science 54/4 (2008), 657–670.

36 Steve Beattie, Seth Arnold, Crispin Cowan, Perry Wagle, and Chris Wright, ‘Timing the Application of Security Patches for Optimal Uptime’, LISA XVI, Nov. 2002; Also see: Ross Anderson and Tyler Moore, ‘The Economics of Information Security’, Science 314/5799 (2006), 610–613.

37 Greg Shipley, ‘Painless (well, almost) patch management procedures’, Network Computing, 2004, <http://www.networkcomputing.com/showitem.jhtml?docid=1506f1>.

38 In an average week, vendors and security organisations announce around 150 vulnerabilities along with information on how to fix them. Cavusoglu, Cavusoglu, and Zhang, ‘Security Patch Management’.

39 Gary Armstrong, Stewart Adam, Sara Denize, Philip Kotler, Principles of Marketing (Melbourne: Pearson 2015).

40 The worm was also special in that it was the first time a worm was released in the wild through a bot network of about 100 infected machines. This meant that every available host was infected very quickly. Bruce Schneier, ‘The Witty Worm: A New Chapter in Malware’, Computer World, Jun. 2004, <http://www.computerworld.com/article/2565119/malware-vulnerabilities/the-witty-worm–a-new-chapter-in-malware.html>.

41 Most research assumes a linear model for the cyberweapon life cycle, which is critiqued in more recent scholarship. As the goal of this article is not to estimate the exploitation of a vulnerability or the average deployment of a patch, I do not make any unnecessary assumptions about this. William A. Arbaugh, William L. Fithen, and John McHugh, ‘Windows of vulnerability: A case study analysis’, IEEE Computer 33/12 (2000); Hamed Okhravi and David Nicol, ‘Evaluation of patch management strategies’, International Journal of Computational Intelligence: Theory and Practice 3/2 (2008), 109–117; Terry Ramos, ‘The Laws of Vulnerabilities’, RSA Conference, Feb. 2006.

42 Bilge and Dumitras, ‘Before We Knew It’.

43 The three events are not collapsed if the chipmaker itself has introduced the backdoor. Although still unconfirmed, this was likely the case with the backdoor in a computer chip used in military systems and aircraft, discovered by two experts from Cambridge University. See: Charles Arthur, ‘Cyberattack concerns raised over Boeing 787 chip’s “back door”’, The Guardian, May 2012, <http://www.theguardian.com/technology/2012/may/29/cyber-attack-concerns-boeing-chip>.

44 Package management systems can offer various degrees of patch automation. Completely automatic updates are still rife with problems, so they are not widely adopted. Cavusoglu, Cavusoglu and Zhang, ‘Security Patch Management’.

45 For a more detailed discussion on the degree to which the time between these different effects determines the security risk exposure, see: Frei and Plattner, ‘0-Day Patch’.

46 Hence, I would argue that a better way to model the transitory nature of a cyberweapon is along the lines of the Black-Scholes model of options pricing in finance. A cyberweapon is analogous to what is called an ‘American option’. The value of usage could be modelled as a ‘Brownian motion’ with random crashes representing the use of the weapon by others.
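To make the analogy concrete, the following is a minimal simulation sketch of the dynamic described in this note, not the author’s formal model: the usable value of a cyberweapon drifts as a geometric Brownian motion until a random ‘crash’ event (another actor using or disclosing the same vulnerability) drives it to zero. The drift, volatility and crash-rate parameters are hypothetical.

```python
import numpy as np

# Illustrative sketch only: the value of a cyberweapon as a geometric Brownian
# motion with random 'crash' events that burn the underlying vulnerability.
# All parameter values are hypothetical.
rng = np.random.default_rng(seed=1)

T, steps = 2.0, 500                 # two-year horizon, 500 time steps
dt = T / steps
mu, sigma = 0.05, 0.30              # assumed drift and volatility of the weapon's value
crash_rate = 0.5                    # assumed rate of 'burning' events per year

value = np.zeros(steps + 1)
value[0] = 1.0                      # normalised initial value

for t in range(1, steps + 1):
    if rng.random() < crash_rate * dt:
        break                       # vulnerability burned: value stays at zero from here on
    dW = rng.normal(0.0, np.sqrt(dt))
    value[t] = value[t - 1] * np.exp((mu - 0.5 * sigma ** 2) * dt + sigma * dW)

print(f"Value after {T} years: {value[-1]:.3f}")
```

Pricing the optimal moment of use would then resemble valuing an American option on this process, which is the ‘waiting-for-the-right-moment’ versus ‘use-it-or-lose-it’ trade-off discussed in note 4.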

47 John H. Cochrane, ‘Permanent and Transitory Components of GNP and Stock Prices’, The Quarterly Journal of Economics 109/1 (1994), 241–265.

48 John Y. Campbell and N. Gregory Mankiw, ‘Permanent and Transitory Components in Macroeconomic Fluctuations’, NBER Working Paper 2169 (1987).

49 Christa Frei and Alfonso Sousa-Poza, ‘Overqualification: permanent or transitory?’, Applied Economics 44/14 (2012).

50 In developing these propositions I follow Lin in asserting that a successful cyberattack requires three elements: (a) a vulnerability, (b) access to the vulnerability (i.e. an access vector) and (c) a payload to be executed (i.e. malicious code). Herr’s application has usefully clarified, using a variety of examples, that these three conditions for cyberattacks translate into ‘cyberweapons’. Herbert S. Lin, ‘Escalation Dynamics and Conflict Termination in Cyberspace’, Strategic Studies Quarterly 6/3 (2012), 46–70; Herbert S. Lin, ‘Offensive Cyber Operations and the Use of Force’, Journal of National Security Law and Policy 4/63 (2010), 63–86; Owens, Dam, and Lin, ‘Technology, Policy, Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities’; Trey Herr, ‘PrEP: A Framework for Malware & Cyber Weapons’, Cyber Security and Research Institute (2014), 2.
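As a minimal illustration of how these three elements fit together, the following sketch treats a cyberweapon as a simple data structure; the class and field names, and the example values, are hypothetical rather than a formal schema from Lin or Herr.

```python
from dataclasses import dataclass

# Illustrative sketch: a cyberweapon as the combination of a vulnerability,
# an access vector and a payload. Names and example values are hypothetical.
@dataclass
class Cyberweapon:
    vulnerability: str   # the flaw being exploited (e.g. an unpatched software bug)
    access_vector: str   # how the target is reached (e.g. spear-phishing, USB, network)
    payload: str         # the malicious code executed once access is obtained

    def is_complete(self) -> bool:
        """The capability only 'works' when all three elements are present."""
        return all([self.vulnerability, self.access_vector, self.payload])

example = Cyberweapon(
    vulnerability="unpatched memory-corruption bug",
    access_vector="spear-phishing email with a crafted attachment",
    payload="code that disrupts an industrial control process",
)
print(example.is_complete())  # True; blank out any element and this returns False
```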

51 Geoffrey L. Herrera, Technology and International Transformation: The Railroad, the Atom Bomb, and the Politics of Technological Change (Albany: State University of New York Press 2006).

52 Julian Jang-Jaccard and Surya Nepal, ‘A Survey of Emerging Threats in Cybersecurity’, Journal of Computer and System Sciences 80/5 (2014), 973–993.

53 Ramesh Karri, Jeyavijayan Rajendran, Kurt Rosenfeld, and Mark Tehranipoor, ‘Trustworthy hardware: Identifying and classifying hardware Trojans’, Computer 43/10 (2010), 39–46; Also, as the authors indicate, IT companies often buy untrusted hardware from websites or resellers, and this hardware may contain malicious hardware-based Trojans.

54 Definition adopted from: Jaziar Radianti and Jose J. Gonzalez, ‘Understanding Hidden Information Security Threats: The Vulnerability Black Market’, Proceedings of the 40th Hawaii International Conference on System Sciences, 2007.

55 The most common software vulnerabilities result from software bugs in (i) memory handling, (ii) user input validation, (iii) race conditions and (iv) user access privileges. See: Katrina Tsipenyuk, Brian Chess, and Gary McGraw, ‘Seven pernicious kingdoms: A taxonomy of software security errors’, Security and Privacy 3/6 (2005), 81–84.

56 See JaeSeung Song, Cristian Cadar, and Peter Pietzuch, ‘SYMBEXNET: Testing Network Protocol Implementations with Symbolic Execution and Rule-Based Specifications’, IEEE Transactions on Software Engineering 40/7 (2013), 695–709.

57 Florida Center for Instructional Technology, ‘Chapter 2: What is a Protocol?’, 2013, <http://fcit.usf.edu/network/chap2/chap2.htm>.

58 Jang-Jaccard and Nepal, ‘A survey of emerging threats in cybersecurity’.

59 Adee, ‘The Hunt for the Kill Switch’; This means that when chips are tested, the focus is on how well they perform the functions they are designed for. It is impossible to check for the infinite range of possible issues that are not specified. It is also an incredibly laborious process to test every chip.

60 Song, Cadar, Pietzuch, ‘SYMBEXNET: Testing Network Protocol Implementations with Symbolic Execution and Rule-Based Specifications’.

61 Gedare Bloom, Eugen Leontie, Bhagirath Narahari, Rahul Simha, ‘Chapter 12: Hardware and Security: Vulnerabilities and Solutions’, in Sajal K. Das, Krishna Kant and Nan Zhang (eds.), Handbook on Securing Cyber-Physical Critical Infrastructure (Waltham: Morgan Kaufmann 2012).

62 Schneier, ‘Crypto-Gram’; The Economist, ‘It’s about time: Escalating cyber-attacks’, Feb. 2014, <http://www.economist.com/blogs/babbage/2014/02/escalating-cyber-attacks>.

63 Kevin Mitnick, The Art of Deception (Hoboken: John Wiley & Sons 2002), paraphrased from the introduction; also see: Kevin Mitnick and William L. Simon, The Art of Intrusion: The Real Stories Behind the Exploits of Hackers, Intruders, & Deceivers (Indianapolis: Wiley 2005).

64 As ‘the Grugq’, a well-known security researcher/hacker, writes: ‘Give a man an 0day and he’ll have access for a day, teach a man to phish and he’ll have access for life’. The Grugq, ‘Twitter’, 2016, <https://twitter.com/thegrugq>.

65 For an interesting recent analysis aiming to establish a rigorous data-driven approach see: V. S. Subrahmanian, Michael Ovelgönne, Tudor Dumitras, and B. Aditya Prakash, ‘Chapter 4: The Global Cyber-Vulnerability Report’, in Terrorism, Security and Computation (New York: Springer 2015).

66 Indeed, public websites are considered to be the low-hanging fruit as they generally run on generic server software and are connected to the Internet; even relatively unskilled individuals can launch a website defacement attack. See: Symantec Corporation, ‘Internet Security Threat Report 2014’, 2014, <http://www.symantec.com/content/en/us/enterprise/other_resources/b-istr_main_report_v19_21291018.en-us.pdf>.

67 Lucas Kello, ‘Cyber Disorders: Rivalry and Conflict in a Global Information Age’, Presentation, International Security Program Seminar Series, Belfer Center for Science and International Affairs, Harvard Kennedy School, May 2012, <http://belfercenter.hks.harvard.edu/files/kello-isp-cyber-disorders.pdf>.

68 Owens, Dam, Lin, ‘Technology, Policy, Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities’.

69 Ibid.

70 Ibid.

71 After all, the indirect path might lead defending actors to mistake a kinetic attack for a cyberattack, or accidental for purposeful harm. Part of the reason why the cyber revolution is a bone of contention is the indirect path by which a cyberweapon potentially causes harm or damage. As Rid writes, ‘the actual use of cyber force is likely to be a far more complex and mediated sequence of causes and consequences that ultimately result in violence and casualties’. In those Cassandra-esque scenarios in which a cyberweapon inflicts a lot of material damage, or in which people suffer serious injuries or are killed, ‘the causal chain that links somebody pushing a button to somebody else being hurt is mediated, delayed and permeated by chance and friction’. Thomas Rid, ‘Cyber War Will Not Take Place’, Journal of Strategic Studies 35/1 (2012), 5–32, 9.

72 Kim Zetter, ‘Hacking Team Leak shows How Secretive Zero-Day Exploit Sales Work’, Wired (Jul. 2015), <http://www.wired.com/2015/07/hacking-team-leak-shows-secretive-zero-day-exploit-sales-work/>; Andy Greenberg, ‘Shopping for Zero-Days: A Price List For Hackers’ Secret Software Exploits’, Forbes Magazine, Mar. 2012, <http://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/>.

73 Joseph Menn, ‘Special Report: U.S. cyberwar strategy stokes fear of blowback’, Reuters, May 2013, <http://www.reuters.com/article/us-usa-cyberweapons-specialreport-idUSBRE9490EL20130510>.

74 David E. Sanger, ‘Obama Order Sped Up Wave of Cyberattacks Against Iran’, The New York Times, Jun. 2012, <http://www.nytimes.com/2012/06/01/world/middleeast/obama-ordered-wave-of-cyberattacks-against-iran.html?_r=0>.

75 David E. Sanger, Confront and Conceal: Obama’s Secret Wars and Surprising use of American Power (New York: Crown Publishing 2012), 197.

76 Ibid., 198.

77 One can, for example, think of polymorphic malware, binary archives or domain generation algorithm techniques. For a more detailed discussion, see: Fortinet, ‘Head-First into the Sandbox’, 2014, <https://www.fortinet.com/sites/default/files/whitepapers/Head_First_into_the_Sandbox.pdf>.
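To illustrate one of these techniques, the following is a minimal, non-operational sketch of how a domain generation algorithm works in principle: both operator and implant derive the same daily list of pseudo-random rendezvous domains from a shared seed, so blocking any single hard-coded domain does not sever command-and-control. The seed and domain suffix below are hypothetical.

```python
import hashlib
from datetime import date

def generate_domains(seed: str, day: date, count: int = 5) -> list[str]:
    """Derive a deterministic, pseudo-random list of domain names for a given day."""
    domains = []
    for i in range(count):
        material = f"{seed}-{day.isoformat()}-{i}".encode()
        digest = hashlib.sha256(material).hexdigest()
        # use the first 12 hex characters as a throwaway domain label
        domains.append(digest[:12] + ".example.com")
    return domains

# Both sides compute the same list for a given day; defenders must predict
# or reverse-engineer the algorithm to block the domains in advance.
print(generate_domains("hypothetical-campaign-seed", date(2016, 1, 1)))
```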

78 See, for example, Animal Farm, which targeted around 3001–5000 systems. Kaspersky Lab’s Global Research & Analysis Team, ‘Animals in the APT Farm’, Securelist, Mar. 2015, <https://securelist.com/blog/research/69114/animals-in-the-apt-farm/>.

79 For sparingly used espionage capacities see CozyDuke, Wild Neutron, miniFlame, Regin, and SabPub.

80 Dan Goodin, ‘How “omnipotent” hackers tied to NSA hid for 14 years – and were found at last’, Ars Technica (16 Feb. 2015), <http://arstechnica.com/security/2015/02/how-omnipotent-hackers-tied-to-the-nsa-hid-for-14-years-and-were-found-at-last/>; Kaspersky Lab’s Global Research & Analysis Team, ‘Houston, we have a problem’, SecureList, Feb. 2015, <https://securelist.com/blog/research/68750/equation-the-death-star-of-malware-galaxy>.

81 David Gilbert, ‘Equation Group: Meet the NSA “gods of cyber espionage”’, International Business Times, Feb. 2015, <http://www.ibtimes.co.uk/equation-group-meet-nsa-gods-cyber-espionage-1488327>.

82 Kaspersky Lab, ‘Equation Group, Questions and Answers’, Feb. 2015, <https://securelist.com/files/2015/02/Equation_group_questions_and_answers.pdf>; The exact number of victims is difficult to establish due to the self-destruct mechanism built into the capability.

83 Eric Byres, Andrew Ginter, and Joel Langill, ‘How Stuxnet Spreads – A Study of Infection Paths in Best Practice Systems’, Feb. 2011, <http://www.abterra.ca/papers/how-stuxnet-spreads.pdf>.

84 Ben Flanagan, ‘Former CIA chief speaks out on Iran Stuxnet attack’, The National, Dec. 2011, <http://www.thenational.ae/business/industry-insights/technology/former-cia-chief-speaks-out-on-iran-stuxnet-attack>.

85 A quote from an interview with a hacker illustrates the point. ‘The Grugq’ explains why he has no contracts with the Russian Government or other Russian actors: ‘Selling a bug to the Russian mafia guarantees it will be dead in no time, and they pay very little money’. He continues: ‘Russia is flooded with criminals. They monetize exploits in the most brutal and mediocre way possible, and they cheat each other heavily’. See: Andy Greenberg, ‘Shopping For Zero-Days: A Price List For Hackers’ Secret Software Exploits’, Mar. 2012, <http://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/>.

86 Kaspersky Lab’s Global Research & Analysis Team, ‘The Mystery of Duqu 2.0: a sophisticated cyberespionage actor returns’, Jun. 2015, Securelist, <https://securelist.com/blog/research/70504/the-mystery-of-duqu-2–0-a-sophisticated-cyberespionage-actor-returns/>.

87 Bruce Schneier, ‘How the NSA Attacks Tor/Firefox Users With QUANTUM and FOXACID’, Schneier on Security, Oct. 2013, <https://www.schneier.com/blog/archives/2013/10/how_the_nsa_att.html>.

88 Ibid.

89 Baldwin, ‘The Concept of Security’,  20.

90 Ibid., 21.

91 Graham T. Allison and Philip Zelikow, Essence of Decision: Explaining the Cuban Missile Crisis (Pearson Education 1999); James G. March and Herbert A. Simon, Organisations (New York: John Wiley and Sons 1958).

92 In recent years a shift has slowly occurred, however. A report from McKinsey indicates that cybersecurity has now become much more of a ‘CEO-level issue’. Tucker Bailey, James Kaplan, and Chris Rezek, ‘Why senior leaders are the front line against cyberattacks’, McKinsey Insights, Jun. 2014, <http://www.mckinsey.com/insights/business_technology/why_senior_leaders_are_the_front_line_against_cyberattacks>.

93 See, for example: RSA, ‘Cybersecurity Poverty Index’, 2015, <https://www.emc.com/collateral/ebook/rsa-cybersecurity-poverty-index-ebook.pdf>; Booz Allen Hamilton and The Economist Intelligence Unit, ‘Cyber Power Index: Findings and Methodology’, 2011, <http://www.boozallen.com/media/file/Cyber_Power_Index_Findings_and_Methodology.pdf>; United Nations Institute for Disarmament Research, ‘The Cyber Index: International Security Trends and Realities’, United Nations Publications, 2013, <http://www.unidir.org/files/publications/pdfs/cyber-index-2013-en-463.pdf>.

94 Inherently, what is a ‘curse’ to the offensive actor might be a ‘blessing’ to the defender.

95 This dynamic might even exist within a state as government institutions, with different organisational missions, might be developing separate offensive cyber capability programs.

96 According to Verizon’s 2015 Data Breach Investigations Report, more than 317 million new pieces of malicious software were created last year. That is about ten new pieces of malware each second of every day. Verizon, ‘Data Breach Investigations Report’, 2015, <http://www.verizonenterprise.com/DBIR/>.

97 Boldizsár Bencsáth, ‘Duqu, Flame, Gauss: Followers of Stuxnet’, RSA Conference Europe 2012, <http://www.rsaconference.com/writable/presentations/file_upload/br-208_bencsath.pdf>.

98 In the case of Fanny and Stuxnet see: Kaspersky Lab’s Global Research & Analysis Team, ‘A Fanny Equation: “I am your father, Stuxnet”’, Securelist, Feb. 2015, <https://securelist.com/blog/research/68787/a-fanny-equation-i-am-your-father-stuxnet/>.

Additional information

Notes on contributors

Max Smeets

Max Smeets is a lecturer in Politics at Keble College, University of Oxford, and a DPhil candidate in International Relations at St. John’s College, University of Oxford. He was previously a visiting research scholar at Columbia University SIPA and Sciences Po CERI. Max’s current research focuses on the proliferation of cyberweapons. More at: http://maxsmeets.com
