Review Article

Cybersecurity For Defense Economists

Pages 705-725 | Received 05 Jul 2022, Accepted 18 Oct 2022, Published online: 04 Nov 2022

ABSTRACT

Cybersecurity plays a role in national security. This study introduces cybersecurity concepts in ways familiar to defense economists and identifies parallel methods of analysis in the fields. The theoretical tools of both fields include microeconomics and game theory. These tools enable analyses of phenomena present in both milieus: public goods, externalities, commons, incentives, interdependent security, platform economics, and inefficiency of decentralized decision making. Additional topics include cyber war, cyberterrorism, deterrence and disinformation in cyberspace, price of anarchy, and economics of cryptography.

Some of the challenges blocking the way to cyber peace are technical, but most, at their heart, are economic. – Clarke and Knake (Citation2019)

Introduction

Cybersecurity plays a role in national security. It is an area of defense economics expected to grow in coming years (Gaibulloev, Kollias, and Solomon Citation2020). Defense economics is a subfield of public economics and, as it turns out, so is cybersecurity. Information security has both private and public good characteristics (Bauer and Van Eeten Citation2011). The CIA properties of information systems (confidentiality, integrity, and availability) are public goods for authorized users.Footnote1 Indeed, owing to considerations such as market structure, information asymmetries, incentives, and externalities, the economic context makes cybersecurity hard (Anderson Citation2001). This study introduces cybersecurity concepts in ways familiar to defense economists and identifies parallel methods of analysis in the fields.

From here on I consider cybersecurity to be synonymous with information security as it pertains to the protection of information and communication technology (ICT) and interactions of adversaries in the cyber domain battlespace (cyberspace).Footnote2 Rick Howard’s (Citation2020) definition of cybersecurity is used throughout:

Cybersecurity reduces the probability of material impact to an individual or organization due to a cyber event.

Howard emphasizes this definition encompasses (i) preventing cyber campaigns, not just breaches; (ii) reducing the probability of occurrence, not eliminating it; and (iii) requiring the consequences to be avoided to be material, i.e. sufficiently disruptive rather than merely embarrassing. While terrorism studies usually begin with a definition of terrorism, this is not often the case for cybersecurity. Howard’s definition has the advantage that it stems from industry first principles based on the objectives of working cybersecurity professionals and cybersecurity researchers. For example, it agrees with Littlewood et al.’s (Citation1993) early appeal for a ‘probability-based framework for operational security measurement.’ By contrast, a standard cybersecurity metric is the Common Vulnerability Scoring System (CVSS), which ranges from zero (none) to 10 (critical).

Part of the rationale for focusing on campaigns rather than breaches is the existence of advanced persistent threats (APTs) in cyberspace. APTs are nation-state threat actors possessing the resources and specialized skills to tailor methods for patiently probing specific targets for weaknesses; to gain and maintain unauthorized access to systems in order to return at a later date; and to compromise cryptographic protocols.Footnote3 Security analyst Shawn Carpenter at Sandia National Laboratories coined the term APT upon discovering China’s ‘Titan Rain’ campaign of espionage, reconnaissance, and exfiltration of data of targeted nations and private enterprises over multiple years during the mid-2000s (Rose Citation2013; DiMaggio Citation2022).

Nations directing APTs include the Five Eyes Group (Australia, Canada, New Zealand, UK, and U.S.), China, Colombia, France, Georgia, India, Iran, Israel, North Korea, Pakistan, Russia, South Korea, Turkey, and Vietnam. Just as one nation’s terrorists may be another nation’s freedom fighters, individual nations on this list are unlikely to agree they or their allies sponsor APTs. Lemay et al. (Citation2018), Crowdstrike (Citation2022), and DiMaggio (Citation2022) provide surveys of these nations’ APT activities. APT targets not only include government entities and critical infrastructure, but also commercial firms. For example, Lockheed formulated its kill chain cybersecurity method in response to an APT intrusion, and another APT conducted the massive 2017 Equifax breach. In these cases, the suspected Chinese APTs obtained intellectual property (IP) and personally identifiable information (PII). In addition to APTs, malicious actors in cyberspace include terrorists, industrial spies, (organized) eCrime, insiders, hacktivists, and hackers. From this perspective, cybersecurity tactics involve defensive measures to reduce the probability of occurrence of a material event or offensive actions to increase its likelihood.

Anderson and Moore (Citation2006) make an early case for game theory and microeconomic theory being just as important as the mathematics of cryptography to security engineers. This paper presents commonalities in the uses of game-theoretic and microeconomic tools in cybersecurity and defense economics. The second section begins with a discussion of cyberspace and modes of conflict within it. This includes cyber war, cyberterrorism, deterrence, and disinformation in cyberspace. Given the presence of externalities, market failures also occur in cyberspace. The public nature of cybersecurity is a theme of the third section, as is the microeconomics of cybersecurity investment. Since much of cyberspace is organized around markets, I also discuss how (platform) market structure matters for cybersecurity. The fourth section covers the game theoretics of cybersecurity, including interdependent security and the price of anarchy. The economic implications of cryptography receive attention in the fifth section. The final section includes brief concluding remarks and suggestions for future research.

Cyber Conflict

In this section I discuss environments where cyber conflict takes place and four modes of cyber conflict: cyber war, cyberterrorism, deterrence, and disinformation.

Battlespace

‘Cyber-’ is a prefix standing for computer and electronic spectrum-related activities (Lehto Citation2018). Cyberspace is the newest addition to the traditional battlespaces (domains) of land, sea, air, and space. U.S. National Security Presidential Directive 54 (2008) (Homeland Security Presidential Directive 23 (2008)) defines cyberspace as

the interdependent network of information technology infrastructures, and includes the Internet, telecommunications networks, computer systems, and embedded processors and controllers in critical industries.Footnote4

By contrast, the Internet is a global network of computer networks. The World Wide Web is an example of an Internet service. Blockchains can also operate on the Internet while bypassing the World Wide Web (Mougayar Citation2016). Hence, cyber events need not occur on the World Wide Web or the Internet. Stuxnet (Operation Olympic Games) is perhaps the most famous example, sabotaging Iranian nuclear centrifuges within an air-gapped closed network (Chien Citation2010; Chen Citation2010; Zetter Citation2011). The use of nuclear weapons to affect the electromagnetic spectrum also fits within this battlespace.

The cyberspace battlespace consists of a system of networks, electronic devices, electromagnetic spectrum, and other physical bases (Chen, Ma, and Wei Citation2013). Cyberspace is concerned with the creation, collection, transformation, communication, and use of information. The physical dimension includes hardware and infrastructure. The network dimension has an ecology with topological and spatial characteristics facilitating the ability to perform actions from a distance, thereby relaxing physical constraints. The social dimension involves heterogeneous users and organizations. Finally, the time dimension of cyberspace can be instantaneous when autonomous systems are involved; facilitates real-time coordination of assets; and is also evolutionary in that hardware, software, and network structures are constantly reconfiguring, evolving, and growing. Cyberspace is decentralized, dispersed, and dynamic.

As is the case for the other four battlespaces, cyberspace includes physical entities; but unlike the other battlespaces, the domain is man-made. It is also the domain all other domains depend upon. Physical effects on the battlespace include

  • Infiltration and intrusion.

  • Reconnaissance of networks and systems.

  • Communications disruption.

  • Isolating targets from accessing cyberspace or power grids.

  • Multiple simultaneous operations.

At the same time, the effects of cyberattacks are often not physical, involving

  • Denial of service (DoS).

  • Deleted, manipulated, ransomed, or stolen data.

  • Election interference and political hacking.

  • Intellectual property theft.

  • Dissemination of (mis)information and narrative manipulation.

  • Espionage and surveillance.

  • Cyber blockades of financial channels.

Consequently, cyberspace facilitates influencing an opponent’s environment by using information as a weapon rather than military force (Deakin Citation2010). Moreover, what many cyber tactics have in common is that they involve 0s and 1s (e.g. programming code and data). As it is rarely the case that someone puts their life on the line to write code, manipulate data, or troll social networks, this involves a new kind of warrior.

The structure of networks makes them susceptible to indirect infiltration, propagation that may lead to class breaks, and unintended spillovers. The most prevalent method of indirect infiltration is social engineering: actions influencing a person to take an action that may or may not be in their best interest (Hadnagy Citation2018). Malicious social engineering is hacking humans in order to gain access or to gather credentials or intelligence for further attacks. The baseline example of social engineering is phishing through emails. Vishing is phishing over the phone (voice). Smishing is text-based (SMS) phishing. Social engineering can also involve physically impersonating a trusted individual such as an authority figure, law enforcement, or IT staff. Online catfishing is an example of virtual impersonation. At the other end of the spectrum is remote code execution, where malicious actors load malware at arm’s length. An example is the 2021 Log4Shell vulnerability in Apache’s open-source Log4j logging library: a malicious actor submits input containing a URL, and when Log4j writes the input to its log it resolves the URL and executes the malicious code hosted there. An example of a class break is the National Security Agency’s EternalBlue exploit, allowing the NSA to obtain information from any Windows 8 or older machine. The Shadow Brokers group illegally obtained the code, encrypted it, and attempted to sell the decryption key via an all-pay Bitcoin auction. Unable to sell the key, the group publicly released it, allowing anyone to decrypt the exploit. Subsequently, the APT ransomware campaigns WannaCry and NotPetya used EternalBlue code to propagate globally across Windows machines lacking the EternalBlue patch.Footnote5

Finally, the collection of potential areas of vulnerability of an entity in cyberspace is known as the attack surface. An attack surface represents any known, unknown, or potential vulnerabilities across certain main areas of exposure – software, hardware, network, and users (Ayrour, Raji, and Nassar Citation2018). Previously unknown vulnerabilities are called zero-days. To date there is no canonical definition of attack surface, nor agreement that the concept is limited to software or includes hardware and users (Theisen et al. Citation2018). These authors suggest researchers and practitioners choose a definition closely matching their domain of analysis. An appropriate definition for the defense economics domain is

The union of an ICT system’s access points, components, resources, and users vulnerable to material impact by unauthorized malicious actors.

This union includes a system’s architecture and components, channels, code, controls, data, interfaces, platforms, and protocols. The attack surface accounts for all possible ways a malicious actor can attack a system. Traditionally, defenders seek to reduce the attack surface via defense-in-depth (multiple layers of security) and prepare kill chains for potential intrusions (Hutchins, Cloppert, and Amin Citation2011). Indeed, preparing kill chains for the lifecycle of known Chinese and Russian APTs is becoming widespread as their tactics, techniques, and procedures (TTPs) are documented.Footnote6 More recently, the zero-trust cybersecurity paradigm requires every device, user, and network flow to be authenticated and authorized irrespective of location because the network itself is assumed to be hostile. The associated control plane turns defense-in-depth inside out in that, once a request is authorized, the data plane is dynamically configured to accept traffic from that client (user and device) and only that client (Gilman and Barth Citation2017). It can be operationalized using a ‘software defined perimeter,’ where access is restricted on a need-to-know basis for every verified, validated, and authenticated user, device, and application.
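
To make the per-request logic concrete, below is a minimal sketch of a zero-trust authorization check in Python. The names (Request, TRUSTED_DEVICES, ENTITLEMENTS) are hypothetical illustrations, not part of any standard; the point is that location confers no trust and every client must be authenticated and authorized on a need-to-know basis.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    user: str
    device_id: str
    resource: str

# Hypothetical control-plane state: enrolled devices and need-to-know entitlements.
TRUSTED_DEVICES = {'laptop-7f3a'}
ENTITLEMENTS = {('alice', 'payroll-db')}

def authorize(req: Request, authenticated: bool) -> bool:
    """Zero trust: the network is assumed hostile, so location is irrelevant.
    A request is admitted only with an authenticated user AND an enrolled
    device AND a need-to-know entitlement for the requested resource."""
    return (
        authenticated
        and req.device_id in TRUSTED_DEVICES
        and (req.user, req.resource) in ENTITLEMENTS
    )

req = Request(user='alice', device_id='laptop-7f3a', resource='payroll-db')
print(authorize(req, authenticated=True))   # True: data plane accepts this client only
print(authorize(req, authenticated=False))  # False: no implicit trust from location
```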

Cyber War

Stiennon (Citation2015) channels Clausewitz in providing the following definition of cyber warfare:

Cyber warfare is an extension of policy by actions taken in cyberspace by state actors (or by non-state actors with significant state direction or support) that constitute a serious threat to another state’s security, or an action of the same nature taken in response to a serious threat to a state’s security (actual or perceived).

The advent of cyber war is generally attributed to distributed denial of service (DDoS) attacks via botnets by Russia’s Main Intelligence Directorate (GRU) on critical infrastructure in Estonia and Georgia in 2007 and 2008, while 2010’s Stuxnet is regarded as the first cyber weapon.Footnote7 This is consistent with Arquilla and Ronfeldt’s (Citation1993) argument that the information revolution facilitates turning knowledge into capability by knowing everything about an adversary while keeping the adversary from knowing much about oneself, with the result being networks fighting networks.

Yet these attacks immediately produced exceptions to the untested conventional wisdom on the nature of cyber war. For example, cyber war is often couched as asymmetric, whereby cyber prowess can counterbalance military inferiority. Table 1 lists APT-sponsoring nations alongside their rank in total military expenditures. With the exception of Georgia, the list is top-heavy. The low cost and commercial availability of cyberphysical resources does not sufficiently level the playing field because exploits such as zero-days require significant organizational capacity, dedication, skills, experience, and ingenuity, or enough cryptocurrency to buy zero-days on criminal-to-criminal (C2C) markets, as some large eCrime ransomware outfits do. Profit motive need not be a sufficient substitute; the mission-driven NSA is known to generate cryptographic techniques long before commercial enterprises and academia.Footnote8 At the same time, an unsophisticated cyberattack can be effective if targets are susceptible to n-days rather than zero-days (e.g. NotPetya on outdated Windows XP and Vista PCs, or WannaCry on PCs lacking name-brand antivirus software).

Table 1. APT sponsors by total military expenditure.

Moreover, the fog of attribution in cyberspace was a mirage. This is partially because of the political motivation of the attacks. The 2007 attack in Estonia coincides with the controversial relocation of a Russian Red Army World War II monument. GRU attacks on Georgia coincide with hostilities with Russia over the status of two northern regions, and Georgian efforts to connect to the Internet through Europe rather than Russia. Political context matters. Indeed, the GRU’s cyber target selection is consistent with past political and information war doctrines of monitoring, neutralizing, and countering actions endangering Russian military security (Booz Allen Hamilton Citation2020).

Apart from the battlespace, cyber war has several additional novel attributes. First, intelligence agencies are major actors in implementing cyber war hostilities. This is in addition to the privileged role cyberspace plays in modern military intelligence (Gilad, Pech, and Tishler Citation2021). A partial reason is cyber war and its doctrines originate from information warfare, which is the provenance of intelligence agencies. Following Lindsay and Gartzke (Citation2022), cyber war offers unprecedented opportunities for secrecy, deception, obfuscation, manipulation, and subversion. It is an intelligence-counterintelligence contest, conducted with new means at a distance on a global scale.

Second, much debate exists about whether Russia and China explicitly coordinate with organized eCrime as force multipliers. For example, the 2021 ransomware hack on Colonial Pipeline that crippled fuel delivery in the southeastern U.S. is attributed to Russian eCrime group Darkside (rebranded as BlackMatter). The episode further popularized the notion of privateering in cyberspace, whereby eCrime organizations refrain from activity against entities within their haven nation. Moreover, the invasion of Ukraine spurred coordination between Russian APTs and eCrime. The analogy does not quite fit, however, as the haven nation’s sovereignty precludes prosecution, whereas privateers were accessible to enemy retribution on the high seas. In addition, states acknowledged privateers, whereas states maintain plausibly deniable associations with eCrime surrogates such as the Russian Business Network’s activities in Estonia. These relationships are more akin to the paid riding in haven nations such as Cyprus, France, Greece, and Italy that pursued accommodation with terrorist groups and looked the other way with respect to their activities abroad (Lee Citation1988; Lee and Sandler Citation1989). The understanding of paid riding is an area where defense economics can contribute to cybersecurity.

Third, from a cost and technical perspective, offensive tactics have a decided advantage in cyber war. The attack surface is multifaceted and difficult to defend. APTs stockpile zero-days and are quick to take advantage of them once known (as substitutes for attributable TTPs). The Shadow Brokers hacked into a rich cache of NSA zero-days. The supply-chain compromise of network-management software supplier SolarWinds, discovered in late 2020, is another example. While the hack is attributed to a Russian APT, Chinese APTs were quick to take advantage. Moreover, both Chinese and Russian APTs took advantage of Apache’s Log4Shell vulnerability upon its discovery and publication by an Alibaba Cloud analyst. The Chinese government took the further step of fining Alibaba Cloud for notifying Apache about the vulnerability. Consequently, several months later when a subsidiary of Alibaba Cloud found a remote code execution vulnerability (Spring4Shell) in the Java-based Spring framework, it gave no public notice. Instead, knowledge of the vulnerability came from observing the patching behavior of Chinese firms.

Fourth, the presence of APTs means (low-level) cyber war is a daily experience for targets. In this respect cyberspace differs from other battlespaces. For example, cyber war is undeclared. Moreover, some APTs distinguish between military, government, and private sector targets while others do not. Cyber war is a form of peacetime hostilities designed to avoid military conflict while conducting reconnaissance and influencing sociopolitical tensions. Yet Clarke and Knake (Citation2019) opine that the recent and current levels, pace, and scope of disruptive activity in cyberspace by military units (APTs) of several nations are unprecedented, dangerous, and unsustainable in ‘peacetime.’

Lastly, the cyber component of Russia’s 2022 invasion of Ukraine involves new ad-hoc institutional arrangements. Russia coordinates with organized eCrime and nominal hacktivist groups as force multipliers. The Ukrainian ‘IT Army’ is a decentralized informal alliance that is both military and civilian, public and private, domestic and international (Soesanto Citation2022). It involves large public firms such as Google, winner of the inaugural Ukrainian Peace Prize, and individual participants in DDoS attacks on Russian targets coordinated through the IT Army’s Telegram channel. It remains to be seen whether this accounts for the initial lack of success of Russian cyber offensives against Ukraine, or whether Russia instead pulled its punches, either in expectation of a quick victory and subsequent use of Ukraine’s critical infrastructure for its own purposes, or in recognition of APTs embedded within Russian critical infrastructure and the threat of reprisals. What is true is that alliances need no longer be based on shared geography but rather on shared public opinion, which is both ethereal and manipulable. Nations now need plans to marshal similar support in the future via cyber calls to arms or to counter such support on the part of potential adversaries. An early example is U.S. Senator Joe Lieberman’s 2010 call for Amazon to stop hosting WikiLeaks and for credit card companies to stop processing WikiLeaks donations. It also raises ethical issues as to whether offensive nonmilitary participation in a cyber war can be justified on the basis of public opinion and whether participants are exposing themselves to retribution. To wit, one country’s cyber mercenaries are another country’s volunteer IT Army. This requires a new set of international norms for war.

Cyberterrorism

In its infancy, the terrorism literature was dominated by articles defining terrorism. A unifying theme among them is the requisite use of violence or threat of violence in the form of physical harm to noncombatants or critical infrastructure. This poses a problem for the consideration and definition of cyberterrorism because even acts of cyber war need not involve violence. Consequently, much of the cyberterrorism literature remains mired in the definitional debate. The most widely-cited definition of cyberterrorism is Denning (Citation2000):

Unlawful attacks and threats of attack against computers, networks, and the information stored therein when done to intimidate or coerce a government or its people in furtherance of political or social objectives. Further, to qualify as cyberterrorism, an attack should result in violence against persons or property, or at least cause enough harm to generate fear. Attacks that lead to death or bodily injury, explosions, plane crashes, water contamination, or severe economic loss would be examples. Attacks that disrupt nonessential services or that are mainly a costly nuisance would not.

Gordon and Ford (Citation2002) add targets alone should not be the qualifier putting the ‘cyber’ in cyberterrorism; abilities and tools of the virtual world need to be leveraged as part of the attack or threat. For this reason, online training for conventional terror attacks does not qualify. Retired FBI agent William Tafoya (Citation2011) sees the delimiters between cyber warfare and cyberterrorism as context and targets, not technological tools or frequency of attacks: ‘cyberterrorism indiscriminately targets critical infrastructure and civilians – the innocent.’

Defining cyberterrorism is both a moot point and problematic. As of 2015, there was no recorded instance of cyberterrorism in the U.S. (Klein Citation2015). In her testimony, Denning (Citation2000) acknowledges she cannot identify any instance that fits the bill. Gordon and Ford (Citation2002) do not provide any examples. Nor are any examples given in a recent survey of definitions of cyberterrorism (Plotnick and Slay Citation2021). Alternatively, it may be that there is too much grey area relative to cyber war, cybercrime, and hacktivism. For example, hacktivists often have political, religious, or other non-monetary motivations (DiMaggio Citation2022). Nevertheless, the U.S. Treasury maintains the Terrorism Risk Insurance Act (TRIA) applies to cyber liability (U.S. Treasury Department Citation2016). Most standalone cyber insurance policies have a clause indicating the policy covers incidents caused by cyberterrorism. The lack of consensus on a definition of cyberterrorism, and of clear examples, is therefore problematic for practical reasons.

Deterrence

Defense economists differentiate between deterrence by denial via defensive measures that increase the cost of an attack or defeat an attack versus deterrence by the credible threat of punishment. Both forms of deterrence ultimately target an adversary’s cost/benefit calculus. Cyberspace adds three variations to classic deterrence theory: (i) the inability to distinguish between defensive actions to preserve the status quo versus offensive actions to marginally adjust the distribution of power because similar methods can be used both offensively and defensively; (ii) the attribution problem; and (iii) the constant state of engagement.

The extant literature primarily focuses on the attribution problem. For example,

North Korea likely views cyber as a cost-effective, asymmetric, deniable tool it can employ with little risk from reprisal attacks, in part because its networks are largely separated from the Internet and any disruption of Internet access would have minimal impact on its economy (Lundbohm Citation2017).

Is the attribution problem all that bad? The concept of plausible deniability predates cyberspace. From a technical perspective, Bitcoin was viewed as anonymous until academic researchers Meiklejohn et al. (Citation2013) proved otherwise. In the same way, attribution in cyberspace is not limited to the public sector; academics and cybersecurity firms establish reputations and drum up business via attribution efforts. Owing to more and more data being collected about network activities, attribution is one area of cybersecurity that favors defense (Baker Citation2012). In order to keep its sources and methods proprietary, cybersecurity elements in the U.S. government often pass technical evidence to the Cyber Threat Alliance (CTA) of commercial security practitioners, who confirm attribution using their own product lines. Public non-state attribution enhances credibility.

The political context matters for attribution as well. Little doubt exists that North Korea mounted the 2014 hack on Sony Pictures for its depiction of North Korea’s leader in the film The Interview. Sony subsequently mounted DDoS attacks on websites posting internal documents and forthcoming films procured from the hack.

Consider the economic consequences of placing a premium on not getting caught. The variable costs of one-shot, highly-tailored, difficult-to-attribute exploits must be extremely high because code can be compared for similarities, as can techniques of intrusion and deception. Similarities between Stuxnet, Flame, Duqu, Gauss, and Equation tie U.S. APTs with Israeli ones. Cyber tradecraft lends itself to attribution.

By contrast, high-value targets have an ex-ante incentive to invest heavily in attribution technology, thereby making attribution a fixed cost with respect to an attacker’s deployment decision. One objective of defense in cyberspace is to raise the bar high enough to make attribution more feasible. Gilad, Pech, and Tishler (Citation2021) analyze a game-theoretic model of APTs with traceable attacks and targets whose type depends on their degree of vigilance. This restores some of the offense-defense balance in cyberspace (Lindsay Citation2015).Footnote9 Classic deterrence issues (cost of retaliation and risks of escalation and blowback) then come into play.

To the extent attribution presents a problem, it also works against deterrence via cyber retaliation because the threat must be visible and the punishment attributable, and both reveal cyber tradecraft, making repeated use difficult. A zero-day that is a secret cannot function as a deterrent, and if it is known it is no longer a zero-day. ‘Name and shame’ is a prevalent but weak form of deterrence. So is criminal prosecution; many foreign nationals on the FBI’s Cyber Most Wanted list are known APT employees, but they will never be prosecuted unless they enter the U.S. At the same time, the FBI’s list and the Justice Department’s indictments of APT employees are coercive in that they signal the U.S.’s ability to trace and attribute cyberattacks conducted against it and its allies (Wilner Citation2020).

Furthermore, cyberspace is perpetually contested on a continuum of actions. Deterrence by denial is a moving target. As for threatening punishment, in contrast to nuclear deterrence, there is no tacitly agreed upon red line of demarcation for the egregiousness of exploits to reach the level of a national security concern. An indicator of the difficulty of deterrence-by-punishment is the 2014 U.S. Defense Science Board’s suggestion that the term be replaced by deterrence-by-cost-imposition, which can include kinetic/military, diplomatic, economic, law enforcement, informational, and digital forms of response to a cyberattack (Wilner Citation2020). Hence, cyber deterrence-by-cost-imposition requires a menu of consequences not limited to the cyber domain. An example is ‘lawfare,’ a hybrid of law and warfare, such as the U.S. Department of Justice’s authorization of the preemptive takedown of GRU APT Sandworm’s ‘Cyclops Blink’ two-tiered botnet in April 2022.

Disinformation in Cyberspace

Disinformation is a term invented by the KGB (dezinformatsiya) as a more intriguing substitute for what practitioners call ‘active measures.’ Following Whaley’s (Citation1982) classic taxonomy, disinformation involves hiding the real and showing the false. It is a form of offensive counterintelligence via deception and neutralization in order to gaslight an adversary or create further fractures in existing divisions. By contrast, the CIA triad (confidentiality, integrity, availability) is not a particularly useful framework for addressing the effects of disinformation (Stewart Citation2021). Rid (Citation2020) argues the digital revolution has fundamentally altered the disinformation game, making active measures more active and less measured; more scalable, harder to predict, and harder to control once launched. That is, the digital revolution facilitates netwar in the sense of Arquilla and Ronfeldt (Citation1993).

Cyberspace further activates deception owing to

  • Its viral and vast reach.

  • Social media is the most commonly used channel for disseminating disinformation.

  • Both progressive and conservative news consumers rely more on like-minded and polarizing social media rather than traditional journalism outlets, increasing disinformation’s sway.

  • Legitimate social media platforms and ISPs are often protected from legal liability stemming from content posted by users (e.g. section 230 of the U.S. Telecommunications Act).

  • Algorithmically-catered social media feeds create echo chambers where truth is a matter of social consensus rather than based on facts and evidence.

  • Digital deception lacks physical cues for cognitive detection by humans.

  • Digital deception can also affect machines (e.g. Stuxnet, Internet of Things).

Cyberspace further activates neutralization owing to

  • Broad-based Internet censorship via blackouts, throttling, and nationwide firewalls.

  • Censorship is unnecessary if the truth can be obviated via the five Ds: dismiss the critic, distort the facts, distract from the main issue, dismay the audience, and divide and conquer.

  • The onion router (TOR) provides platforms for anonymous dissent to circumvent censorship.

Whaley’s (Citation1982) taxonomy of deception requires extension to include examples of showing the real and hiding the false in addition to hiding the real and showing the false. Indeed, the CIA discovered disinformation works best when phony outlets carry factual content – when the source is fake but the content accurate (Rid Citation2020). On the other hand, a honeypot (fake digital asset) is a cyber example of hiding the false in order to convince malicious actors of its legitimacy. Table 2 presents a categorization of disinformation, including historical examples and recent ones from the information age. A main takeaway is that the half-life of secrets is declining precipitously.

Table 2. Disinformation active measures.

Microeconomics of Cybersecurity

Three primary applications of microeconomics to cybersecurity are information and communication technology (ICT) platforms, the public-goods nature of cybersecurity and cyberspace, and cost/benefit analyses of cybersecurity investments.

ICT Platforms

The private sector owns and operates much of cyberspace. Cyberspace is also used to control and communicate with critical infrastructure, which again is mostly owned and operated by the private sector. Virtual vulnerabilities in critical infrastructure have kinetic consequences. Moreover, Moore’s law ensures cyberspace technology advances rapidly in the private sector as compared to glacial procurement and regulatory processes in the public sector. Information systems platforms are therefore an economically important example of a private-sector cybersecurity battlespace.

Information systems platforms are technological architectures or ecosystems connecting users with developers of complementary products or services (Arce Citation2020). ICT platforms reduce search costs for users and increase transaction opportunities for developers. Platforms are also known as two-sided markets. Examples include the Windows operating system for PCs/laptops and Apple’s OS for Macs, along with Android and iOS for tablets and mobile devices. In this case, users buy apps developed for the platform by third parties and the platform itself. Similarly, Intel, ARM, and AMD microprocessors are competing hardware platforms for PC systems. A further example is cloud computing, particularly Software-as-a-Service, in which applications are made available on demand to users via the Internet. As Dunne and Sköns (Citation2021) observe, cloud computing is expected to have a significant impact on the warfighting operations of armed forces thereby making cloud platforms a challenge for the defense industry because their size and influence dwarf the established defense industrial base.

The defining feature of a platform is that at least one side derives a benefit (indirect externality) from the number of participants on the other side. App developers prefer platforms with many users and users prefer platforms with many apps. Platforms can also exhibit same-side (direct or network) externalities when members on one side derive a benefit from the number of participants on the same side. Such network externalities arise when compatibility is an issue and are also the fundamental value derived from social networks. Metcalfe’s Law states the total (system) value of the network externality is proportional to the square of the number of participants, $n^2$. Intuitively, each of the $n$ participants can interact with $n-1$ other participants: $n(n-1) \approx n^2$. As this requires each participant to be equally interested in interacting with every other participant, Briscoe, Odlyzko, and Tilly (Citation2006) instead contend the value scales as $n\log_b(n)$, where the base $b$ falls within the range $1 < b \leq 10$ and is often taken to be $e$ in empirical studies; i.e. total network value is $n\ln n$. Their idea is that in a large network an individual will only interact with a smaller subset, $\log_b n$, of the larger membership, $n$.
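
The two value scalings diverge sharply as networks grow, which a few lines of Python make plain (the constant of proportionality is normalized to one and the figures are purely illustrative):

```python
import math

def metcalfe_value(n: int) -> float:
    """Metcalfe's Law: system value proportional to n*(n-1), approximately n**2."""
    return n * (n - 1)

def bot_value(n: int) -> float:
    """Briscoe-Odlyzko-Tilly: each participant interacts with only ln(n)
    others (base b = e), so system value scales as n*ln(n)."""
    return n * math.log(n)

for n in (10, 1_000, 100_000):
    print(f'n={n:>7,}  Metcalfe={metcalfe_value(n):.3g}  n*ln(n)={bot_value(n):.3g}')
```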

Platforms bring a well-defined structure to the battlespace. The economic structure of platforms has yet to attract requisite attention from cybersecurity researchers. Similarly, the platform economics literature rarely addresses cybersecurity. Exceptions include Sen, Guerin, and Hosanagar (Citation2011), Garcia, Sun, and Shen (Citation2014), Arce (Citation2018), Arce (Citation2020), Jegers and Van Hove (Citation2020), and Sen, Verma, and Heim (Citation2020). These studies facilitate an introduction to results in the vein of the

economic structure ⇔ cybersecurity

paradigm. Economic structure (market structure, information asymmetries, strategic interdependence, (in)direct externalities, etc.) has implications for cybersecurity and cybersecurity has implications for economic structure. An early example is Anderson’s (Citation2001) conjecture that platform monopolies have little incentive to invest in cybersecurity because security gets in developers’ way and platforms need developers in order to overcome the chicken-and-egg problem and obtain a critical mass of users to support the platform and developers. As much of platform economics is steeped in platform-as-monopoly, this perhaps explains the absence of cybersecurity concerns in platform economics.

At the same time, one cannot argue with the reality that most ICT platforms are not monopolies. For example, Musin (Citation2021) derives the Linda index of the concentration of core providers of Infrastructure-as-a-Service in the cloud, showing the market leader (Amazon) cannot be considered a monopoly but four providers (Amazon, Microsoft, Alibaba, and Google) dominate the global market, both in terms of market share and capture of market growth. Indeed, platforms may be the one nonidealized example where the economic construct of monopolistic competition actually applies (Kreps Citation2019). To this end, cybersecurity has been shown to facilitate non-monopolistic platform competition. Arce (Citation2018) derives a platform’s relative market share as equal to the square root of the platform’s relative insecurity. Moreover, not all malicious actors have the resources of APTs. Market share matters because a larger relative market share attracts more attention from malicious actors (Vasek, Wadleigh, and Moore Citation2016; Geer, Jardine, and Leverett Citation2020). In addition, Arce (Citation2020) shows how constraints exist on cloud services providers’ (CSPs’) level of insecurity in order to ensure users do not switch to another CSP. Hence, cybersecurity must meet economic criteria in order to preclude a monopolistic outcome. Specifically, cybersecurity figures into a platform’s no-switching constraint and also facilitates lock-in. Sen, Verma, and Heim (Citation2020) characterize how cybersecurity in software platforms determines whether such markets are competitive or monopolistic.
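
As a worked example of the square-root relationship (the ratio notation here is mine, not Arce’s): if platform 1 is four times as insecure as platform 2, its implied relative market share is the square root of four, i.e. twice that of platform 2.

```python
# Hypothetical illustration of Arce (2018): relative market share equals
# the square root of relative insecurity.
relative_insecurity = 4.0          # platform 1 is 4x as insecure as platform 2
relative_market_share = relative_insecurity ** 0.5
print(relative_market_share)       # 2.0: twice the market share
```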

Public Economics and Cybersecurity

The confidentiality, integrity, and availability of an ICT system are public goods for its users. Similarly, we all benefit when cybersecurity ensures our critical infrastructure functions properly. By contrast, the 2021 cyberattack on an Oldsmar, Florida water treatment plant via remote access to a mouse cursor was a wakeup call for the critical infrastructure community. As another example, CSPs and their users function under a shared security umbrella (Clement and Arce Citation2022). CSP services include infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), and software-as-a-service (SaaS). Note the term ‘platform’ as it pertains to platform economics applies equally well to IaaS and SaaS. CSP and user contributions to security enable users to generate private benefits from CSP services, and CSPs privately benefit from the subscription and on-demand fees they charge users. CSP cybersecurity is therefore an impure public good providing heterogeneous selective incentives for the CSP and its users. Another example is credit cards. Credit cards are an ICT platform and issuers and cardholders alike derive public and private benefits from card security. Yet the costs of card insecurity are asymmetric; cardholders are rarely held to account in the event of false charges or identity theft.

This discussion illustrates why Varian (Citation2004), Powell (Citation2005), and Bauer and Van Eeten (Citation2011) place cybersecurity squarely within the constructs of public economics. In what follows, I discuss cybersecurity within the context of public goods, externalities, and the commons.

With respect to public goods, Varian (Citation2004) cites defense economists Hirshleifer (Citation1983) and Arce and Sandler (Citation2001) when noting the contribution aggregator construct for public goods (summation, best-shot, weakest-link, etc.) facilitates an analysis of free riding within a cybersecurity context. He addresses the Olsonian issue of security provision as the number of users increases – clearly a topic of interest within a network framework – and Hirshleifer’s program of comparing the efficiency of contribution aggregators-cum-network reliability in simultaneous versus sequential games. A novel advance is Varian’s analysis of endogenous adversarial behavior when reliability is determined by defenders’ aggregation technology. Here, Varian finds the larger group of adversaries has the advantage when the aggregator is best-shot and the smaller group has the advantage under weakest-link.

An alternative to viewing cybersecurity as a public good with selective incentives is a Coasian viewpoint of cybersecurity as a private good with externalities. An example of a positive externality is when one’s own security prevents malware propagation. A firewall diverting attacks elsewhere is an example of a negative externality. Another example is the use of simple passwords and shortcuts to bypass the costs associated with IT staff mandates to change passwords frequently and employ two-factor authentication. As is usually the case, the positive or negative externality is not internalized by individuals making cybersecurity decisions.

Land, sea, air, and space have characteristics making them akin to commons. Cyberspace is no exception; it is ubiquitous and yet no single entity can control or dominate it. The many Internet of Things (IoT) devices are a common pool resource for botnets, with the Mirai botnet DDoS attacks being the quintessential example (Adaros Boye, Kearney, and Josephs Citation2018).Footnote10 IoT device infections increased significantly in 2016 owing to Mirai’s compromise of devices running on their vendor-specified default passwords. One potential solution is a social norm whereby IoT users change a device’s password immediately after it is out of the box. Yet given humans’ limited mental storage capacity, Preibusch and Bonneau (Citation2010) argue increasing demands for passwords and frequent changing of passwords treat users’ mental storage capacity as a commons, thereby leading to weak passwords and sharing of passwords across devices, sites, and apps. This tragedy of the commons decreases cybersecurity. Finally, Herley and Florêncio (Citation2008) use tragedy of the commons reasoning and modeling to demonstrate phishing for profit (rather than spear phishing for information) cannot possibly earn returns above a phisher’s opportunity cost, and check their findings against publicly available data sources. Consequently, phishing should be regarded as a low-skill low-reward enterprise.Footnote11

Partial Equilibrium Analyses

Just as Becker’s (Citation1962) seminal partial equilibrium analysis lies at the foundation of the economic analysis of law enforcement (and, by extension, counterterrorism), Gordon and Loeb (Citation2002) provide partial equilibrium underpinnings for the economic analysis of cybersecurity investment. Gordon and Loeb introduce the concept of a security breach probability function, whereby the likelihood of a breach is increasing in the vulnerability of an asset and decreasing in cybersecurity investment. For the functional forms considered, they show diminishing returns to protecting highly vulnerable assets work against protecting such assets up to the point of expected loss. Instead, protecting assets with midrange vulnerability is the economically rational course of action. Moreover, any expenditure on cybersecurity should be a fraction of the expected loss. Their analysis has been extended in myriad directions; for example, by considering alternative forms of the breach function (Hausken Citation2006) and security externalities (Gordon et al. Citation2015). Indeed, it has spawned an entire industry of cost/benefit analyses of cybersecurity investments (e.g. Gordon and Loeb Citation2006; Böhme Citation2010; Beissel Citation2016).
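
A minimal numerical sketch, assuming the class-1 breach probability function from Gordon and Loeb (Citation2002), $S(z, v) = v/(\alpha z + 1)^{\beta}$, where $z$ is investment and $v$ is vulnerability; all parameter values below are hypothetical. The optimal investment solves the first-order condition of the expected net benefit $[v - S(z, v)]L - z$ and, for this class, never exceeds $(1/e)vL$, a fraction of the expected loss:

```python
import math

def optimal_investment(v, L, alpha=1e-5, beta=1.0):
    """Class-1 Gordon-Loeb breach function S(z, v) = v/(alpha*z + 1)**beta.
    First-order condition: v*L*alpha*beta*(alpha*z + 1)**-(beta + 1) = 1."""
    z_star = ((v * L * alpha * beta) ** (1 / (beta + 1)) - 1) / alpha
    return max(z_star, 0.0)

v, L = 0.5, 1_000_000         # hypothetical vulnerability and potential loss
z = optimal_investment(v, L)
print(f'optimal investment: {z:,.0f}')
print(f'expected loss v*L:  {v * L:,.0f}')
print(f'ratio z/(v*L):      {z / (v * L):.3f}  (Gordon-Loeb bound: 1/e = {1 / math.e:.3f})')
```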

Depending on the functional form of the breach function, some highly vulnerable targets go unprotected, whereas another possibility is ‘bang-bang’ protection where low vulnerability targets receive zero protection and higher vulnerability targets receive maximum protection. In assessing this literature from the perspective of defense economics, one is struck by how the conclusions are at odds with Dresher’s (Citation1961) ‘no soft-spots’ principle for Colonel Blotto games with multiple targets. Under no soft-spots, investments are made proportionally so each protected target becomes equally attractive (in expectation) to the enemy.Footnote12 Furthermore, undefended targets are of lesser value to the enemy. Consequently, the enemy only attacks those targets that are defended. The difference is, of course, Dresher’s analysis takes into account the incentives and (re)actions of the opponent. By contrast, neither Becker’s probability of conviction nor Gordon and Loeb’s breach function depends upon the actions of malicious actors.

Moreover, in the same way terrorists substitute tactics in reaction to counterterror tactics (Enders and Sandler Citation1993), malicious actors can compromise multiple components of the attack surface at one time and alter tactics to select the most promising attack (Mussman and Turner Citation2018). For example, in their 2022 Global Threat Report, Crowdstrike estimates that 62% of attacks do not involve malware, so as to evade detection by antimalware products and automated machine learning. Instead, more attacks involve credential harvesting and remote code execution. In addition, Böhme and Moore (Citation2016) analyze a variation on this theme from the defender’s perspective, whereby a defender’s uncertainty about an attacker’s costs of breaching alternative components of the attack surface leads the defender to initially underinvest in protection in order to learn and adapt to the attacker’s modus operandi.

Game-Theoretics of Cybersecurity

With all this interdependence the logical next step is game-theoretic analysis. Hausken (Citation2002) represents an intermediate step between defense economics and cybersecurity. He reinterprets contribution aggregators within the context of network topology in order to characterize risk reduction interdependencies. For example, a serial network is similar to a perimeter defense such as a firewall to maintain security of a system. It is akin to a weakest-link public good with a coordination game structure. By contrast, a parallel network has built-in redundancy akin to a best-shot technology, resulting in either a Battle of the Sexes or Chicken game. Finally, if a network’s security is characterized by cumulative interdependencies (e.g. security stems from the sum of individual cybersecurity efforts across the attack surface), free riding can lead to a Prisoner’s Dilemma. The topology of vulnerability also affects how malicious actors combine their attacks (Borg Citation2005), with the objective of producing intensifier effects for parallel systems, cascading effects for interdependent systems, and multiplier effects for weakest-link systems.

Interdependent (Information) Security Games

The workhorse model for cybersecurity games has the economics of counterterror at its roots. The interdependent security game – IDS (Kunreuther and Heal Citation2003; Heal and Kunreuther Citation2005) – captures how security in networks such as airlines leads to mutually reinforcing investments, whereas security lapses lead to tipping or cascading into security anarchy. Interdependence is also a core property of networked information systems (Laszka, Felegyhazi, and Buttyan Citation2014). Notable extensions in the cybersecurity literature include modeling the IDS externality beyond ‘one hop’ (cascading or tipping) between users, and allowing users’ (in)security to propagate indirectly, depending on whether or not others in the network have been compromised rather than on the security investment made by network members. To date, propagation models borrow heavily from public health models of epidemics and inoculation and have yet to incorporate malware-specific propagation attributes of the type identified by Karyotis and Khouzani (Citation2016).
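
A minimal sketch of the underlying two-agent IDS game, under standard Kunreuther-Heal-style assumptions: an unprotected agent suffers a direct breach with probability p, and a breached unprotected partner passes contagion with probability q. The parameter values are hypothetical and chosen to land in the tipping region, where mutual investment and mutual non-investment are both equilibria:

```python
import itertools

def expected_cost(invest_i, invest_j, p=0.2, q=0.5, L=100.0, c=19.0):
    """Expected cost to agent i: investment cost (if any) plus expected loss
    from a direct breach (only if unprotected) or contagion from an
    unprotected, breached partner."""
    direct = 0.0 if invest_i else p
    contagion = 0.0 if invest_j else p * q
    prob_loss = 1 - (1 - direct) * (1 - contagion)
    return (c if invest_i else 0.0) + prob_loss * L

def pure_nash(**params):
    """Enumerate pure-strategy equilibria of the symmetric two-agent game."""
    eqs = []
    for si, sj in itertools.product([True, False], repeat=2):
        best_i = expected_cost(si, sj, **params) <= expected_cost(not si, sj, **params)
        best_j = expected_cost(sj, si, **params) <= expected_cost(not sj, si, **params)
        if best_i and best_j:
            eqs.append((si, sj))
    return eqs

# Both (invest, invest) and (not, not) survive: a coordination game in which
# one agent's lapse can tip the system into security anarchy.
print(pure_nash(p=0.2, q=0.5, L=100.0, c=19.0))
```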

At the same time, malicious actors are also playing an IDS game against each other. Florêncio and Herley (Citation2013) observe that while cybersecurity may be a public good for users, to the extent malicious actors are financially motivated their rewards are excludable and rival. Furthermore, the benefits of a successful individual attack are likely to be far less than the cost to the user. To be profitable, malicious actors must operate at scale and with low variable cost. We have all received a phishing email from a Nigerian prince and few of us have been victimized. Hence, what is a weakest-link game from the perspective of users is a sum of (users’) effort game for malicious actors because the requisite large-scale attacks must be profitable in expectation across the population of users. This is a novel departure from the canonical weakest-link defense scenario where a single defender with an n-dimensional attack surface faces a single attacker. Moreover, it introduces competition among attackers in a way that is different from competitive signaling via outbidding by terrorist groups.

Price of Anarchy

What all of these models have in common is they identify inefficiencies when cybersecurity is determined via individual rationality within a decentralized interdependent system. A notable contribution of computer science to game theory in general is the price of anarchy (Koutsoupias and Papadimitriou Citation1999), characterizing the upper bound on the degree of inefficiency. The price of anarchy (PoA) is the ratio of the welfare measure of the cooperative solution to that of the Nash equilibrium.Footnote13 For example, if PoA = 4 then the Nash equilibrium yields 25% of the welfare achieved by the cooperative solution. The PoA measures inefficiency relative to a worst-case scenario; it is a step beyond observing a Nash equilibrium may not be Pareto efficient. For example, many 2 × 2 games with proper names can be expressed in terms of cardinal payoffs $T$ (for temptation), $R$ (reward), $P$ (punishment), and $S$ (sucker). Using the utilitarian/Benthamite welfare measure, a 2 × 2 Prisoner’s Dilemma (where $T > R > P > S > 0$ and $2R > T + S$) has a PoA equal to $R/P$ ($= [R+R]/[P+P]$). By contrast, in a Chicken game the pure-strategy PoA is $2R/[T+S]$ if $2R \geq T + S$ and 1 (efficient) otherwise. For this welfare measure, the Prisoner’s Dilemma has a greater efficiency loss than Chicken.Footnote14 Computer scientists regard the price of anarchy as measuring the potential for welfare improvement via either regulation or mechanism design to get around inefficient outcomes stemming from decentralized competition.
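
These PoA calculations are mechanical once the welfare measure is fixed. A short sketch with hypothetical payoffs (using the Chicken ordering $T > R > S > P$ for the second game):

```python
def poa_prisoners_dilemma(T, R, P, S):
    """Utilitarian PoA for a Prisoner's Dilemma (T > R > P > S > 0, 2R > T + S):
    cooperative welfare is R + R; Nash (defect, defect) welfare is P + P."""
    return (R + R) / (P + P)  # = R / P

def poa_chicken(T, R, P, S):
    """Utilitarian PoA for Chicken (T > R > S > P): both pure-strategy Nash
    equilibria yield welfare T + S; cooperative welfare is max(2R, T + S)."""
    return max(2 * R, T + S) / (T + S)

print(poa_prisoners_dilemma(T=5, R=3, P=1, S=0.5))  # 3.0: Nash achieves 1/3 of welfare
print(poa_chicken(T=5, R=4, P=0.5, S=1))            # 1.33...: 2R/(T+S) since 2R > T+S
```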

Surveys

As of this writing the extant literature includes upwards of 40 surveys on the application of game theory to cybersecurity and a handful on game theory and blockchain. The majority of these are views from 35,000 feet. Notable exceptions include Bandyopadhyay and Reda (Citation2008), Cavusoglu, Raghunathan, and Yue (Citation2008), Manshaei et al. (Citation2013), Fielder et al. (Citation2014), Laszka, Felegyhazi, and Buttyan (Citation2014), and Fedele and Roner (Citation2022). Fedele and Roner (Citation2022) employ a unified framework spanning the partial equilibrium analysis of Gordon and Loeb (Citation2002) to game-theoretic extensions.

Cryptography

When it comes to cybersecurity, Shostack and Stewart (Citation2008) claim, ‘amateurs study cryptography; professionals study economics.’ Yet cryptography itself has economic implications. Both economics and cryptography deal with interactions of mutually distrustful agents. In economics, agents are rational. In cryptography, agents are either honest or malicious. These stereotypes should not be taken too seriously. For example, in economics agents can be boundedly rational or possess intrinsic/social preferences, with a wide berth given as to what this means. In cryptography, a similarly wide berth is given to the meaning of malicious agents (e.g. they may have no malicious intent but merely be faulty (error-prone)). A Byzantine player is a malicious agent with the objective of reducing the overall efficiency of the system, and therefore the likelihood of consensus.Footnote15 In addition, in cryptography agents are computationally-limited. Both disciplines have independent research programs into what it takes for agents to trust each other.

Honest and malicious agents can be modeled via the game-theoretic construct of a type space of players (Moscibroda, Schmid, and Wattenhofer Citation2006). Moreover, in cryptography a single adversary can coordinate the actions of malicious types to optimize a well-defined objective function. The computational issue is another matter. Economists acknowledge the computational issue that many games have multiple Nash equilibria. After decades of game-theoretic research into equilibrium refinements, little consensus exists other than using subgame perfection in extensive form games as a means to test for credible actions. Multiple equilibria aside, even though Nash equilibria exist under general conditions, finding a Nash equilibrium is computationally intractable (Chen, Deng, and Teng Citation2009; Daskalakis, Goldberg, and Papadimitriou Citation2009), with no efficient algorithm existing for finding or even approximating Nash equilibria (Rubinstein Citation2019). Hence, the question of how the parties involved arrive at a Nash equilibrium remains unanswered from a computational perspective.

This raises the issue of how to reach equilibrium via consensus. Economists have focal points, cheap talk, correlated equilibrium, and mechanism design. Cryptologists have protocols (processes to ensure trust). Two-factor authentication is an example of a protocol. Proof of work in blockchains is another example. Mechanism design and cryptography are duals in the following sense: mechanism design attempts to force the revelation of information, while cryptography attempts to allow its hiding (Shoham Citation2008). Yet a fundamental principle of cryptography is that hiding occurs in plain sight; cryptography does not ensure security through obscurity, i.e. by keeping the details of cryptographic algorithms secret. Instead it is the secret key (e.g. password) that, when combined with the algorithm, provides security (Kerckhoffs Citation1883). Public scrutiny of algorithms guards against reliance on flawed ones that reverse engineering would otherwise expose. An example of a representative question linking cryptography with game theory is whether an outcome achievable by an outside mediator (e.g. a correlation device) can be replaced by a protocol run by the agents themselves (Katz Citation2008).
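
A minimal sketch of a protocol in this spirit: a hash-based commitment in which the algorithm (SHA-256) is entirely public, per Kerckhoffs’ principle, and security rests only on the secret nonce. Function names are illustrative:

```python
import hashlib
import secrets

def commit(message: bytes) -> tuple[bytes, bytes]:
    """Commit to a message: publish the digest, keep the nonce secret.
    Nothing about the algorithm is hidden; only the nonce is."""
    nonce = secrets.token_bytes(32)
    digest = hashlib.sha256(nonce + message).digest()
    return digest, nonce

def reveal_ok(digest: bytes, nonce: bytes, message: bytes) -> bool:
    """Anyone can verify an opened commitment using the public algorithm."""
    return hashlib.sha256(nonce + message).digest() == digest

digest, nonce = commit(b'attack at dawn')
print(reveal_ok(digest, nonce, b'attack at dawn'))  # True
print(reveal_ok(digest, nonce, b'attack at noon'))  # False
```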

Cryptography, the art of secret-preserving protocols or the breaking of such protocols, further increases complexity. The objective is for the protocol to be secure so that all honest agents reach consensus and no information is leaked to malicious agents. A protocol is $\lambda$-secure if the adversary does not possess sufficient resources to prohibit honest players from reaching consensus in time polynomial in $\lambda$.Footnote16 Hence, honest agents are computationally bounded in that it may take time polynomial in $\lambda$ to reach consensus, and the adversary is both time-limited and resource-constrained. One way to think about the adversary’s resource constraint is that, if there are $N$ players in the game, the adversary at best controls collusions of malicious agents of cardinality $k$, where $k < N$. This is clearly related to coalition-proofness in game theory. Results on security protocols usually characterize upper bounds on $k$ as a linear fraction of $N$ (Dodis and Rabin Citation2007).
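
As an illustration of such a linear-fraction bound, classic Byzantine agreement requires $N \geq 3k + 1$ (equivalently, $k < N/3$) for honest players to reach consensus; a one-function sketch:

```python
def consensus_feasible(n_players: int, k_malicious: int) -> bool:
    """Classic Byzantine agreement bound: consensus among honest players is
    achievable only if the adversary controls fewer than a third of the
    agents, i.e. n >= 3k + 1."""
    return n_players >= 3 * k_malicious + 1

for n, k in [(10, 3), (9, 3), (100, 33)]:
    print(n, k, consensus_feasible(n, k))  # True, False, True
```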

One way to reconcile the rationality requirements of game theory with the limited computational environments of cryptography is through ‘as if’ equilibrium concepts. That is, a game can be solved as if it is a computationally complex problem. An example is $\varepsilon$-equilibrium, where $\varepsilon$ is a function of $\lambda$, $\varepsilon = \varepsilon(\lambda)$, and all strategies must be computable in time polynomial in $\lambda$. For payoff function $\Pi_i$, equilibrium strategy $s_i^*$ for player $i$, and strategy vector $s_{-i}^*$ for all players other than $i$, the requirement becomes $\Pi_i(s_i^*, s_{-i}^*) \geq \Pi_i(s_i, s_{-i}^*) - \varepsilon(\lambda)$ for every player $i$ and every alternative strategy $s_i$. A recent alternative criterion in this vein is pseudo-stable equilibrium (Alwen, Psomas, and Zikas Citation2021).
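
A minimal sketch checking the $\varepsilon$-equilibrium condition in a toy 2 × 2 game; the payoffs are hypothetical, and $\varepsilon$ is passed directly rather than derived from a security parameter $\lambda$:

```python
def is_epsilon_equilibrium(payoffs, profile, eps):
    """True if no player gains more than eps by a unilateral deviation.
    payoffs[player][(s0, s1)] gives that player's payoff at a profile."""
    for player in (0, 1):
        current = payoffs[player][profile]
        for deviation in (0, 1):
            trial = list(profile)
            trial[player] = deviation
            if payoffs[player][tuple(trial)] > current + eps:
                return False
    return True

# Profile (0, 0) is not an exact Nash equilibrium (player 0 gains 0.1 by
# deviating), but it satisfies the condition for eps = 0.1.
payoffs = {
    0: {(0, 0): 1.0, (0, 1): 0.0, (1, 0): 1.1, (1, 1): 0.0},
    1: {(0, 0): 1.0, (0, 1): 1.05, (1, 0): 0.0, (1, 1): 0.0},
}
print(is_epsilon_equilibrium(payoffs, (0, 0), eps=0.1))   # True
print(is_epsilon_equilibrium(payoffs, (0, 0), eps=0.01))  # False
```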

In a remarkably prescient analysis, Young and Yung (Citation1996) discuss how cryptography can go awry; i.e. be used offensively for malicious ends. Their definition of cryptography is, 'difference in computational capacity.' Instead of providing defensive benefits (privacy, authentication, and secrecy), cryptography can be used to mount extortion-based attacks that cause loss of access to information, loss of confidentiality, and information leakages; tasks cryptography typically prevents.

Among several potentialities enumerated by Young and Yung, I note:

A cryptovirus (cryptotrojan) is a computer virus (Trojan horse) that uses a public key generated by the author to encrypt data D that resides on the host system in such a way that D can only be recovered by the author of the virus … Immediately following encryption, the cryptovirus notifies the user and demands that the user contact the virus writer. Once contacted, the virus writer demands a ransom in return for the private key.

Young and Yung effectively foretell the advent of ransomware. The pecking order of ransomware targeting stands in stark contrast to that of terrorism. Terrorism’s historic pecking order is down-level: first governments, then businesses, now civilians (Brandt and Sandler Citation2010). In contradistinction, ransomware is moving up-level from grandparents’ outdated PCs and small businesses to becoming a big game hunting concern, with victims such as universities, municipalities, and large corporations. Ability to pay, inability to sustain downtime, and working on-the-fly with data (preventing frequent backups) are desirable traits of ransomware targets (e.g. hospitals). Moreover, Arce (Citation2022) argues CSP encryption of user data is tantamount to legal ransomware via security-induced lock-in. Encrypted data is neither portable nor interoperable, thereby making it difficult for users to switch CSPs.

Public key infrastructure (PKI) remains a weak link for electronic authentication and validation of users and devices (Jenkinson Citation2020). PKI certificates validate decryption privileges. Weak, compromised, unauthorized, expired, revoked, or stolen digital certificates can lead to man-in-the-middle attacks (National Security Agency (NSA) Citation2021). In 2011 Iran compromised third-party certificates to monitor Iranian Gmail accounts. Hijacked insecure and unmanaged domains of certificate authorities (such as SolarWinds) serve similar purposes. Obsolete key exchange methods and transmission channels provide a false sense of security. 'Nation-state and sufficiently resourced actors are able to exploit these weak communications' (National Security Agency (NSA) Citation2021). Furthermore, zero-trust critically relies on up-to-date PKI.

Conclusion

This study highlights commonalities between defense economics and cybersecurity as subfields of public economics. Theoretical tools in both fields include microeconomics and game theory. These tools enable analyses of phenomena present in both milieus: public goods, externalities, commons, incentives, interdependent security, and inefficiency of decentralized decision making. Cyberspace is the battlespace, with much of cyberspace having the structure of platforms (two-sided markets). Cyber war, cyberterrorism, deterrence, and disinformation are modes of behavior within the battlespace. Table 3 further summarizes these commonalities.

Table 3. Cybersecurity and defense economics commonalities.

Cybersecurity policy is currently stymied by the conundrum of organized eCrime groups seeking safe haven in jurisdictions whose entities they refrain from targeting with exploits such as ransomware. For defense economists this parallels the paid-riding phenomenon associated with terrorist havens in northern Mediterranean nations several decades ago.

Deterrence remains a major concern – not because of difficulties in attributing attacks to an adversary – but because a threatened cyber response must itself be visible and attributable. Such threats reveal tradecraft. Cyber deterrence likely requires a hybrid response. Overall, the defense-cyber nexus is an area for future research. Studies of note on hybrid uses of cyber tactics include Liang and Xiangsui (Citation1999), Kostyuk and Zhukov (Citation2019), Lindsay and Gartzke (Citation2019), and Balcaen, Du Bois, and Buts (Citation2022a, Citation2022b). Hybrid conflict is akin to rock-paper-scissors in that each actor has a superior countermove in an alternative domain given sufficient intelligence about their adversary's intentions (i.e. a second-mover advantage exists).

By embracing platform economics, defense economists can bring to bear their knowledge of the logic and benefits of security. A secure platform is the gateway to the private benefits accrued by all platform participants. When security is breached, these benefits are diminished or lost. Cybersecurity both determines a platform’s competitive environment and must meet economic criteria in addition to technical conditions.

Finally, a cursory check of EconLit reveals the economics profession has yet to fully engage with cybersecurity even though it is fertile ground where critical issues are economic at their core and affect our daily lives. At the same time, the Workshop on the Economics of Information Security (WEIS) is into its third decade of existence. Moreover, Swire (Citation2018) estimates 70% of future jobs in cybersecurity will not involve coding. To borrow a phrase from Hardin (Citation1968), ‘there is no technical solution.’ Cybersecurity is an area ripe for defense economists to ply their trade.

Acknowledgement

* I thank Rebecca Cordell, Daniel Woods, participants in the 2022 ICES conference, and two anonymous referees for comments.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1. Confidentiality refers to the degree to which data/information is restricted to authorized personnel. Integrity refers to the degree to which data/information can only be altered and transmitted by authorized personnel. Availability refers to the degree to which data/information is robustly accessible to authorized personnel.

2. The equivalence of cybersecurity and information security is made because the literature crosses both areas. Moreover, some see cybersecurity as a subset of information security because not all information is held digitally. Others see information security as a subset of cybersecurity because the former does not include non-information assets such as the protection of critical infrastructure. See von Solms and van Niekerk (Citation2013).

3. Gilad, Pech, and Tishler (Citation2021) define APTs as highly-skilled and well-funded (possibly nation-sponsored) groups of cyber-attackers keen on long-term operational efforts and endowed with technological sophistication and R&D capabilities.

4. More recently, the DoD’s Joint Publication 3-12 (2018) defines cyberspace as, “the domain within the information environment that consists of the interdependent network of information technology (IT) infrastructures and resident data. It includes the Internet, telecommunications networks, computer systems, and embedded processors and controllers.”

5. NSA notified Microsoft about the vulnerability and Microsoft issued a patch before Shadow Brokers’ release of the key.

6. See the Mitre ATT&CK wiki: https://attack.mitre.org/.

7. An unverified earlier claim exists for a 1982 explosion of a Soviet pipeline said to have been caused by the CIA inserting malicious code into the supervisory control and data acquisition (SCADA) software (Reed Citation2005).

8. E.g. see Schneier’s (Citation2018) discussion of differential cryptanalysis.

9. Lindsay (Citation2015) provides an accessible game-theoretic treatment of deterrence under imperfect attribution.

10. A botnet is a network of hijacked internet devices (bots) under the command and control of a malicious actor. Users may be unaware their device is part of a botnet. DDoS is a distributed denial-of-service attack in which a massive number of botnet requests takes a website down. Another use of botnets is click fraud.

11. Heitzenrater, Taylor, and Simpson (Citation2015) provide complementary screening and signaling model analyses.

12. Equivalently, the defender uses a mixed strategy such that the attacker is indifferent between its targets.

13. If payoffs are measured in terms of costs, then the inverse of this ratio is used.

14. Alternatively, using the Rawlsian welfare criterion (maximizing the minimum payoff over all outcomes) yields the same PoA for the Prisoner’s Dilemma but a different PoA of R/P for Chicken, thereby making the games equally inefficient. This is the case even though the pure Nash equilibria of Chicken are Pareto efficient.

15. Blockchains are both distributed ledgers and solutions to Lamport, Shostak, and Pease’s (Citation1982) Byzantine generals problem.

16. If n is the number of bits needed as input to solve the problem, then time polynomial in λ means it may take time $n^{\lambda}$ in the limit to solve the problem. Polynomial time is regarded as a reasonable amount of complexity.

References

  • Adaros Boye, C., P. Kearney, and M. Josephs 2018. “Collective Responsibility and Mutual Coercion in IoT Botnets. A Tragedy of the Commons Problem.” Proceedings of the 15th International Conference on eBusiness and Telecommunications (ICETE), Porto, Portugal. 1: 470–480.
  • Booz Allen Hamilton. 2020. “Bearing Witness: Uncovering the Logic behind Russian Military Cyber Operations.”
  • Alwen, J., A. Psomas, and V. Zikas. 2021. “Pseudo-stability: A Crypto-Friendly Equilibrium Notion.” Presented at the Workshop on Distributed Ledgers and Economics. July. Stony Brook, NY: Stony Brook Center for Game Theory.
  • Anderson, R. 2001. “Why Information Security Is Hard – An Economic Perspective.” Proceedings of the Seventeenth Annual Computer Security Applications Conference. New Orleans: IEEE, pp.358–365.
  • Anderson, R., and T. Moore. 2006. “The Economics of Information Security.” Science 314 (5799): 610–613. doi:10.1126/science.1130992.
  • Arce, D. G. 2018. “Malware and Market Share.” Journal of Cybersecurity 4 (1): 1–6. doi:10.1093/cybsec/tyy010.
  • Arce, D. G. 2020. “Cybersecurity and Platform Competition in the Cloud.” Computers & Security 93: 1–8. doi:10.1016/j.cose.2020.101774.
  • Arce, D. G. 2022. “Security-Induced Lock-In in the Cloud.” Business and Information Systems Engineering, preprint.
  • Arce, D. G., and T. Sandler. 2001. “Transnational Public Goods: Strategies and Institutions.” European Journal of Political Economy 17 (3): 493–516. doi:10.1016/S0176-2680(01)00042-8.
  • Arquilla, J., and D. Ronfeldt. 1993. “Cyberwar Is Coming!” Comparative Strategy 12 (2): 141–165. doi:10.1080/01495939308402915.
  • Ayrour, Y., A. Raji, and M. Nassar. 2018. “Modeling Cyber-Attacks: A Survey Study.” Network Security 3 (March): 13–19. doi:10.1016/S1353-4858(18)30025-4.
  • Baker, S. 2012. “Cybersecurity and the Attribution Problem: Good News at Last?” https://www.skatingonstilts.com/skating-on-stilts/2012/10/my-entry.html Accessed 21 September 2022.
  • Balcaen, P., C. Du Bois, and C. Buts. 2022a. “A Game-theoretic Analysis of Hybrid Threats.” Defence and Peace Economics 33 (1): 26–41. doi:10.1080/10242694.2021.1875289.
  • Balcaen, P., C. Du Bois, and C. Buts. 2022b. “The Hybridisation of Conflict: A Prospect Theoretic Analysis.” Games 12. article 81.
  • Bandyopadhyay, T., and S. Reda 2008. “Countering Cyber Terrorism: Investment Models under Decision and Game Theoretic Frameworks.” 2008 Proceedings of the Southern Association for Information Systems Conference. Richmond, VA: SAIS, 36.
  • Bauer, J. M., and M. Van Eeten. 2011. “Introduction to the Economics of Cybersecurity.” Communications & Strategies 81: 12–21.
  • Becker, G. S. 1962. “Crime and Punishment: An Economic Approach.” Journal of Political Economy 76 (2): 169–217. doi:10.1086/259394.
  • Beissel, S. 2016. Cybersecurity Investments: Decision Support Under Economic Aspects. Switzerland: Springer.
  • Böhme, R. 2010. “Security Metrics and Security Investment Models.” In Advances in Information and Computer Security, IWSEC 2010, LNCS 6364, edited by I. Echizen, 10–24. Berlin: Springer.
  • Böhme, R., and T. Moore. 2016. “The ‘Iterated Weakest Link’ Model of Adaptive Security Investment.” Journal of Information Security 7 (2): 81–102. doi:10.4236/jis.2016.72006.
  • Borg, S. 2005. “Economically Complex Cyberattacks.” IEEE Security & Privacy 3 (4): 64–67. doi:10.1109/MSP.2005.146.
  • Brandt, P. T., and T. Sandler. 2010. “What Do Transnational Terrorists Target? Has It Changed? are We Safer?” Journal of Conflict Resolution 54 (2): 214–236. doi:10.1177/0022002709355437.
  • Briscoe, B., A. Odlyzko, and B. Tilly. 2006. “Metcalfe’s Law Is Wrong.” IEEE Spectrum 43 (7): 34–39. doi:10.1109/MSPEC.2006.1653003.
  • Cavusoglu, H., S. Raghunathan, and W. T. Yue. 2008. “Decision-Theoretic and Game-Theoretic Approaches to IT Security Investment.” Journal of Management Information Systems 25 (2): 281–304. doi:10.2753/MIS0742-1222250211.
  • Chen, T. M. 2010. “Stuxnet, the Real Start of Cyber Warfare?” IEEE Network 24 (6): 2–3.
  • Chen, X., X. Deng, and S.-H. Teng. 2009. “Settling the Complexity of Two-Player Nash Equilibria.” Journal of the ACM 56 (3): Article 14, 1–57. doi:10.1145/1516512.1516516.
  • Chen, J., T. Ma, and P. Wei. 2013. “Study of Cyberspace Factors and Description Methods.” Applied Mechanics and Materials 427-429: 2477–2480. https://doi.org/10.4028/www.scientific.net/AMM.427-429.2477.
  • Chien, E. 2010. “Stuxnet: A Breakthrough.” Symantec (blog) 12 November 2010 https://community.broadcom.com/symantecenterprise/communities/community-home/librarydocuments/viewdocument?DocumentKey=550505c5-c38a-4e0c-b590-f731bb3a60ad&CommunityKey=1ecf5f55-9545-44d6-b0f4-4e4a7f5f5e68&tab=librarydocuments Accessed 21 September 2022.
  • Clarke, R. A., and R. K. Knake. 2019. The Fifth Domain. New York: Penguin.
  • Clement, N., and D. Arce 2022. “Dynamics of Shared Security in the Cloud.” Working Paper.
  • CrowdStrike 2022. Global Threat Report.
  • Daskalakis, C., P. W. Goldberg, and C. Papadimitriou. 2009. “The Complexity of Computing a Nash Equilibrium.” Communications of the ACM 52 (2): 89–97.
  • Deakin, R. S. 2010. Battlespace Technologies: Network-Enabled Information Dominance. Norwood, MA: Artech House.
  • Denning, D. 2000. Cyberterrorism: Testimony Before the Special Oversight Panel on Terrorism. U.S. House of Representatives, 23 May 2000. https://irp.fas.org/congress/2000_hr/00-05-23denning.htm
  • DiMaggio, J. 2022. The Art of Cyberwarfare. San Francisco: No Starch Press.
  • Dodis, Y., and T. Rabin. 2007. “Cryptography and Game Theory.” In Algorithmic Game Theory, edited by N. Nisan, 181–205. New York: Cambridge University Press.
  • Dresher, M. 1961. Games of Strategy: Theory and Applications. Santa Monica: RAND.
  • Dunne, J. P., and E. Sköns. 2021. “New Technology and the U.S. Military Industrial Complex.” Economics of Peace and Security Journal 16 (2): 5–17.
  • Enders, W., and T. Sandler. 1993. “The Effectiveness of Anti-Terrorism Policies: A Vector-Autoregression-Intervention Analysis.” American Political Science Review 87 (4): 829–844. doi:10.2307/2938817.
  • Fedele, A., and C. Roner. 2022. “Dangerous Games: A Literature Review on Cybersecurity Investments.” Journal of Economic Surveys 36 (1): 157–187. doi:10.1111/joes.12456.
  • Fielder, A., E. Panaousis, P. Malacaria, C. Hankin, and F. Smeraldi. 2014. “Game Theory Meets Information Security Management.” In ICT Systems Security and Privacy Protection. IFIP Advances in Information and Communication Technology, edited by N. Cuppens-Boulahia, F. Cuppens, and J. Jajodia. Vol. 428. 15–29. Berlin: Springer.
  • Florêncio, D., and C. Herley. 2013. “Where Do All the Attacks Go?” In Economics of Information Security and Privacy III, edited by B. Schneier, 13–33, New York: Springer.
  • Gaibulloev, K., C. Kollias, and B. Solomon. 2020. “Defence and Peace Economics: The Third Decade in Retrospect.” Defence and Peace Economics 31 (4): 377–386. doi:10.1080/10242694.2020.1761221.
  • Garcia, A., Y. Sun, and J. Shen. 2014. “Dynamic Platform Competition with Malicious Users.” Dynamic Games & Applications 4 (3): 290–308. doi:10.1007/s13235-013-0102-y.
  • Geer, D., E. Jardine, and E. Leverett. 2020. “On Market Concentration and Cybersecurity Risk.” Journal of Cyber Policy 5 (1): 9–29. doi:10.1080/23738871.2020.1728355.
  • Gilad, A., E. Pech, and A. Tishler. 2021. “Intelligence, Cyberspace, and National Security.” Defence and Peace Economics 32 (1): 18–45. doi:10.1080/10242694.2020.1778966.
  • Gilman, E., and D. Barth. 2017. Zero Trust Networks. Boston: O’Reilly.
  • Gordon, S., and R. Ford. 2002. “Cyberterrorism?” Computers & Security 21 (7): 636–647. doi:10.1016/S0167-4048(02)01116-1.
  • Gordon, L. A., and M. P. Loeb. 2002. “The Economics of Information Security Investment.” ACM Transactions on Information and System Security 5 (4): 438–457. doi:10.1145/581271.581274.
  • Gordon, L. A., and M. P. Loeb. 2006. Managing Cybersecurity Resources: A Cost-Benefit Analysis. New York: McGraw-Hill.
  • Gordon, L. A., M. P. Loeb, W. Lucyshyn, and L. Zhou. 2015. “Externalities and the Magnitude of Cyber Security Underinvestment by Private Sector Firms: A Modification of the Gordon-Loeb Model.” Journal of Information Security 6 (1): 24–30. doi:10.4236/jis.2015.61003.
  • Hadnagy, C. 2018. Social Engineering: The Science of Human Hacking. Indianapolis: Wiley.
  • Hardin, G. 1968. “The Tragedy of the Commons.” Science 162 (3859): 1243–1248. doi:10.1126/science.162.3859.1243.
  • Hausken, K. 2002. “Probabilistic Risk Analysis and Game Theory.” Risk Analysis 22 (1): 17–27. doi:10.1111/0272-4332.t01-1-00002.
  • Hausken, K. 2006. “Returns to Information Security Investment: The Effect of Alternative Information Security Breach Functions on Optimal Investment and Sensitivity to Vulnerability.” Information Systems Frontiers 8 (5): 338–349. doi:10.1007/s10796-006-9011-6.
  • Heal, G., and H. Kunreuther. 2005. “IDS Models of Airline Security.” Journal of Conflict Resolution 49 (2): 201–217. doi:10.1177/0022002704272833.
  • Herley, C., and D. Florêncio 2008. “A Profitless Endeavor: Phishing as Tragedy of the Commons.” Proceedings of the 2008 New Security Paradigms Workshop. New York: ACM, pp.59–70.
  • Heitzenrater, C., G. Taylor, and A. Simpson. 2015. “When the Winning Move Is Not to Play: Games of Deterrence in Cyber Security.” In Decision and Game Theory for Security, GameSec 2015, LNCS 9406, edited by M. H. R. Khouzani, et al., 250–269. London: Springer.
  • Hirshleifer, J. 1983. “From Weakest-Link to Best-Shot: The Voluntary Provision of Public Goods.” Public Choice 41 (3): 371–386. doi:10.1007/BF00141070.
  • Howard, R. 2020. “Cybersecurity First Principles.” Cyberwire Pro Podcast, 11 May 2020. https://thecyberwire.com/pro/podcasts
  • Hutchins, E., M. Cloppert, and R. Amin 2011. “Intelligence-Driven Computer Network Defense by Analysis of Adversary Campaigns and Intrusion Kill Chains.” In Proceedings of the 6th International Conference on Information Warfare and Security, edited by L. Armistead, 113–125, Academic Publishing International.
  • Jegers, M., and L. Van Hove. 2020. “Malware and Market Share: A Comment on Arce.” Journal of Cybersecurity 6 (1): tyaa024. doi:10.1093/cybsec/tyaa024.
  • Jenkinson, A. 2020. “The History of Cryptography and the Modern Enigma of Digital Certificates.” Digital Forensics Magazine, August, pp.8–12.
  • Karyotis, V., and M. H. R. Khouzani. 2016. Malware Diffusion Models for Modern Complex Networks: Theory and Applications. Cambridge, MA: Morgan Kaufmann.
  • Katz, J. 2008. “Bridging Game Theory and Cryptography: Recent Results and Future Directions.” In Theory of Cryptography. TCC 2008. LNCS, edited by R. Canetti, 251–272. Vol. 4948. Berlin: Springer.
  • Kerckhoffs, A. 1883. “La Cryptographie Militaire.” Journal des Sciences Militaires 9: 161–191.
  • Klein, J. J. 2015. “Deterring and Dissuading Cyberterrorism.” Journal of Strategic Security 8 (4): Article 2.
  • Kostyuk, N., and Y. M. Zhukov. 2019. “Invisible Digital Front: Can Cyber Attacks Shape Battlefield Events?” Journal of Conflict Resolution 62 (2): 317–343.
  • Koutsoupias, E., and C. Papadimitriou. 1999. “Worst Case Equilibria.” In Annual Symposium on the Theoretical Aspects of Computer Science, STACS 1999. Lecture Notes in Computer Science, edited by C. Meinel and S. Tison, 404–413. Vol. 1563. Berlin: Springer.
  • Kreps, D. M. 2019. Microeconomics for Managers. Second ed. Princeton: Princeton University Press.
  • Kunreuther, H., and G. Heal. 2003. “Interdependent Security.” Journal of Risk and Uncertainty 26 (2/3): 231–249. doi:10.1023/A:1024119208153.
  • Lamport, L., R. Shostak, and M. Pease. 1982. “The Byzantine Generals Problem.” ACM Transactions on Programming Languages and Systems 4 (3): 382–401. doi:10.1145/357172.357176.
  • Laszka, A., M. Felegyhazi, and L. Buttyan. 2014. “A Survey of Interdependent Information Security Games.” ACM Computing Surveys 47 (2): Article 23.
  • Lee, D. 1988. “Free Riding and Paid Riding in the Fight against Terrorism.” American Economic Review, Papers & Proceedings 78 (2): 22–26.
  • Lee, D., and T. Sandler. 1989. “On the Optimal Retaliation against Terrorists: The Paid Rider Option.” Public Choice 61 (2): 141–152. doi:10.1007/BF00115660.
  • Lehto, M. 2018. “The Modern Strategies in the Cyber Warfare.” In Cyber Security: Power and Technology, Intelligent Systems, Control and Automation: Science and Engineering 93, edited by M. Lehto and P. Neittaanmäki, 3–20, London: Springer Nature.
  • Lemay, A., J. Calvet, F. Menet, and J. M. Fernandez. 2018. “Survey of Publicly Available Reports on Advanced Persistent Threat Actors.” Computers & Security 72: 26–59. doi:10.1016/j.cose.2017.08.005.
  • Liang, Q., and W. Xiangsui. 1999. Unrestricted Warfare. Brattleboro, VT: Echo Point.
  • Lindsay, J. R. 2015. “Tipping the Scales: The Attribution Problem and the Feasibility of Deterrence against Cyberattack.” Journal of Cybersecurity 1 (1): 53–67.
  • Lindsay, J. R., and E. Gartzke, eds. 2019. Cross-Domain Deterrence: Strategy in an Era of Complexity. Oxford: Oxford University Press.
  • Lindsay, J. R., and E. Gartzke. 2022. “Politics by Many Other Means: The Comparative Strategic Advantages of Operational Domains.” Journal of Strategic Studies 45 (5): 743–776. doi:10.1080/01402390.2020.1768372.
  • Littlewood, B., S. Brocklehurst, N. Felton, P. Mellor, S. Paige, S. Wright, J. Dobson, J. McDermid, and D. Gollmann. 1993. “Towards Operational Measures of Computer Security.” Journal of Computer Security 2 (3): 211–229. doi:10.3233/JCS-1993-22-308.
  • Lundbohm, E. 2017. “Understanding Nation-State Attacks.” Network Security 10 (Oct): 5–8. doi:10.1016/S1353-4858(17)30101-0.
  • Manshaei, M. H., Q. Zhu, T. Alpcan, T. Başar, and J.-P. Hubaux. 2013. “Game Theory Meets Network Security and Privacy.” ACM Computing Surveys 45 (3): 1–39. doi:10.1145/2480741.2480742.
  • Meiklejohn, S., M. Pomarole, G. Jordan, K. Levchenko, D. McCoy, G. M. Voelker, and S. Savage. 2013. “A Fistful of Bitcoins: Characterizing Payments among Men with No Names.” Proceedings of the 2013 Conference on Internet Measurement, IMC ’13. New York: ACM, pp.127–139.
  • Moscibroda, T., S. Schmid, and R. Wattenhofer 2006. “When Selfish Meets Evil: Byzantine Players in a Virus Inoculation Game.” Proceedings of the 25th ACM Symposium on Principles of Distributed Computing, PODC 2006. New York: ACM, pp.35–44.
  • Mougayar, W. 2016. The Business Blockchain. Hoboken, NJ: Wiley.
  • Musin, T. 2021. “Estimation of Global Public IaaS Market Concentration by Linda Index.” SHS Web of Conferences, Moscow. 114: 1–10.
  • Musman, S., and A. Turner. 2018. “A Game Theoretic Approach to Cyber Security Risk Management.” Journal of Defense Modeling and Simulation: Applications, Methodology, Technology 15 (2): 127–146. doi:10.1177/1548512917699724.
  • National Security Agency (NSA) 2021. “Eliminating Obsolete Transport Layer Security (TLS) Protocol Configurations.” U|00|197443-20|PP-20-1302|JAN 2021 Ver 1.0.
  • Plotnek, J. J., and J. Slay. 2021. “Cyber Terrorism: A Homogenized Taxonomy and Definition.” Computers & Security 102: article 102145.
  • Powell, B. 2005. “Is Cybersecurity a Public Good? Evidence from the Financial Services Industry.” Journal of Law, Economics, and Policy 1 (2): 497–510.
  • Preibusch, S., and J. Bonneau. 2010. “The Password Game: Negative Externalities from Weak Password Practices.” In Decision and Game Theory for Security, GameSec 2010, LNCS 6442, edited by T. Alpcan, L. Buttyán, and J. S. Baras, 192–207, Berlin: Springer.
  • Reed, T. C. 2005. At the Abyss. New York: Random House.
  • Rid, T. 2020. Active Measures. The Secret History of Disinformation and Political Warfare. New York: Farrar, Straus and Giroux.
  • Rose, N. 2013. “Shaping the Future Battlespace: Offensive Cyber Warfare Tools for the Planner.” Australian Army Journal 10 (4): 53–68.
  • Rubinstein, A. 2019. Hardness of Approximation between P and NP. Berkeley: Association for Computing Machinery and Morgan & Claypool.
  • Schneier, B. 2018. Click Here to Kill Everybody. New York: Norton.
  • Sen, S., R. Guerin, and K. Hosanagar. 2011. “Functionally-Rich versus Minimalist Platforms: A Two-Sided Market Analysis.” ACM SIGCOMM Computer Communication Review 41 (5): 36–43. doi:10.1145/2043165.2043171.
  • Sen, R., A. Verma, and G. R. Heim. 2020. “Impact of Cyberattacks by Malicious Hackers on the Competition in Software Markets.” Journal of Management Information Systems 37 (1): 191–216. doi:10.1080/07421222.2019.1705511.
  • Shoham, Y. 2008. “Computer Science and Game Theory.” Communications of the ACM 51 (8): 75–79. doi:10.1145/1378704.1378721.
  • Shostack, A., and A. Stewart. 2008. The New School of Information Security. Boston: Pearson.
  • Soesanto, S. 2022. The IT Army of Ukraine. Structure, Tasking, and Ecosystem. ETH Zurich: Center for Security Studies (CSS).
  • Stewart, A. J. 2021. A Vulnerable System: The History of Information Security in the Computer Age. Ithaca, NY: Cornell University Press.
  • Stiennon, R. 2015. “A Short History of Cyber Warfare.” In Cyber Warfare, edited by J. A. Green, 7–32, New York: Routledge.
  • Swire, P. 2018. “Privacy and Security. A Pedagogical Cybersecurity Framework.” Communications of the ACM 61 (1): 23–26. doi:10.1145/3267354.
  • Tafoya, W. L. 2011. “Cyber Terror.” FBI Law Enforcement Bulletin 80 (1): 1–7.
  • Theisen, C., N. Munaiah, M. Al-Zyoud, J. C. Carver, A. Meneely, and L. Williams. 2018. “Attack Surface Definitions: A Systematic Literature Review.” Information and Software Technology 104: 94–103.
  • U.S. State Department 2019. “World Military Expenditures and Arms Transfers 2019 Report.” https://2017-2021.state.gov/world-military-expenditures-and-arms-transfers-2019/index.html
  • U.S. Treasury Department 2016. “Guidance Concerning Stand-Alone Cyber Liability Insurance Policies Under the Terrorism Risk Insurance Program.” https://www.federalregister.gov/documents/2016/12/27/2016-31244/guidance-concerning-stand-alone-cyber-liability-insurance-policies-under-the-terrorism-risk
  • Varian, H. 2004. “System Reliability and Free Riding.” In Economics of Information Security, edited by L. J. Camp and S. Lewis, 1–15, Norwell, MA: Kluwer Academic.
  • Vasek, M., J. Wadleigh, and T. Moore. 2016. “Hacking Is Not Random: A Case-Control Study of Webserver-Compromise Risk.” IEEE Transactions on Dependable and Secure Computing 13 (2): 206–219. doi:10.1109/TDSC.2015.2427847.
  • von Solms, R., and J. van Niekerk. 2013. “From Information Security to Cyber Security.” Computers & Security 38: 97–102. doi:10.1016/j.cose.2013.04.004.
  • Whaley, B. 1982. “Toward a General Theory of Deception.” Journal of Strategic Studies 5 (1): 178–192. doi:10.1080/01402398208437106.
  • Wilner, A. 2020. “US Cyber Deterrence: Practice Guiding Theory.” Journal of Strategic Studies 43 (2): 245–280. doi:10.1080/01402390.2018.1563779.
  • Young, A., and M. Yung 1996. “Cryptovirology: Extortion-Based Security Threats and Countermeasures.” Proceedings of the 1996 IEEE Symposium on Security and Privacy. Oakland, CA: IEEE, pp.129–140.
  • Zetter, K. 2011. “How Digital Detectives Deciphered Stuxnet, the Most Menacing Malware in History.” Ars Technica, July 11. https://arstechnica.com/tech-policy/2011/07/how-digital-detectives-deciphered-stuxnet-the-most-menacing-malware-in-history/ Accessed 21 September 2022.