
OTHER-THAN-INTERNET (OTI) CYBERWARFARE: CHALLENGES FOR ETHICS, LAW, AND POLICY

Pages 34-53 | Published online: 17 Apr 2013

Abstract

Almost all discussions of cyberwarfare, other cyber-attacks, and cyber-espionage have focused entirely on the Internet as the chief means of damage: the Internet as a ‘vector,’ to use a term from the theory of infectious diseases. However, there are a variety of means, some of which have already been used, that involve cyber-exploitation using vectors other than the Internet. Malware can be installed in the integrated circuits of computers and servers, but also in any devices attached to them: thumb drives, CDs, printers, scanners, and so on. One can also use various forms of electromagnetic radiation, at a distance of meters or more, to exfiltrate data or to infiltrate corrupted data. I call this large and diverse family of unwanted interference with the functioning of information processing systems other-than-Internet (OTI) attacks on information systems. Stuxnet was an OTI attack, as, probably, was the 2007 Israeli corruption of Syrian air defenses. Such OTI techniques are more difficult to develop than ordinary malware, requiring electronic manufacturing facilities or novel technologies, and are thus accessible only to larger corporations and technologically sophisticated countries. Particularly vulnerable would be countries and regions (such as the United States and Europe) whose information processing devices are mainly produced abroad. Once exploitations via the Internet become harder to perpetrate, OTI exploitations are certain to grow dramatically, eventually requiring equipment for critical uses to be expensively fabricated in, and transported using, secure facilities; expensive detection measures will also have to be instituted. This will create challenges for policy, law, and ethics, as well as greatly increase the cost of many electronic devices.

Notes

1. The flaws in this and similar definitions are discussed below and in Dipert (forthcoming b). Since the key noun is ‘information,’ this definition may not literally include damage to the functioning of an information processing system.

2. Espionage and theft of intellectual property may, once known, cause the spied-upon country to take expensive countermeasures, such as an enhanced counter-espionage program or the abandonment of a novel weapon system once its vulnerabilities are made known. However, these costs are chosen by that nation, and so the espionage does not directly and necessarily cause harm. For a sample of the rapidly increasing literature on the ethics of intelligence gathering, see Goldman (2009).

3. This consideration was urged on me by Don Howard.

4. We have already entered a time when the appropriateness of certain cyber-responses is being tested in practice to see where non-escalating, game-theoretic equilibria can be found, or established by a convention of practice. The international reaction to Stuxnet has been muted.

5. The most striking instance of weapon-destroying weapons is in the classic science fiction film The Day the Earth Stood Still (1951).

6. I think this theory is deeply mistaken on several counts, but it does allow for a diminished liability to be attacked, such as the liability non-combatants may incur if they have supported an unjust war.

7. In a draft document of unknown provenance (possibly from the J-5 of Cyber Command), ‘Cyberspace Operations Lexicon,’ the definitions are modified to offer a disjunction, including a target that is a ‘system,’ which could be a single computer. However, the attack and the attacker are still described as essentially network operations: ‘actions are taken through the use of computer networks.’

8. Technically, AM and FM broadcasts (both involving ‘modulation’), the conversion of sound into electrical current in old-fashioned telephones, and spoken-to-written communication all involve some element of information processing. In past systems, this processing was typically simple (using very short algorithms) and converted between one mode of communication and another, for example from electrical signals to radio signals. Modern information processing systems are much more complex (using many very large algorithms) and can operate entirely within one mode, such as electrical signals.
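To make the ‘very short algorithm’ point concrete, here is a minimal, illustrative sketch (mine, not the article’s) of amplitude modulation: a few lines of arithmetic that map an audio signal onto a radio-frequency carrier. The carrier frequency, sample rate, and modulation depth are arbitrary values chosen only for the example.

```python
# Minimal sketch of amplitude modulation (AM) as a "very short algorithm":
# the carrier's amplitude is made to track the audio signal.
import numpy as np

def am_modulate(audio, carrier_hz=10_000.0, sample_rate=44_100.0, depth=0.5):
    """Return an AM waveform whose envelope follows the normalized audio signal."""
    t = np.arange(len(audio)) / sample_rate
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    envelope = 1.0 + depth * (audio / np.max(np.abs(audio)))
    return envelope * carrier

# Example input: one second of a 440 Hz tone standing in for "sound."
sample_rate = 44_100.0
t = np.arange(int(sample_rate)) / sample_rate
tone = np.sin(2 * np.pi * 440.0 * t)
modulated = am_modulate(tone, sample_rate=sample_rate)
```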

9. This exfiltration need not be destructive of the original data in the targeted information system. One typically copies the information content, and it is this copy that is exfiltrated. This unfortunately illustrates the need for an improved and better understood language and ontology of data versus information content, the flow of information, data as contained in arrangements or patterns of charges and magnetism, and so on.

10. Another useful notion borrowed from the biomedical realm is that of a ‘signature.’ There are genetic signatures in DNA or RNA (such as for an inherited disease); there are similarly ‘signatures’ of various species of malware. In both cases they are sequences of entities (base pairs or bits) or patterns of such sequences. I am part of a project, now in its initial phase, between my institution's National Center for Ontological Research (NCOR) and Pacific Northwest National Laboratory (PNNL) to develop a systematic, formal ontology of the general notion of a signature.
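As a purely illustrative sketch (my own assumption, not a product of the NCOR/PNNL project), a malware ‘signature’ can be treated computationally as either an exact byte sequence or a pattern over such sequences, with detection being a search for that signature inside a larger artifact. The byte values below are placeholders, not real malware signatures.

```python
# Illustrative sketch: a "signature" as an exact byte sequence or a pattern
# of byte sequences, and detection as searching a binary artifact for it.
import re

# Hypothetical signatures; the byte values are placeholders, not real malware.
EXACT_SIGNATURE = bytes.fromhex("deadbeef4a4b")
PATTERN_SIGNATURE = re.compile(rb"\x4d\x5a.{2,8}\x50\x45", re.DOTALL)

def matches_signature(blob: bytes) -> bool:
    """True if the blob contains the exact signature or matches the pattern."""
    return EXACT_SIGNATURE in blob or PATTERN_SIGNATURE.search(blob) is not None

sample = b"\x00" * 16 + EXACT_SIGNATURE + b"\x00" * 16
print(matches_signature(sample))  # True
```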

11. This is from a personal conversation with Neil Rowe.

12. A difficulty, however, is that many of these OTI techniques come in two phases. The first is the installation of malware or other infiltration of flawed data; this raises few ontological problems. The second is the activation of that malware, and its contribution to the dysfunction of the information system reintroduces the ontological problem. Which phase is the attack (perhaps justifying counter-attack)? How much dysfunction, without permanent damage or deaths, constitutes morally actionable harm?

13. In the subsequent comment (Schmitt 2013, B.III. Rule 30, Comment 15: 94) the authors may be going too far in counting even intercepted attacks that cause no damage as attacks. Cyber-powers may be initiating thousands of minor attacks a day on each other, with the expectation that cyber-defenses will block them.

14. ‘Spoofing’ is the falsification of the source IP address recorded in the packets that carry a message.
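As a minimal sketch of why spoofing is possible (my illustration, not the article’s): the source address in an IPv4 header is just a 4-byte field that the sender writes, so anyone able to emit raw packets can put an arbitrary address there. The addresses below are reserved documentation values, and the checksum is left at zero for brevity.

```python
# Illustrative sketch: the IPv4 source address is an ordinary header field
# chosen by the sender, which is what makes "spoofing" possible.
import socket
import struct

def build_ipv4_header(src_ip: str, dst_ip: str, payload_len: int) -> bytes:
    version_ihl = (4 << 4) | 5            # IPv4, header length of 5 32-bit words
    total_length = 20 + payload_len       # header plus payload, in bytes
    return struct.pack(
        "!BBHHHBBH4s4s",
        version_ihl, 0, total_length,     # version/IHL, DSCP/ECN, total length
        0, 0,                             # identification, flags/fragment offset
        64, socket.IPPROTO_UDP, 0,        # TTL, protocol, checksum (omitted here)
        socket.inet_aton(src_ip),         # source address: whatever the sender claims
        socket.inet_aton(dst_ip),         # destination address
    )

# Documentation-range addresses (RFC 5737); the "source" is simply asserted.
spoofed = build_ipv4_header(src_ip="203.0.113.7", dst_ip="198.51.100.9", payload_len=0)
print(spoofed.hex())
```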

15. Many of these ideas were developed while I was a Fellow (2011–12) at the Stockdale Center for Ethical Leadership at the US Naval Academy; indeed, some of them may have originated with other members of its cyberwarfare research group. The views are, however, entirely my own and not those of other members or of the US government. A version of this paper was sketched in presentations at the John C. Reilly Center for Science, Technology, and Values, Notre Dame University, and later at a workshop on cyber-power at Maxwell Air Force Base, both in August 2012. I benefited greatly from the discussion and the astute comments of participants.
