
AI, autonomy, and airpower: the end of pilots?

Pages 337-352 | Received 11 Jul 2019, Accepted 01 Oct 2019, Published online: 12 Oct 2019

ABSTRACT

Military pilots have long been central to airpower projection in both combat and non-combat operations. While the historical and contemporary roles of military aviators have been examined extensively in previous scholarship, the present study distinguishes itself by evaluating the future prospects of military aviators. By so doing, it argues that technological advances in autonomy and artificial intelligence (AI) will most likely lead to the development of pilotless aerial vehicles (PAVs), if current technological and social trends persist. In this new order, the military pilot will become a thing of the past.

Introduction

Pilots[1] are instrumental in the armed forces of numerous nations. It is therefore hardly surprising that a vast body of literature on military aviators exists, assessing pilot selection (Bor et al. Citation2017, pp. 21–78), personalities (Chang et al. Citation2018), physical fitness (Rintala et al. Citation2015), job satisfaction (Ahmadi and Alireza Citation2007) and the impact of flying aces[2], among other topics. In The Problem with Pilots (Citation2018), Timothy P. Schultz evaluates the evolution of the pilot–aircraft relationship from 1903 to 2017 and posits that machines have increasingly assumed tasks previously performed by pilots. In contrast with this retrospective view, the present investigation is primarily concerned with the time ahead, in particular with the implications of technological and social developments for the future of military pilots. It is the first study to analyze this issue in depth.

The specific research question that this article seeks to address is as follows: What is the likely future of military pilots, in light of current technological and social developments? An analysis of a wide variety of sources, ranging from interdisciplinary scholarly work to military documents, indicates that autonomous technology and artificial intelligence (AI) will probably render the military pilot obsolete.[3] Indeed, the paper argues that manned aircraft with onboard pilots and unmanned aerial vehicles (UAVs)[4] will likely give way to aircraft I refer to as pilotless aerial vehicles (PAVs). As a result, pilots will no longer be needed, but other humans may still be in the loop – developing, designing, testing and occasionally even making decisions on behalf of PAVs.

At this point, critics may contend that this research is speculative and that human agency makes it impossible to predict the pilot’s future role. Objections of this kind fail to recognize that while human behavior can be extremely unpredictable, it can also be highly regular. Whilst black swan events (e.g. the impact of the Web) are virtually impossible to predict prospectively, stable historical trends are far easier to envisage (Hofman et al. Citation2017, p. 487). Even in these “easier” cases, it is still important to proceed with caution: social events are usually complex, with multiple factors influencing outcomes, and their effects are difficult to predict with great precision. Yet it is vital to pursue this line of inquiry, as falsifiable predictive studies are needed for the progression of the social sciences (Kaplan Citation1940, Hofman et al. Citation2017). The current study therefore aims to contribute to these efforts by examining the likely effects of cutting-edge technological innovations on the future of pilots.

This issue is covered in more depth in the five remaining sections of this article. The first looks at the challenges associated with finding and retaining pilots as well as their physical and psychological limitations. The second demonstrates how technological solutions have already mitigated or overcome military aviators’ deficiencies and, in so doing, marginalized the pilot’s role. The third argues that if current technological, military, political and institutional trends continue, pilots will most likely become redundant and be replaced by autonomous systems with AI. The fourth continues in the same vein but addresses the issue from an economic, legal and moral perspective. The fifth and final section briefly summarizes the overarching argument and maintains that the end of human pilots is unlikely to occur anytime soon.

Pilot challenges

During the infancy of aviation, the standards for pilot selection lacked scientific rigor. This changed as aviation matured and armed forces sought to select candidates with the “right stuff”, often favoring attributes such as high intelligence, above-average extraversion and conscientiousness, and low neuroticism.[5] In this view, high intelligence enables pilots to complete their training successfully. Extraversion facilitates the pilot’s interpersonal relationships with crew and staff as well as accurate and decisive communication during combat missions. Conscientiousness is important since pilots control extremely costly weapons systems and perform demanding tasks. Finally, pilots need to remain calm and steady in stressful situations during missions, hence the requirement of low neuroticism (Wood et al. Citation2015). In addition to these traits, pilots are also expected to meet specific academic and physiological requirements concerning age, height, weight, fitness, and eyesight (“Royal Air Force,” Citation2019; “U.S. Air Force,” Citation2019a). Finding candidates who fulfil all these criteria is no easy task.

The challenge is not merely to find capable pilots but also to retain them once they have joined. Issues such as limited flying time, an increased emphasis on additional duties, and difficulties with maintaining work–life balance have prompted some pilots to reconsider their aviation careers and leave for higher-paying professions with better prospects. The draw of commercial airlines hiring military pilots seems to have exacerbated the problem. One study finds that US Air Force pilot shortages from 1950 onwards typically coincide with aggressive hiring by airlines (Axe Citation2018). According to Lt. Gen. Gina M. Grosso, the airlines hired 4,100 pilots in 2016 alone and offer salaries that are on average 17 percent higher than military pay (Parrish Citation2017).

The careers of military pilots can also end prematurely due to injuries, mental health issues, and death, among other causes. During the Six-Day War in 1967, for instance, about 100 of Egypt’s 350 qualified air-combat pilots were killed within three hours by Israel’s pre-emptive strikes (Henriksen Citation2018, p. 85). Even in the absence of such extreme outcomes, the full-length careers of military pilots tend to be relatively short. In the US Air Force, pilots are eligible to retire after 20 years, irrespective of their age (“U.S. Air Force,” Citation2019b). In 2018, the world’s oldest active fighter pilot, Phillip Frawley of the Royal Australian Air Force, retired at the age of 66 (Yeung Citation2018).

Consequently, pilot shortage is a major concern for countries around the world. In the US, the Air Force, the Navy, and the Marine Corps reported that the gaps between the actual number of fighter pilots and authorizations (i.e. funded positions) in fiscal year 2017 were 27 percent, 24 percent and 26 percent, respectively (United States Government Accountability Office Citation2018). In November 2017, Air Force Secretary Heather Wilson disclosed that the US Air Force had a shortage of some 2,000 pilots (Daniels Citation2017). The Heritage Foundation’s 2018 Index of U.S. Military Strength rated the Air Force as weak, partly due to its current and forecast deficit of fighter pilots (Venable Citation2017a). A 2018 National Audit Office report stated that the British Armed Forces were 800 pilots short, or 23 percent below requirement (Morse Citation2018, p. 19). Even in India, the second most populous country in the world, a 2015 parliamentary report revealed a fighter pilot shortage, with a cockpit-pilot ratio of 1:1.08 rather than the mandated 1:1.25 (Kainikara Citation2018).

The training of pilots poses a further difficulty, as it requires significant investments of time and money. According to Air Force officials, “it costs between $3-$11 million and takes approximately 5 years to develop an individual fighter pilot to lead combat missions” (United States Government Accountability Office Citation2018). In order to keep costs manageable, pilot flying time may be reduced. In the US, most pilots reportedly fly fewer than 150 hours per year, often in very benign conditions. This is insufficient to acquire and develop the skills needed to attain victory in a high-threat combat environment (Venable Citation2017b, pp. 5, 10). During the Second Chechen War in 1999, the average Russian pilot’s flying time was around 23–25 hours, whereas the average for Soviet pilots during the Cold War was about 150 hours (de Haas Citation2003).[6] The lack of training contributed to the relatively poor performance of the Russian Air Force during the Second Chechen War (Sutyagin Citation2018, pp. 319–320). Moreover, the training itself may be of poor standard, leading to deficits in pilot skills. Despite the implementation of a number of improvements, Chinese pilot training allegedly remains insufficient. According to a RAND report, Chinese pilots fall short in flight-lead, tactical and coordination skills and, as a result, have difficulties operating autonomously (Morris and Heginbotham Citation2016, pp. 26–27).

Even with ideal training, pilots with the “right stuff” suffer from all sorts of human cognitive and physiological limitations. All pilots are eventually overcome by hunger and fatigue, making it impossible for them to operate around the clock. Human senses are also inadequate for maintaining spatial orientation when flying in clouds, fog, or darkness. Pilots therefore become disoriented without instruments, irrespective of their skills and experience. Altitude poses another problem for the human body. At approximately 20,000 feet (6,096 meters), people begin to suffer from a severely deficient supply of oxygen, while above 30,000 feet (9,144 meters) decompression sickness emerges. Flying duties can only be performed efficiently for a few seconds at 50,000 feet (15,240 meters), and at about 63,000 feet (19,202 meters) blood starts to boil at normal body temperature (Schultz Citation2018, p. 1). There are also limits to the G-forces the human body can endure: excessive G-force may lead to loss of consciousness and even death (Venosa Citation2016). Finally, humans are susceptible to errors that may, in the worst case, prove fatal. According to the 2015 Nall report, pilot-related errors accounted for roughly 74 percent of both total and fatal accidents in non-commercial fixed-wing aircraft (Geske Citation2015, p. 12). As a result of these issues, the pilot’s role has become increasingly marginalized and replaced by more efficient technological solutions.

The decline of pilots

Initially, aircraft were technically simple by today’s standards, with the pilot as their master, controlling them with his or her cognitive and physical faculties. With time, aircraft became more technologically advanced and increasingly took over tasks previously conducted by pilots. In 1912, the first autopilots emerged, enabling aircraft to maintain altitude and direction without any intervention from pilots. During the 1920s, gyroscopically based instruments were introduced that were far superior to the instincts of pilots in maintaining spatial orientation under low-visibility conditions (Schultz Citation2018, p. 6). With the advent of the Norden Bombsight during the Second World War, the computerized autopilot could autonomously fly the plane to the prime location based on automatically measured conditions such as wind speed, and release bombs over the target. The pilot only needed to activate the autopilot to perform these tasks (Allen and Chan Citation2017, p. 13).

In modern aircraft, the technology has become even more refined and has further marginalized the pilot’s role. Contemporary autopilots can take off, climb, cruise, descend, and land the aircraft without pilot involvement. Some aircraft can autonomously fly over rugged terrain at high speed in complete darkness. There are even automation systems that can take over control of the aircraft from errant pilots to avoid collision (Schultz Citation2018, p. 6). The Automatic Ground Collision Avoidance System (AGCAS) takes control of the aircraft when the pilot is incapacitated in order to avoid a crash; AGCAS allegedly saved a U.S. F-16 in Syria from this fate (Scharre Citation2018a). Obviously, automation systems are prone to error as well. A 2013 Federal Aviation Administration study of civil aviation found unexpected or unexplained behavior of automated systems in 46 percent of the accident reports examined (Nakamura et al. Citation2013). That said, the spread of automation is one of the main reasons the civil aviation accident rate has fallen from approximately four accidents per million flights in 1977 to less than 0.4 in 2018 (Wise Citation2019). Furthermore, a direct comparison reveals that in the early days of flight, approximately 80 percent of commercial airline accidents were caused by mechanical error and the remaining 20 percent by human error; by 2003, these numbers had reversed (Rankin Citation2007, p. 16). In short, although machines are not immune to error, they have become far more reliable than humans.

Generally, modern autopilots fly with greater precision than the best pilots. Moreover, modern electronic equipment can detect enemy aircraft long before the pilot’s naked eye and acts as a visual cue. In aerial warfare, modern software and satellites can ideally guide bombs to within inches of their target. The world’s most expensive fighter, the F-35, reportedly has an active electronically scanned array radar that automatically assesses targets at all ranges without pilot input. Its cameras, sensors, and radar provide data that is often highly processed and prioritized before reaching the pilot’s senses. Under some conditions, the F-35 (like most other modern combat aircraft) can take action before the pilot even manages to react (Schultz Citation2018, pp. 1, 169, 176). In fact, the fighter’s software is said to prevent the pilot from inadvertently putting the plane into unrecoverable spins and other aerodynamically unstable conditions (Scharre Citation2018a).

With the advent of UAVs, or drones, pilots have even been removed from the cockpit. In the US, the demand for UAV pilots reportedly rose by 76 percent between 2013 and 2018, from 1,366 to 2,404 (Vandiver Citation2019). Similar developments are to be expected in other nations, as nine countries have already employed armed UAVs in combat and at least 20 countries are currently developing lethal drone programs (Zegart Citation2018, pp. 1–2). There are numerous advantages to removing pilots from the aircraft. Firstly, human limitations in coping with excessive G-force and altitude are no longer a factor. Secondly, and relatedly, without onboard pilots, drones can usually stay airborne longer than manned aircraft. For instance, the manned surveillance airplane U-2 can remain airborne for up to 12 hours and presents a formidable challenge to the pilot, who reportedly feels “completely wiped” after the flight (Fisher Citation2010), while its unmanned counterpart, the RQ-4 Global Hawk Block 40, has successfully completed a 34.3-hour flight (“RQ-4 Global Hawk,” Citation2014).

Thirdly, pilots are less vulnerable with UAVs, as drones can be operated from a remote location. For instance, between 2002 and mid-2015, the United States conducted approximately 568 drone strikes without a single pilot casualty (Zegart Citation2018, p. 14). During the same period, at least five American onboard pilots reportedly lost their lives in Operation Enduring Freedom alone (“Army Chief Warrant Officer 3 William T. Flanigan,” Citation2006; “Enduring Freedom Casualties – Special Reports,” Citation2008; “Michael Slebodnik, Chief Warrant Officer, United States Army,” Citation2008, Faraj Citation2005, O’Brien Citation2013). Such losses are not only costly but also erode public support for war. Indeed, studies demonstrate a strong and direct correlation between rising casualties and declining support in US public opinion (Gartner Citation2008).

Fourthly, without onboard pilots, equipment such as the cockpit, armor, ejection seat and flight controls is no longer required. As a result, UAVs can be built lighter and smaller. The space taken up by pilots in manned aircraft can instead be used for technical equipment that can perform more autonomous tasks, further marginalizing the pilot’s function.[7] This is evident in the Air Force Global Hawk and Army Gray Eagle drones, which fly autonomously once pilots have set their designated location. In fact, the Army does not even refer to these human controllers as pilots but as “operators”. Yet these human operators are still essential, as these UAVs are only capable of acting autonomously in the simplest of missions (Scharre Citation2018a).

The unmanned carrier-launched airborne surveillance and strike aircraft, the X-47B, has demonstrated more extensive autonomous capability. In July 2013, the X-47B performed one of the most difficult tasks pilots can be confronted with: landing on an aircraft carrier. The X-47B can do so day or night and touch down successfully every single time (Schultz Citation2018, p. 155). Pilots, by contrast, require extensive training before they are able to land on a moving carrier. They must touch down with such precision and timing that the tail hook catches the arresting wire and brings them to a stop before the short runway runs out. Pilots have died attempting this difficult landing (Skaine Citation1999, pp. 39–42). In addition to its landing ability, the X-47B conducted the first fully autonomous aerial refueling in 2015, a capability that spares the pilot the dangers of getting close to another aircraft and performing tricky maneuvers to refuel (Piesing Citation2016).

The military development and testing of swarming drones has also begun, whereby numerous small, inexpensive, cooperative unmanned aircraft coordinate their actions and fight as a coherent unit to overwhelm the enemy. Swarms can perform offensive, defensive and supportive functions, such as intelligence, surveillance, and reconnaissance missions, and enable the military to field forces that are cheaper, larger, faster and better coordinated. In October 2016, the US Defense Department launched a swarm of 103 Perdix drones flying in formation and demonstrating autonomous collective decision-making. Scholars speculate on the possibility of far larger swarms in the future. For the purposes of this article, it is important to note that, in contrast to individually operated UAVs, a single pilot can control an entire swarm, since the drones maintain their formation automatically (Lachow Citation2017, Scharre Citation2018b). If so, the number of pilots needed will decrease.

As this investigation illustrates, pilots have gone from being the masters of their aircraft to being their system managers. The tasks that machines have taken over in planes have only increased over time owing to their superior performance. As a result, contemporary military pilots spend considerable time controlling automatic flight control systems and act as safety observers overseeing sensors and other mechanical components rather than exerting direct control (Schultz Citation2018, p. 172). More specifically, they have to manage internal and external subsystems to optimize aircraft functions; “remain alert to environmental factors such as weather, obstacles, and enemy threats; and communicate effectively with other aircrew members and supporting agents, including air traffic control, ground-based commanders, and command and control aircraft” (Schultz Citation2018, p. 165). Pilots remain, after all, more versatile than machines. Currently, both pilots and machines are therefore required, each compensating for the other’s weaknesses and optimizing the performance of the system as a whole. However, since technological development is far faster than human evolution, it is highly probable that machines will eventually outperform pilots to such an extent that the occupation becomes redundant.

The end of pilots? Technological, military, political and institutional issues

To replace pilots, PAVs must evidently possess sufficient autonomy and intelligence. The real technological challenge in developing PAVs lies in making them adequately intelligent. Lethal autonomous weapons systems (LAWS) such as the Tomahawk Anti-Ship Missile (TASM) were already available in the 1980s. Yet the fully operational TASM was never launched, since it was incapable of accurately distinguishing between enemy ships and merchant vessels. In other words, TASM possessed sufficient autonomy but insufficient AI to be used in practice (Scharre Citation2018a). Sceptics may therefore point out that AI technology is still in its infancy and that the mathematical possibilities are too complex for AI to understand context, choose between conflicting goals, handle new situations and interpret meaning (Ayoub and Payne Citation2016, p. 816, Payne Citation2018, p. 10).

These objections fail to recognize the advantages and potential of AI. Algorithms are, after all, already “demonstrating an increasing capacity to learn without supervision, with limited data for training, and to cope with ambiguous and asymmetric information” (Payne Citation2018, p. 8). With hybrid AI, systems can learn in a humanlike way by combining two rival AI approaches – neural pattern recognition and symbolism. In doing so, the key limitations of each approach are overcome: neural pattern recognition allows the system to “see” while symbolism enables it to “reason” (Mao et al. Citation2019). Through networked computer agents, an AI system could also learn by studying its own activities or those of other agents in its network, and make inferences on the basis of far greater (and more confusing) data (Arkin Citation2010, p. 333, Levine et al. Citation2016, Payne Citation2018, p. 9). Moreover, AI coupled with natural language processing can already interpret some types of meaning in enormous quantities of text far faster, and with greater accuracy, than a human being (Kruger Citation2019). With advances in robotics, sensors and increasingly powerful hardware, enhanced AI performance in all of these areas should therefore be possible, with operations conducted at staggering speeds that surpass those of the pilot.
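The division of labor in the hybrid approach can be sketched schematically. In the toy Python sketch below, all names, labels and confidence values are hypothetical stand-ins (not drawn from Mao et al.): a perception stage, here faked with a lookup table in place of a trained neural classifier, maps raw sensor input to symbols, and a symbolic stage then applies explicit, inspectable rules to those symbols.

```python
# Schematic illustration of hybrid AI: a pattern-recognition stage "sees",
# a symbolic rule stage "reasons". All values below are hypothetical.

def perceive(sensor_blob: str) -> dict:
    # Stand-in for a neural classifier: raw input -> symbol + confidence.
    fake_model = {
        "radar_return_A": {"label": "fighter", "confidence": 0.92},
        "radar_return_B": {"label": "airliner", "confidence": 0.97},
    }
    return fake_model[sensor_blob]

def decide(percept: dict) -> str:
    # Symbolic stage: explicit rules a human can read and audit.
    if percept["confidence"] < 0.9:
        return "hold: uncertain classification"
    if percept["label"] == "fighter":
        return "track as potential threat"
    return "ignore: civilian traffic"

print(decide(perceive("radar_return_A")))  # track as potential threat
print(decide(perceive("radar_return_B")))  # ignore: civilian traffic
```

The point of the division is that the opaque statistical stage is confined to perception, while the decision logic remains explicit and auditable.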

Remarkably, an AI system dubbed ALPHA has repeatedly defeated a human pilot – the retired US Air Force Colonel Gene Lee – in multiple flight simulator trials. ALPHA managed to shoot down Lee in every protracted simulated engagement; Lee did not score a single kill. ALPHA’s main advantage over human pilots lies in its ability to collect and process the enormous amounts of data from the aircraft’s sensors and make extremely rapid decisions on how best to respond in a given situation. Humans, whose average visual reaction time is between 0.15 and 0.30 seconds and who need even longer to formulate optimal plans, simply cannot emulate ALPHA’s performance. Once trained, ALPHA can beat human pilots even while running on cheap computers or smartphones (Ernest et al. Citation2016).

Considering that technological development is far faster than human evolution, it is highly probable that PAVs will eventually outperform human-piloted aircraft in actual combat. In airpower theorist John Boyd’s OODA loop, the goal is to complete four steps – (1) observe, (2) orient, (3) decide and (4) act – faster than the adversary, creating confusion and disorder in his or her mind and thereby attaining victory (Osinga Citation2007). If success is achieved by completing the OODA loop faster than the antagonist, automation and AI will most likely eventually defeat any human pilot in combat. Through iterative learning, autonomous AI could develop skills and capacities that exceed what is humanly possible. It could learn to fly high-G maneuvers human pilots can only dream of, with much shorter reaction times and superior decision-making capacities (Altmann and Sauer Citation2017, p. 123). With advances in computer speed and AI, the gap between the speed and accuracy with which machines and pilots can complete the OODA loop in non-complex environments will likely only widen.
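The tempo logic of Boyd’s argument can be illustrated with a toy calculation. The cycle times below are purely hypothetical (the human figure loosely reflects the 0.15–0.30 second visual reaction time cited above plus deliberation time); the point is only that a fixed per-cycle latency advantage compounds into a large decision-tempo advantage over an engagement window.

```python
# Toy OODA-loop tempo comparison. All timings are hypothetical
# illustrations, not measured values from the literature.

def completed_ooda_cycles(cycle_time_ms: int, engagement_ms: int) -> int:
    """Full observe-orient-decide-act cycles finished in the window."""
    return engagement_ms // cycle_time_ms

HUMAN_CYCLE_MS = 1000    # assumed: reaction plus deliberation per cycle
MACHINE_CYCLE_MS = 50    # assumed: sensor-to-actuation latency
ENGAGEMENT_MS = 30_000   # a 30-second engagement

human = completed_ooda_cycles(HUMAN_CYCLE_MS, ENGAGEMENT_MS)      # 30 cycles
machine = completed_ooda_cycles(MACHINE_CYCLE_MS, ENGAGEMENT_MS)  # 600 cycles
print(f"human: {human} cycles, machine: {machine} cycles")
```

Under these assumed timings, the machine completes twenty decision cycles for every human one: in Boyd’s terms, it acts on fresher observations at every step of the engagement.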

Militarily, there is little point in retaining pilots if PAVs are superior. Those who continue to rely on pilots may, after all, risk defeat due to their relatively inferior fighting capacity, an outcome countries seek to avoid. When the Prussian Armed Forces realized how to exploit new technologies such as railroads, rifles, and the telegraph to project power rapidly in the mid-19th century, other nations followed suit (Showalter Citation1975, Herrera and Mahnken Citation2003). There is no evidence to suggest that the situation will be different with autonomy and AI. The Russian President Vladimir Putin has already indicated as much, publicly stating that the leader in AI will “become the ruler of the world” (Gigova Citation2017). Nations facing demographic and security challenges will find such a solution appealing, as it would enable them to project airpower without pilot dependency. The United States Department of Defense (DoD) has even stated that contemporary “automation is too dependent on human beings” and that it “must continue to pursue technologies and policies that introduce a higher degree of autonomy to reduce the manpower burden” (“Unmanned systems integrated roadmap FY2011-2036,” Citation2011, p. vi). The DoD also explicitly acknowledges that research and development are heading towards systems with greater autonomy, capable of making decisions and reacting without human input (“Unmanned systems integrated roadmap FY2013-2038,” Citation2013, p. 68).

At this point, it could be argued that since pilots typically have a high standing in the armed forces, they would be in a position to halt the development of PAVs, which undermine and threaten their occupation. According to Horowitz (Citation2018, p. 48), the US Armed Forces failed, for instance, to fund the autonomous X-47B drone mentioned earlier due to bureaucratic resistance. Whilst it is true that pilots are powerful actors, their role has diminished over the years without them being able to stop this process. Although some projects, such as the X-47B, have been abandoned under such pressures, PAVs will most likely be developed as long as the overriding political and institutional objective – to command the air as effectively as possible whilst minimizing costs (political, financial and in terms of soldiers’ lives lost) – remains paramount. There is little pilots can do to hinder this historical trend.

The testimony of the US Air Force chief scientist, Greg Zacharias, to Congress bears witness to the bleak future pilots likely face. Zacharias (Citation2015) noted that the goal is to integrate intelligent machines with humans and thereby maximize “mission performance in complex and contested environments.” While Zacharias envisions a central role for humans in this scenario, he, interestingly, never mentions pilots specifically. Similarly, when media outlets reported that the RAF’s new AI drone possessed the capacity to attack targets autonomously, a spokesperson from the Ministry of Defence simply remarked that the employment of weapons would remain under human control, with no reference to pilots (Allison Citation2016).

This comes as no surprise given that existing aircraft require vast numbers of costly pilots in need of extensive training. This is perhaps partly why the Israelis developed the operational Harpy kamikaze drone in the 1990s. The Harpy carries explosives in its nose, attacks radar systems by self-destructing into them, and has been purchased by Chile, China, India, South Korea, and Turkey (Gertz Citation2002). Humans are only required to launch the Harpy, as it can search for, detect and engage targets fully autonomously, without pilot involvement.

The end of pilots? Economic, legal and ethical issues

In terms of financial costs, Cummings (Citation2017, p. 10) maintains that only a small proportion of defense research and development money is spent on autonomous systems, whereas large sums are spent on traditional systems. The world also experienced two major “AI winters”, in the 1970s and 1980s, when funding for and interest in this technology declined (Dickson Citation2018).[8] On the basis of these facts, it could be argued that the willingness to fund the expensive development of PAVs is lacking. Although it is true that military spending on autonomy and AI is relatively small, such an interpretation would be wrongheaded, as these technologies are growing in importance. In 2018, autonomy and robotics were among the Pentagon’s top priorities in the national defense strategy. The following year, the DoD requested a 28 percent increase for the development of these technologies, with planned spending of almost $7 billion in 2019 on unmanned aircraft, a vital stepping stone towards the development of PAVs (Harper Citation2018). According to Siemens, global military spending on robotics that can replace and replicate human actions increased from $5.1 billion in 2010 to $7.5 billion in 2015 (Getting to grips with military robotics Citation2018). Furthermore, the world is arguably experiencing an AI arms race, with major players such as China and the United States increasing spending on military AI (Allen Citation2019). It is possible that another AI winter may come about; however, that will almost certainly not be enough to stop the advancement. As researchers have rightly pointed out, major progress occurred even during the AI winters (Kurzweil Citation2005, pp. 263–264).

Critics may also call attention to the point that PAVs are not covered by existing international law and that this may impede their development. This objection fails to recognize that only an extremely limited number of weapons, such as chemical weapons, are governed by specific international treaty rules. Other weapons, even if not yet developed, are subject to the Law of Armed Conflict (LOAC) (Anderson and Waxman Citation2017). Provided that PAVs satisfy the four principles of LOAC – military necessity, distinction, proportionality and the prohibition of unnecessary suffering – they are thus essentially legal.[9] There is no inherent reason why PAVs would fail to satisfy these demands. If programmed correctly to comply with international law, PAVs could act more lawfully than piloted aircraft.[10] This development may, however, take some time, since algorithms are “only optimal for well-understood or modeled situations” (Gilli and Gilli Citation2016, p. 79).

Another legal issue associated with PAVs concerns accountability and responsibility: who is to be held accountable if something goes wrong? Although there is no consensus on this issue among legal experts, the Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS), chaired by India in 2017, established that the “responsibility for the deployment of any weapons system in armed conflict remains with states” and that states “must ensure accountability for lethal action” (Kane Citation2018, p. 6). As illustrated, numerous states have been willing to invest in autonomous military technology, and questions of legal responsibility are thus highly unlikely to impede efforts to develop PAVs.

As a counter-argument, one could insist that the campaign to stop LAWS or “killer robots” may lead to a legal ban on PAVs. Global polls indicate, after all, that a majority of the public opposes the employment of LAWS, as do 28 countries and 100 non-governmental organizations (“Killer Robots,” Citation2019). The movement against killer robots has had some success as well. When it was revealed that Google was involved in the US Department of Defense-funded Project Maven, which sought to autonomously process video footage shot by surveillance drones, over 4,000 Google employees protested. As a result, the company announced its decision to stop developing AI for weapons use (“Rise of the tech workers,” Citation2019). Yet, this is only a minor victory, as Russia, Israel, South Korea, and the United States have all indicated that they do not support negotiations for a new treaty banning killer robots (Human Rights Watch Citation2019).

Even if the development or employment of PAVs were made illegal, it is unclear whether such a ban would be respected in a world that lacks an efficient global enforcement mechanism with the power to compel actors to abide by the law. This is especially true when breaking the law may prove extremely advantageous. For example, an international treaty from 1899 banned the use of weaponized aircraft for fear of aerial bombing (Allen and Chan Citation2017, p. 3). As history has shown, this voluntary treaty was disregarded in both world wars.

The final barrier against the development of PAVs is ethical. It could be argued that pilots are needed as human moral agents, since the internal logic of AI is not well understood and PAVs would have to make contextual decisions in dynamic, non-deterministic environments (Gilli and Gilli Citation2016, pp. 77–79, Payne Citation2018, p. 11) in order to act on informed judgment and minimize collateral damage. It is currently impossible to verify whether PAVs will be able to match or surpass piloted aircraft in satisfying the ethical imperative of minimizing collateral damage, as this is an empirical question for the future. There are, however, reasons to believe that they will be able to match, or even exceed, piloted aircraft in some situations.

Presently, AI can identify objects such as aircraft with onboard multi-spectral imaging (Hammes Citation2016). AI has already demonstrated its ability to recognize and categorize some images better than human beings in various tests. However, weaknesses were also exposed when more context, backstory, or proportional relationships were necessary (Tanz Citation2017). This barrier may be overcome, as researchers have, for instance, developed a new AI computer vision system that mimics how humans visualize and identify objects (Chen et al. Citation2019). In the future, PAVs equipped with a superior sensory apparatus, connected to the Global Information Grid, and devoid of psychological dispositions such as the desire for revenge may make better battlefield observations, process extensive information faster and more accurately than pilots, and avoid immoral actions driven by psychological influences (see Arkin Citation2010, pp. 333–334).

These abilities could, in turn, help PAVs distinguish between civilians and combatants more effectively, and keep collateral damage lower than the average pilot would. Accomplishing this task is of course easier if the target is mechanized and combat is confined to an environment with few civilians rather than an urban setting. There are target identification systems that are already capable of reliably distinguishing between military objects (such as tanks and mechanized artillery, which often have distinctive silhouettes, radar, and infrared signatures) and civilian objects (such as regular cars, trucks and merchant ships) (Sparrow Citation2016, pp. 102–103). According to the UK Ministry of Defence (MoD), UAVs will be able to “independently locate and attack mobile targets, with appropriate proportionality and discrimination” by 2030 (Doward Citation2018).

Within crowded, complex environments, on the other hand, PAVs will have a much more difficult time differentiating between legitimate and illegitimate targets. For instance, it would be extremely difficult to determine whether a person carrying arms in a visually cluttered environment is an enemy or an ally. As the UK MoD (Citation2018, p. 54) stated in a publication: “the last roles likely to be automated will be where personnel conduct activities that demand contextual assessment and agile versatility in complex, cluttered and congested operating areas”. Perhaps nothing short of Artificial General Intelligence (AGI), with the capacity to understand or learn any intellectual task that a human being can, would be suitable for such demanding tasks. Yet, progress in developing AGI has been slow, and some scholars even argue against the possibility of AGI (Melnyk Citation1996). However, the argument put forward in this article does not rely upon the development of AGI, as it merely suggests that AI and autonomy will likely make military pilots redundant in the future. If PAVs are unable to make ethical targeting decisions under demanding conditions, they could be designed to await human instruction when they encounter such difficulties. This is similar to a human pilot receiving an order from the ground on what course of action to pursue. Under such conditions, the PAV will act semi-autonomously and receive human advice on how to target ethically. Hence, humans may remain in the loop when PAVs perform missions, but they will not be pilots. The UK MoD (Citation2018) likewise anticipates elaborate human–machine teaming in the future.

Conclusion

This article has analyzed the evolving role of pilots in the armed forces. In doing so, it has highlighted a number of issues in recruiting pilots with the “right stuff”, the costly and lengthy nature of their training, and the difficulties of retaining them once they have joined the services. Such factors have contributed to pilot shortages in numerous countries. In addition, pilots have physiological and psychological limitations, are susceptible to error, and are eventually overcome by fatigue and hunger. Tasks previously performed by pilots have therefore increasingly been assumed by machines with a proven superior capacity in these areas. As a result, pilots have been reduced to the system managers of aircraft.

Developments in AI, autonomy and other technology indicate that this process will only intensify over time. Indeed, an AI system named ALPHA has already defeated an experienced military pilot in flight simulator trials, and the X-47B drone has demonstrated an ability to complete aircraft carrier landings and aerial refueling autonomously, tasks that most pilots find challenging. Since technological progress occurs at a far quicker rate than human evolution, PAVs that render the pilot obsolete will likely be developed in the future if the overriding objective of commanding the air as effectively as possible whilst minimizing costs persists. The development of PAVs is expected to be fraught with technological challenges and bureaucratic resistance, and to be met with legal and ethical objections. Yet, none of these issues poses an insurmountable barrier, as this study has indicated. In light of current trends, only black swan events could reasonably halt the emergence of PAVs. The central challenge for PAVs will probably be operations in complex and crowded environments, where human, but not pilot, involvement may be required in the absence of AGI. It will also likely take considerable time before even semi-autonomous PAVs dominate the skies. The day that happens will mark the end of pilots.

Acknowledgments

I thank Stefan Borg, Rickard Lindborg, Mike Palmer, Dan Öberg and the anonymous reviewers for their helpful suggestions.

Disclosure statement

No potential conflict of interest was reported by the author.

Additional information

Notes on contributors

Arash Heydarian Pashakhanlou

Arash Heydarian Pashakhanlou is an Assistant Professor in War Studies at the Swedish Defence University. His work has appeared in the journals International Relations, International Politics, Journal of International Political Theory and The Washington Quarterly, among others. Palgrave published his latest monograph Realism and Fear in International Relations: Morgenthau, Waltz and Mearsheimer Reconsidered.

Notes

1. Pilot refers to those who control the flight of an aircraft, either from the cockpit or a remote location. They must have undergone training and received the appropriate pilot certifications to qualify as such. In this study, the main focus is on military pilots and especially fighter pilots.

2. Flying ace is a military pilot that has shot down several opponents, usually five or more in aerial combat (Robertson Citation2003).

3. In this context, autonomy refers to the degree to which aircraft can search, locate and engage targets on their own. When an aircraft can complete a task without human assistance, it is ‘fully autonomous’ in this respect. Should it require human aid, it is ‘semi-autonomous’. AI refers to the intelligence the aircraft possesses. Intelligence “reflects a broader and deeper capability for comprehending … surroundings – ‘catching on,’ ‘making sense’ of things, or ‘figuring out’ what to do” (Gottfredson Citation1997, p. 13).

4. In this article, the terms UAVs and drones are used interchangeably and refer to aerial vehicles that do not have an onboard human pilot. As such, this definition includes so-called suicide or kamikaze drones with built-in warheads that attack targets by self-destructing into them.

5. Extraversion is typically manifested in outgoing, assertive and energetic behavior. Conscientiousness is associated with being hard-working, reliable, efficient, organized, self-disciplined and striving for achievement. Individuals with low levels of neuroticism are more emotionally stable, calm and even-tempered (Barrick and Mount Citation1991).

6. It should be noted that pilot flying time in Russia is measured differently than in the United States. The Russians only consider the time that the aircraft is airborne whereas the US Air Force start calculating from the moment the aircraft moves on the ground by its own power to the moment it comes to a complete halt upon landing. Moreover, the average flight hours for Russian pilots had reportedly increased to about 120–125 hours in 2016 (Sutyagin Citation2018, pp. 320–321).

7. There are nonetheless a number of weaknesses with contemporary drones. Current UAVs such as the MQ-1 Predator and MQ-9 Reaper lack defensive capabilities, have limited maneuverability and fly slowly. They are deemed useless in contested environments and susceptible to jamming and hacking (Kaag and Kreps Citation2014, Zegart Citation2018, p. 5).

8. There are different accounts as to how many AI winters there have been, when they took place and why they occurred. The causes of AI winters have been attributed to the overhyping of AI research that did not live up to promises and expectations, lack of practical applications for AI research as well as institutional and economic factors, etc. (Boobier Citation2018, p. 42).

9. Military necessity prohibits acts that are not essential from a military point of view. Distinction suggests that military operations may only be directed against combatants and specific military objectives. Proportionality specifies that attacks harming civilians or civilian objects may not exceed the concrete military advantage anticipated by such attacks. Unnecessary suffering seeks to reduce and alleviate human suffering in war (Bourbonnière Citation2004).

10. AI has already been used in the administration of justice. Some states in the US employ AI systems that recommend criminal sentences. The Estonian Ministry of Justice is currently planning to launch an AI system to settle small claims disputes of less than €7,000 (about $8,000) (Niiler Citation2019).

References