Original Articles

Predicting Peril or the Peril of Prediction? Assessing the Risk of CBRN Terrorism

Pages 501-520 | Published online: 09 Aug 2011

Abstract

Since the mid-1990s, the academic and policy communities have debated the risk posed by terrorist use of chemical, biological, radiological, or nuclear (CBRN) weapons. Three major schools of thought have emerged in the debate: the optimists, the pessimists, and the pragmatists. Although these three schools draw on the same limited universe of data on CBRN terrorism, they arrive at strikingly different conclusions. Given the highly subjective process of CBRN terrorism risk assessment, this article analyzes the influence of mental shortcuts (called heuristics) and the systematic errors they create (called biases) on the risk assessment process. The article identifies and provides illustrative examples of heuristics and biases that lead to the underestimation of risks, those that lead to the overestimation of risks, and, most importantly, those that degrade the quality of the debate about the level of risk. While these types of biases are commonly seen as affecting the public's perception of risk, they can also be found in risk assessments by experts. The article concludes with recommendations for improving the CBRN risk assessment process.

Notes

Commission on the Prevention of Weapons of Mass Destruction Proliferation and Terrorism, World at Risk (New York: Vintage, 2008), xv.

Richard G. Lugar, The Lugar Survey on Proliferation Threats and Responses (Washington, DC: Richard G. Lugar, 2005).

National Research Council, A Survey of Attitudes and Actions on Dual Use Research in the Life Sciences (Washington, DC: National Academies Press, 2009), 74.

Commission on the Prevention of Weapons of Mass Destruction Proliferation and Terrorism, World at Risk, xv.

Amy Smithson, The Biological Weapons Threat and Nonproliferation Options: A Survey of Senior U.S. Decision Makers and Policy Shapers (Washington, DC: Center for Strategic and International Studies, 2006), 12.

David E. Kaplan, “Aum Shinrikyo (1995),” in Jonathan B. Tucker, ed., Toxic Terror: Assessing Terrorist Use of Chemical and Biological Weapons (Cambridge: MIT Press, 2000), 207–226.

Jessica E. Stern, “Larry Wayne Harris (1998),” in Tucker, Toxic Terror (see note 6 above), 227–246.

Thomas A. Birkland, After Disaster: Agenda Setting, Public Policy, and Focusing Events (Washington, DC: Georgetown University Press, 2007), 23–26, 131–143.

Henry H. Willis, Andrew R. Morral, Terrence K. Kelly, and Jamison Jo Medby, Estimating Terrorist Risk (Santa Monica, CA: RAND, 2005), 6–11.

For useful reviews of the literature on this topic, see Chris Dishman, “Understanding Perspectives on WMD and Why They Are Important,” Studies in Conflict & Terrorism 24, no. 4 (2001): 303–313; Adam Dolnik, “13 Years since Tokyo: Re-Visiting the ‘Superterrorism’ Debate,” Perspectives on Terrorism 2, no. 2 (2008): 3–11; and Todd Masse, “Nuclear Terrorism Redux: Conventionalists, Skeptics, and the Margin of Safety,” Orbis 54, no. 2 (2010): 302–319.

These schools of thought are ideal types that do not correspond strictly to the writings or policy preferences of any single scholar or policy-maker. Indeed, the same academic or official might make arguments consistent with different schools of thought at different times.

Examples of optimists include Brian Michael Jenkins, Ehud Sprinzak, Milton Leitenberg, John Mueller, and Robin Frost.

Examples of pessimists include Richard Falkenrath, Ashton Carter, Richard Danzig, Tara O'Toole, and Graham Allison.

Examples of pragmatists include Jessica Stern, John Parachini, Jonathan Tucker, Jean Pascal Zanders, the Gilmore Commission, and Bruce Hoffman.

Lynn Eden, Whole World on Fire: Organizations, Knowledge, and Nuclear Weapons Devastation (Ithaca, NY: Cornell University Press, 2004), 37–60.

Baruch Fischhoff, “For Those Condemned to Study the Past: Heuristics and Biases in Hindsight,” in Daniel Kahneman, Paul Slovic, and Amos Tversky, eds., Judgment Under Uncertainty: Heuristics and Biases (Cambridge: Cambridge University Press, 1982), 335–351.

National Commission on Terrorist Attacks Upon the United States, The 9/11 Commission Report (New York: W.W. Norton, 2004), 355–356.

Roberta Wohlstetter, Pearl Harbor: Warning and Decision (Stanford, CA: Stanford University Press, 1962), 387.

Richard Betts, “Analysis, War and Decision: Why Intelligence Failures are Inevitable,” World Politics 31, no. 1 (1978): 61–89.

Amos Tversky and Daniel Kahneman, “Availability: A Heuristic For Judging Frequency and Probability,” in Kahneman, Slovic and Tversky, Judgment Under Uncertainty (see note 16 above), 163–178.

Steven J. Sherman, Robert B. Cialdini, Donna F. Schwartzman, and Kim D. Reynolds, “Imagining Can Heighten or Lower the Perceived Likelihood of Contracting a Disease: The Mediating Effect of Ease of Imagery,” in Thomas Gilovich, Dale Griffin, and Daniel Kahneman, eds., Heuristics and Biases: The Psychology of Intuitive Judgment (Cambridge: Cambridge University Press, 2002), 98.

Thomas Schelling, foreword to Wohlstetter, Pearl Harbor (see note 18 above), vi.

Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (New York: Random House, 2007), xvii–xviii.

National Commission on Terrorist Attacks Upon the United States, The 9/11 Commission Report, 344–348.

Two exceptions are a study by SAIC in 1999 and an experiment conducted by the Canadian military in 2001. See Guy Gugliotta and Dan Eggen, “Biological Warfare Experts Questioned in Anthrax Probe,” Washington Post, June 28, 2002; and B. Kournikakis, S.J. Armour, C.A. Boulet, M. Spence, and B. Parsons, Risk Assessment of Anthrax Threat Letters (Suffield, Canada: Defence Research Establishment Suffield, 2001).

Kristen Lundberg, “The Anthrax Crisis and the U.S. Postal Service (A): Charting a Course in a Storm,” in Arnold M. Howitt and Herman B. Leonard, eds., Managing Crises: Responses to Large-Scale Emergencies (Washington, DC: CQ Press, 2009), 337–356.

Bill Keller, “Nuclear Nightmares,” New York Times, May 26, 2002.

Ron Suskind, The One Percent Doctrine (New York: Simon and Schuster, 2006), 30.

Shelley E. Taylor, “The Availability Bias in Social Perception and Interaction,” in Kahneman, Slovic, and Tversky, Judgment Under Uncertainty (see note 16 above), 192; and Sherman, et al., “Imagining Can Heighten or Lower the Perceived Likelihood of Contracting a Disease” (see note 21 above), 98–102.

Nick Pidgeon, Roger E. Kasperson, and Paul Slovic, eds., The Social Amplification of Risk (Cambridge: Cambridge University Press, 2003).

Karen Frost, Erica Frank, and Edward Maibach, “Relative Risk in the News Media: A Quantification of Misrepresentation,” American Journal of Public Health 87, no. 5 (1997): 842–845; and Barbara Coombs and Paul Slovic, “Newspaper Coverage of Causes of Death,” Journalism Quarterly 56, no. 4 (1979): 837–843.

Charles Piddock, Outbreak: Science Seeking Safeguards for Global Health (New York: National Geographic Books, 2008), 49.

In 2007, 2 million people died from HIV/AIDS, 59 died from avian influenza, and none died from biological terrorism. Joint United Nations Programme on HIV/AIDS, Report on the Global AIDS Epidemic (New York: United Nations, 2008), 30; and World Health Organization, “Cumulative Number of Confirmed Human Cases of Avian Influenza A/(H5N1) Reported to WHO,” 6 May 2010, accessed at http://www.who.int/csr/disease/avian_influenza/country/cases_table_2010_05_06/en/index.html

Meredith E. Young, Geoffrey R. Norman, and Karin R. Humphreys, “Medicine in the Popular Press: The Influence of the Media on Perceptions of Disease,” PLoS One 3, no. 10 (2008): 1–7.

George Tenet with Bill Harlow, At the Center of the Storm: My Years at the CIA (New York: HarperCollins, 2007), 231.

George W. Bush, Decision Points (New York: Crown, 2010), 153.

Jane Mayer, The Dark Side: The Inside Story of How the War on Terror Turned Into a War on American Ideals (New York: Anchor, 2009), 5.

Tenet, At the Center of the Storm (see note 35 above), 232.

Ibid.

Mayer, The Dark Side (see note 37 above), 5.

Eliezer Yudkowsky, “Cognitive Biases Potentially Affecting Judgment of Global Risks,” in Nick Bostrom and Milan M. Cirkovic, eds., Global Catastrophic Risks (Oxford: Oxford University Press, 2008), 114.

Daniel Kahneman and Amos Tversky, “The Simulation Heuristic,” in Kahneman, Slovic, and Tversky, Judgment Under Uncertainty (see note 16 above), 207.

Amos Tversky and Daniel Kahneman, “Extensional Versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment,” in Gilovich, Griffin, and Kahneman, Heuristics and Biases (see note 21 above), 39–41.

Yudkowsky, “Cognitive Biases Potentially Affecting Judgment of Global Risks” (see note 41 above), 97 (emphasis in the original).

Ibid., 103.

William Jefferson Clinton, My Life (New York: Knopf, 2004), 788.

Judith Miller, Stephen Engelberg, and William Broad, Germs: Biological Weapons and America's Secret War (New York: Simon and Schuster, 2001), 225.

Richard Preston, The Cobra Event (New York: Ballantine, 1997), 419.

Miller, Engelberg, and Broad, Germs (see note 47 above), 226, 232–238.

Gregory D. Koblentz, “Biological Terrorism: Understanding the Threat and America's Response,” in Arnold Howitt and Robyn Pangi, eds., Countering Terrorism: Dimensions of Preparedness (Cambridge: MIT Press, 2003), 118.

Daniel Kahneman and Amos Tversky, “Subjective Probability: A Judgment of Representativeness,” in Kahneman, Slovic and Tversky, Judgment Under Uncertainty (see note 16 above), 47.

Michael Green, Terrorism Prevention and Preparedness: New Approaches to U.S.-Japan Security Cooperation (New York: Japan Society, 2001), 20–21.

Amos Tversky and Daniel Kahneman, “Judgment Under Uncertainty: Heuristics and Biases,” in Kahneman, Slovic, and Tversky, Judgment Under Uncertainty (see note 16 above), 15.

Bruce Hoffman, Inside Terrorism (New York: Columbia University Press, 2006), 254.

Michael Levi, On Nuclear Terrorism (Cambridge: Harvard University Press, 2007), 7.

Paul Slovic, Melissa Finucane, Ellen Peters, and Donald G. MacGregor, “The Affect Heuristic,” in Gilovich, Griffin, and Kahneman, Heuristics and Biases (see note 21 above), 410–418.

Robin Gregory, James Flynn, and Paul Slovic, “Technological Stigma,” in James Flynn, Paul Slovic and Howard Kunreuther, eds., Risk, Media and Stigma: Understanding Public Challenges to Modern Science and Technology (London: Earthscan, 2001), 3.

Yudkowsky, “Cognitive Biases Potentially Affecting Judgment of Global Risks” (see note 41 above), 105.

See, for example, Institute of Medicine and National Research Council, Globalization, Biosecurity and the Future of the Life Sciences (Washington, DC: National Academies Press, 2006).

Gaymon Bennett, Nils Gilman, Anthony Stavrianakis and Paul Rabinow, “From Synthetic Biology to Biohacking: Are We Prepared?” Nature Biotechnology 27, no. 12 (2009): 1109–1111.

Paul Slovic, “Perception of Risk,” Science 236 (17 April 1987): 282–283.

Jessica Stern, “Dreaded Risks and the Control of Biological Weapons,” International Security 27, no. 3 (2002/03): 89–123.

Brian Michael Jenkins, Will Terrorists Go Nuclear? (Amherst, NY: Prometheus Books, 2008), 25–26.

Ibid., 30.

Yudkowsky, “Cognitive Biases Potentially Affecting Judgment of Global Risks” (see note 41 above), 99.

Robert Jervis, “Reports, Politics, and Intelligence Failures: The Case of Iraq,” Journal of Strategic Studies 29, no. 1 (2006): 20–27.

Gregory D. Koblentz, Living Weapons: Biological Warfare and International Security (Ithaca, NY: Cornell University Press, 2009), 186–187.

Paul Slovic, Baruch Fischhoff, and Sarah Lichtenstein, “Facts Versus Fears: Understanding Perceived Risk,” in Kahneman, Slovic, and Tversky, Judgment Under Uncertainty (see note 16 above), 472.

Marc Alpert and Howard Raiffa, “A Progress Report on the Training of Probability Assessors,” in Kahneman, Slovic and Tversky, Judgment Under Uncertainty (see note 16 above), 294–305.

National Research Council, Department of Homeland Security Bioterrorism Risk Assessment: A Call for Change (Washington, DC: National Academies Press, 2008), 26, 124.

Ibid.

Brian Michael Jenkins, The Potential for Nuclear Terrorism (Santa Monica, CA: RAND, 1977), 7–8.

Masse, “Nuclear Terrorism Redux” (see note 10 above), 302–319.

Dishman, “Understanding Perspectives on WMD and Why They Are Important” (see note 10 above), 303–313.

Jessica Stern, “Terrorist Motivations and Unconventional Weapons,” in Peter Lavoy, Scott Sagan, and James Wirtz, eds., Planning the Unthinkable: How New Powers Will Use Nuclear, Biological and Chemical Weapons (Ithaca, NY: Cornell University Press, 2000), 202–229.

Noted exceptions are Bruce Hoffman, who moved from the optimist to the pragmatist camp, and Jessica Stern, who moved from the pessimist to the pragmatist camp.

Jeffrey M. Bale and Gary A. Ackerman, “Profiling the WMD Terrorist Threat,” in Stephen Maurer, ed., WMD Terrorism: Science and Policy Choices (Cambridge: MIT Press, 2009), 38.

Tversky and Kahneman, “Judgment Under Uncertainty” (see note 53 above), 14–16.

Yudkowsky, “Cognitive Biases Potentially Affecting Judgment of Global Risks” (see note 41 above), 101–102.

Slovic, “Perception of Risk” (see note 61 above), 283.

Richard K. Betts, Enemies of Intelligence: Knowledge and Power in American National Security (New York: Columbia University Press, 2007), 31.

Slovic, Finucane, Peters, and MacGregor, “The Affect Heuristic” (see note 56 above), 412–413.

Amos Tversky and Daniel Kahneman, “Extensional Versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment,” in Gilovich, Griffin, and Kahneman, Heuristics and Biases, 29–30, 39–40.

Ibid., 30.

Slovic, Fischhoff, and Lichtenstein, “Facts Versus Fears” (see note 68 above), 472–477; Derek J. Koehler, Lyle Brenner, and Dale Griffin, “The Calibration of Expert Judgment: Heuristics and Biases Beyond the Laboratory,” in Gilovich, Griffin, and Kahneman, Heuristics and Biases, 686–715; and Werner F.M. De Bondt and Richard H. Thaler, “Do Analysts Overreact?” in Gilovich, Griffin, and Kahneman, Heuristics and Biases, 678–685.

Richards J. Heuer, Jr., Psychology of Intelligence Analysis (Washington, DC: Government Printing Office, 2003).

Philip E. Tetlock, Expert Political Judgment: How Good Is It? How Can We Know? (Princeton: Princeton University Press, 2005), 20, 49–59.

Tetlock, Expert Political Judgment (see note 87 above), 54.

Matthew Bunn, “A Mathematical Model of the Risk of Nuclear Terrorism,” Annals of the American Academy of Political and Social Science 607, no. 1 (2006): 103–120.

John Mueller, “The Atomic Terrorist: Assessing the Likelihood,” prepared for presentation at the Program on International Security Policy, University of Chicago, 15 January 2008, accessed at http://polisci.osu.edu/faculty/jmueller/APSACHGO.PDF

Ibid., 13–14.

National Research Council, Department of Homeland Security Bioterrorism Risk Assessment, 5.

Ibid., 3.

Gregory S. Parnell, Christopher M. Smith, and Frederick I. Moxley, “Intelligent Adversary Risk Analysis: A Bioterrorism Risk Management Model,” Risk Analysis 30, no. 1 (2010): 33–34.

Barry C. Ezell and Andrew J. Collins, “Response to Parnell, Smith, and Moxley, Intelligent Adversary Risk Analysis: A Bioterrorism Risk Management Model,” Risk Analysis 30, no. 1 (2010): 1.

Parnell, Smith, and Moxley, “Intelligent Adversary Risk Analysis” (see note 94 above), 44.

Brian Michael Jenkins, Will Terrorists Go Nuclear? (Santa Monica, CA: RAND, 1975), 3.

David C. Rapoport, “Terrorism and Weapons of the Apocalypse,” National Security Studies Quarterly 5, no. 3 (1999): 50.

JASON, Rare Events (McLean, VA: MITRE Corporation, 2007), 7.

Crystal Franco and Tara Kirk Sell, “Federal Agency Biodefense Funding, FY2010-FY2011,” Biosecurity and Bioterrorism 8, no. 2 (2010): 129–149.

Sidney Altman, et al., “An Open Letter to Elias Zerhouni,” Science (4 March 2005): 1409–1410.

Lynn C. Klotz and Edward J. Sylvester, Breeding Bio Insecurity: How U.S. Biodefense Is Exporting Fear, Globalizing Risk, and Making Us All Less Secure (Chicago: University of Chicago Press, 2009).

Stern, “Dreaded Risks and the Control of Biological Weapons” (see note 62 above), 91–92.

Office of Management and Budget, Budget of the U.S. Government, Fiscal Year 2009: Analytical Perspectives (Washington, DC: Office of Management and Budget, 2008), 28–29; and Office of the Press Secretary, “Department of Homeland Security Announces 6.8 Percent Increase in Fiscal Year 2009 Budget Request,” Department of Homeland Security Fact Sheet, 4 February 2008, accessed at http://www.dhs.gov/xnews/releases/pr_1202151112290.shtm

Graham Allison, Nuclear Terrorism: The Ultimate Preventable Catastrophe (New York: Times Books, 2004), 15.

Bob Graham and Jim Talent, “Bioterrorism: Redefining Prevention,” Biosecurity and Bioterrorism 7, no. 2 (2009): 1–2.

Franco and Sell, “Federal Agency Biodefense Funding” (see note 100 above), 134.

This has been shown true of miscalibration and overconfidence, but not of hindsight bias. Baruch Fischhoff, “Debiasing,” in Kahneman, Slovic, and Tversky, Judgment Under Uncertainty, 429–430, 437.

Roger Z. George and James B. Bruce, eds., Analyzing Intelligence: Origins, Obstacles and Innovations (Washington, DC: Georgetown University Press, 2008).

National Research Council, Field Evaluation in the Intelligence and Counterintelligence Context: Workshop Summary (Washington, DC: National Academies Press, 2010), 18–20.

The advantage held by foxes was broadly generalizable across regions, topics, and time. Tetlock, Expert Political Judgment (see note 87 above), 75–76.

Ibid., 73.

Ibid., 21.

Ibid., 237–238.

Adrian E. Tschoegl and J. Scott Armstrong, “Review of Philip E. Tetlock, Expert political judgment: How good is it? How can we know?” International Journal of Forecasting 23, no. 2 (2007): 339–342.

Additional information

Notes on contributors

Gregory D. Koblentz

Gregory D. Koblentz is an assistant professor in the Department of Public and International Affairs at George Mason University.
