Special Section: Psychology of Intelligence

Revisiting the Psychology of Structured Analytical Techniques

Pages 634-648 | Published online: 01 Sep 2023
 

Abstract

Structured analytic techniques (SATs) are considered the gold standard for mitigating judgmental biases among intelligence practitioners. Recent psychological research, however, has challenged the effectiveness of SATs. Here, these seemingly irreconcilable standpoints are, to some extent, reconciled. To this end, the empirical evidence for three prominent SATs is reviewed: empirical research supports brainstorming and devil's advocacy, whereas the analysis of competing hypotheses lacks empirical support. Based on the literature review and conceptual criticism of SATs, implications are discussed: the Intelligence Community (IC) must discard its antiempiricism. To this end, intelligence studies programs should regularly incorporate quantitative research methods courses in their curricula. The research community, in turn, should collaborate more frequently with the IC to help it improve the empirical foundations of its work. This could improve judgmental processes in intelligence analysis, both within and beyond SATs. Moreover, other tasks in the IC (e.g., human intelligence) would also benefit from a sound empirical approach.


DECLARATION OF INTEREST

The author reports there are no competing interests to declare.

Notes

1 Richards J. Heuer and Randolph H. Pherson, Structured Analytic Techniques for Intelligence Analysis (3rd ed.) (Washington, DC: CQ Press, 2020).

2 United States, Intelligence Reform and Terrorism Prevention Act (Washington, DC: U.S. Congress, 2004). https://www.govinfo.gov/app/details/PLAW-108publ458

3 Office of the Director of National Intelligence, Analytic Standards, ICD 203 (Washington, DC: ODNI, 2015), https://irp.fas.org/dni/icd/icd-203.pdf

4 David R. Mandel, “Intelligence, Science and the Ignorance Hypothesis,” in The Academic-Practitioner Divide in Intelligence Studies, edited by Rubén Arcos Martín, Nicole K. Drumhiller, and Mark Phythian. Security and Professional Intelligence Education Series (SPIES) 35 (Lanham, MD: Rowman & Littlefield, 2022).

5 Randolph H. Pherson, Handbook of Analytic Tools and Techniques (5th ed.) (Tysons, VA: Pherson Associates, LLC, 2018), p. 5.

6 Pherson distinguishes between cognitive biases, misapplied heuristics, and intuitive traps. In this article, however, the term bias is used in a broad sense, following a psychological definition: “The systematic and predictable mistakes that influence the judgment of even very talented human beings,” in Max H. Bazerman, “Judgment and Decision Making,” in Noba Textbook Series: Psychology, edited by Ed Diener and Robert Biswas-Diener (Champaign, IL: DEF Publishers, 2022). http://noba.to/9xjyvc3a

7 Norbert Schwarz, Madeline Jalbert, Tom Noah, and Lynn Zhang, “Metacognitive Experiences as Information: Processing Fluency in Consumer Judgment and Decision Making,” Consumer Psychology Review, Vol. 4, No. 1 (2021), pp. 4–25. https://doi.org/10.1002/arcp.1067; Norbert Schwarz and Gerald L. Clore, “Mood as Information: 20 Years Later,” Psychological Inquiry, Vol. 14 (2003), pp. 296–303. https://doi.org/10.1207/S15327965PLI1403&4_20

8 Notably, a prominent psychological model for explaining the anchoring heuristic, the selective accessibility model by Mussweiler and Strack, assumes that the anchoring effect is caused by the selectively increased accessibility of anchor-congruent information: information that is accessible and can easily be used drives the anchoring effect. This corresponds to the notion of “selecting the first answer that fits” in Pherson’s justification for the exploration techniques (Handbook of Analytic Tools and Techniques, p. 7); Thomas Mussweiler and Fritz Strack, “Comparing Is Believing: A Selective Accessibility Model of Judgmental Anchoring,” European Review of Social Psychology, Vol. 10, No. 1 (1999), pp. 135–167. https://doi.org/10.1080/14792779943000044

9 Pherson, Handbook of Analytic Tools and Techniques.

10 Stephen Coulthart, “Why Do Analysts Use Structured Analytic Techniques? An In-Depth Study of an American Intelligence Agency,” Intelligence and National Security, Vol. 31, No. 7 (2016), pp. 933–948. https://doi.org/10.1080/02684527.2016.1140327

11 Stephen Artner, Richard Girven, and James Bruce, Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community (Santa Monica, CA: RAND Corporation, 2016). https://doi.org/10.7249/RR1408. The exact figures from Artner et al.: of the CIA products, eight out of twenty (40%) mentioned using a SAT; of the NIC products, four out of fourteen (29%); and of the DIA products, six out of twenty-nine (21%). As the authors note, it is unlikely that a SAT was used at an earlier stage of the analytic process but not mentioned in the final product.

12 Coulthart, “Why Do Analysts Use Structured Analytic Techniques?”

13 Ibid.

14 Mandel, “Intelligence, Science and the Ignorance Hypothesis.”

15 U.S. Government, “A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis” (2009), https://www.cia.gov/static/955180a45afe3f5013772c313b16face/Tradecraft-Primer-apr09.pdf

16 Heuer and Pherson, Structured Analytic Techniques for Intelligence Analysis.

17 Pherson, Handbook of Analytic Tools and Techniques.

18 Ibid.

19 The comparison to nominal groups is of practical importance. It is trivial to show that, for example, five people in a group generate more ideas than one person alone. To compare the efficiency of brainstorming, one needs to keep the number of people constant; thus, the performance of five people working in a group is compared to the pooled performance of five people working alone (a nominal group). For nominal groups, redundant answers are removed, so the better performance of nominal groups is not an artifact of redundancy (see the sketch below).
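To make the de-duplication step concrete, the following minimal Python sketch (with invented placeholder ideas; not taken from the cited studies) shows how a nominal group's score pools and de-duplicates the ideas of members who worked alone, before being compared with an interactive group's single shared output.

    # Minimal sketch of the nominal-group comparison described in note 19.
    # All idea data below are hypothetical placeholders.

    def nominal_group_score(individual_idea_sets):
        """Pool the ideas of solo brainstormers and count the unique ones."""
        pooled = set()
        for ideas in individual_idea_sets:
            pooled.update(ideas)  # redundant answers across members collapse here
        return len(pooled)

    def interactive_group_score(group_ideas):
        """An interactive group produces one shared list; count its unique ideas."""
        return len(set(group_ideas))

    # Hypothetical example: five solo brainstormers vs. one five-person group.
    solo = [
        {"idea_a", "idea_b"},
        {"idea_b", "idea_c"},
        {"idea_d"},
        {"idea_a", "idea_e"},
        {"idea_f", "idea_g"},
    ]
    group = ["idea_a", "idea_b", "idea_c", "idea_a"]

    print(nominal_group_score(solo))       # -> 7 unique ideas after de-duplication
    print(interactive_group_score(group))  # -> 3 unique ideas

Because both scores count only unique ideas, any advantage of the nominal group cannot be attributed to members duplicating one another's answers.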

20 Paul B. Paulus, Vicky L. Putman, Karen Leggett Dugosh, Mary T. Dzindolet, and Hamit Coskun, “Social and Cognitive Influences in Group Brainstorming: Predicting Production Gains and Losses,” in European Review of Social Psychology, edited by Wolfgang Stroebe and Miles Hewstone (Chichester, UK: John Wiley & Sons, 2005), pp. 299–325. https://doi.org/10.1002/0470013478.ch10. Paulus and colleagues provide an overview of research on brainstorming.

21 Pherson, Handbook of Analytic Tools and Techniques.

22 Michael Diehl and Wolfgang Stroebe, “Productivity Loss in Brainstorming Groups: Toward the Solution of a Riddle,” Journal of Personality and Social Psychology, Vol. 53 (1987), pp. 497–509. https://doi.org/10.1037/0022-3514.53.3.497; William H. Cooper, Brent R. Gallupe, Sandra Pollard, and Jana Cadsby, “Some Liberating Effects of Anonymous Electronic Brainstorming,” Small Group Research, Vol. 29, No. 2 (1998), pp. 147–178. https://doi.org/10.1177/1046496498292001

23 Diehl and Stroebe, “Productivity Loss in Brainstorming Groups.”

24 Paulus et al., “Social and Cognitive Influences in Group Brainstorming.”

25 Paul B. Paulus and Vincent R. Brown, “Enhancing Ideational Creativity in Groups: Lessons from Research on Brainstorming,” in Group Creativity: Innovation through Collaboration (New York: Oxford University Press, 2003), pp. 110–136. https://doi.org/10.1093/acprof:oso/9780195147308.003.0006

26 Paul B. Paulus and Huei-Chuan Yang, “Idea Generation in Groups: A Basis for Creativity in Organizations,” Organizational Behavior and Human Decision Processes, Vol. 82, No. 1 (2000), pp. 76–87. https://doi.org/10.1006/obhd.2000.2888

27 Karen Leggett Dugosh, Paul B. Paulus, Evelyn J. Roland, and Huei-Chuan Yang, “Cognitive Stimulation in Brainstorming,” Journal of Personality and Social Psychology, Vol. 79 (2000), pp. 722–735. https://doi.org/10.1037/0022-3514.79.5.722

28 Paul B. Paulus, Toshihiko Nakui, Vicky L. Putman, and Vincent R. Brown, “Effects of Task Instructions and Brief Breaks on Brainstorming,” Group Dynamics: Theory, Research, and Practice, Vol. 10, No. 3 (2006), pp. 206–219. https://doi.org/10.1037/1089-2699.10.3.206

29 Stephen J. Coulthart, “An Evidence-Based Evaluation of 12 Core Structured Analytic Techniques,” International Journal of Intelligence and CounterIntelligence, Vol. 30, No. 2 (2017), pp. 368–391. https://doi.org/10.1080/08850607.2016.1230706

30 Pherson, Handbook of Analytic Tools and Techniques.

31 Diehl and Stroebe, “Productivity Loss in Brainstorming Groups.”

32 Pherson, Handbook of Analytic Tools and Techniques.

33 U.S. Government, “A Tradecraft Primer.”

34 Charles R. Schwenk and Richard A. Cosier, “Effects of the Expert, Devil’s Advocate, and Dialectical Inquiry Methods on Prediction Performance,” Organizational Behavior and Human Performance, Vol. 26, No. 3 (1980), pp. 409–424. https://doi.org/10.1016/0030-5073(80)90076-8; Charles R. Schwenk and Richard A. Cosier, “Effects of Consensus and Devil’s Advocacy on Strategic Decision-Making,” Journal of Applied Social Psychology, Vol. 23 (1993), pp. 126–139. https://doi.org/10.1111/j.1559-1816.1993.tb01056.x; Tobias Greitemeyer, Stefan Schulz-Hardt, Felix C. Brodbeck, and Dieter Frey, “Information Sampling and Group Decision Making: The Effects of an Advocacy Decision Procedure and Task Experience,” Journal of Experimental Psychology: Applied, Vol. 12 (2006), pp. 31–42. https://doi.org/10.1037/1076-898X.12.1.31

35 Schwenk and Cosier, “Effects of the Expert, Devil’s Advocate, and Dialectical Inquiry Methods on Prediction Performance.”

36 U.S. Government, “A Tradecraft Primer.”

37 Pherson, Handbook of Analytic Tools and Techniques.

38 Schwenk and Cosier, “Effects of the Expert, Devil’s Advocate, and Dialectical Inquiry Methods on Prediction Performance.”

39 Ibid.

40 Brian D. Waddell, Michael A. Roberto, and Sukki Yoon, “Uncovering Hidden Profiles: Advocacy in Team Decision Making,” Management Decision, Vol. 51, No. 2 (2013), pp. 321–340. https://doi.org/10.1108/00251741311301849

41 David R. Mandel and Philip E. Tetlock, “Correcting Judgment Correctives in National Security Intelligence,” Frontiers in Psychology, Vol. 9 (21 December 2018). https://doi.org/10.3389/fpsyg.2018.02640

42 Richards J. Heuer, Psychology of Intelligence Analysis (Military Bookshop, 2010).

43 Coulthart, “An Evidence-Based Evaluation of 12 Core Structured Analytic Techniques.”

44 Paul Edward Lehner, Leonard Adelman, Brant A. Cheikes, and Mark J. Brown, “Confirmation Bias in Complex Analyses,” IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, Vol. 38, No. 3 (2008), pp. 584–592. https://doi.org/10.1109/TSMCA.2008.918634; Christopher W. Karvetski and David R. Mandel, “Coherence of Probability Judgments from Uncertain Evidence: Does ACH Help?,” Judgment and Decision Making, Vol. 15, No. 6 (2020), pp. 939–958. https://doi.org/10.31234/osf.io/7xg8j; Martha Whitesmith, “The Efficacy of ACH in Mitigating Serial Position Effects and Confirmation Bias in an Intelligence Analysis Scenario,” Intelligence and National Security, Vol. 34, No. 2 (2019), pp. 225–242. https://doi.org/10.1080/02684527.2018.1534640; Mandeep K. Dhami, Ian K. Belton, and David R. Mandel, “The ‘Analysis of Competing Hypotheses’ in Intelligence Analysis,” Applied Cognitive Psychology, Vol. 33, No. 6 (2019), pp. 1080–1090. https://doi.org/10.1002/acp.3550

45 Lehner et al., “Confirmation Bias in Complex Analyses.”

46 Dhami, Belton, and Mandel, “The ‘Analysis of Competing Hypotheses’ in Intelligence Analysis.”

47 Whitesmith, “The Efficacy of ACH in Mitigating Serial Position Effects and Confirmation Bias in an Intelligence Analysis Scenario.”

48 David R. Mandel and Alan Barnes, “Geopolitical Forecasting Skill in Strategic Intelligence,” Journal of Behavioral Decision Making, Vol. 31, No. 1 (2018), pp. 127–137. https://doi.org/10.1002/bdm.2055

49 David R. Mandel and Alan Barnes, “Accuracy of Forecasts in Strategic Intelligence,” Proceedings of the National Academy of Sciences, Vol. 111, No. 30 (2014), pp. 10984–10989. https://doi.org/10.1073/pnas.1406138111

50 Ibid.; Mandel and Barnes, “Geopolitical Forecasting Skill in Strategic Intelligence”; Welton Chang et al., “Restructuring Structured Analytic Techniques in Intelligence,” Intelligence and National Security, Vol. 33, No. 3 (2018), pp. 337–356. https://doi.org/10.1080/02684527.2017.1400230

51 Rose McDermott, “Experimental Intelligence,” Intelligence and National Security, Vol. 26, No. 1 (2011), pp. 82–98. https://doi.org/10.1080/02684527.2011.556361

52 Mark M. Lowenthal and Ronald A. Marks, “Intelligence Analysis: Is It As Good As It Gets?,” International Journal of Intelligence and CounterIntelligence, Vol. 28, No. 4 (2015), pp. 662–665. https://doi.org/10.1080/08850607.2015.1051410

53 Steven Rieber and Neil Thomason, “Creation of a National Institute for Analytic Methods,” Studies in Intelligence, Vol. 49, No. 4 (2005), pp. 71–77.

54 Office of the Director of National Intelligence, “ACE Aggregative Contingent Estimation,” Intelligence Advanced Research Projects Activity. https://www.iarpa.gov/research-programs/ace (accessed 10 February 2023).

55 Markus Denzler and L.K., “Calibration Feedback Improves Forecast Accuracy.” Unpublished data set (2022). More information can be obtained from the first author.

56 Maria Hartwig, Christian A. Meissner, and Matthew D. Semel, “Human Intelligence Interviewing and Interrogation: Assessing the Challenges of Developing an Ethical, Evidence-Based Approach,” in Investigative Interviewing, edited by Ray Bull (New York: Springer New York, 2014), pp. 209–228. https://doi.org/10.1007/978-1-4614-9642-7_11

Additional information

Notes on contributors

Markus Denzler

Markus Denzler is Professor of Psychology at the Federal University of Applied Administrative Sciences in Berlin. He obtained his habilitation from Technical University Chemnitz and his Ph.D. from Jacobs University Bremen. His research interests include the psychological processes in intelligence analysis and error culture in intelligence agencies. The author can be contacted at [email protected].

