ABSTRACT

Structured analytic techniques (SATs) help the intelligence community reduce the flaws in cognition that lead to faulty reasoning. To ascertain whether SATs benefit reasoning, we conducted an experiment within a web-based application, comparing three conditions: (1) unaided reasoning, (2) a prototypical order-based SAT, and (3) a flexible, process-based SAT that we call TRACE. Our findings suggest that the more flexible SAT generated higher quality reasoning than the other conditions. Consequently, techniques and training that support flexible analytical processes, rather than those that require a set sequence of steps, may be more beneficial to intelligence analysis and complex reasoning.

Keywords: structured analytic techniques, Analysis of Competing Hypotheses, tradecraft, cognitive biases, experiments

Acknowledgements

The authors would like to thank Sarah Taylor and Roc Myers for their subject matter expertise in intelligence analysis and structured analytic techniques. The experimentation software was developed by SRC, Inc. We wish to thank Deb Plochocki, Lou Nau, Andrew Whalen, Laura Simonetta, Ryan Conner, and the rest of the SRC team. We also acknowledge the contributions of several additional team members, including Yatish Hegde, Niraj Sitaula, Sarah Bolden, Erin Bartolo, Jerry Robinson, Jun Fang, Priya Harindranathan, Marcia Morales, Thomas Gallegos, Paige Odegard, Gregory Russel, Rhema Zlaten, Cayla Dorsey, Sophie Estep, Audrey Lew, Quincy Nolan, Sweeney Pandit, and William Wang. This research was supported by the Intelligence Advanced Research Projects Activity (IARPA) via the Department of Interior National Business Center contract number 2017-16121900004. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright annotation thereon. Disclaimer: The views and conclusions expressed herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of IARPA, DoI/NBC, or the U.S. Government.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1. Gilovich, Griffin, and Kahneman, Heuristics and Biases.

2. Gilhooly, “Working Memory and Reasoning.”

3. Heuer, Psychology of Intelligence Analysis.

4. Heuer, and Pherson, Structured Analytic Techniques for Intelligence Analysis.

5. See note 3 above.

6. Artner, Girven, and Bruce, “Assessing the Value of Structured Analytic Techniques”; Jones, “Critical Epistemology for Analysis of Competing Hypotheses”; and Wastell, “Cognitive Predispositions and Intelligence Analyst Reasoning.”

7. Jones, “Critical Epistemology for Analysis of Competing Hypotheses.”

8. Chang, Berdini, Mandel, and Tetlock, “Restructuring Structured Analytic Techniques”; Coulthart, “An Evidence-Based Evaluation of 12 Core Structured Analytic Techniques”; Whitesmith, “The Efficacy of ACH in Mitigating Serial Position Effects and Confirmation Bias in an Intelligence Analysis Scenario”; and Mandel, Karvetski, and Dhami, “Boosting Intelligence Analysts’ Judgment Accuracy: What Works, What Fails?”

9. Wastell, “Cognitive Predispositions and Intelligence Analyst Reasoning”; and Artner, Girven, and Bruce, “Assessing the Value of Structured Analytic Techniques.”

10. Heuer, Psychology of Intelligence Analysis; and Heuer, and Pherson, Structured Analytic Techniques for Intelligence Analysis.

11. We relied on convenience sampling to recruit and interview members of the intelligence community. We interviewed nine current and former analysts with a range of experiences. The majority of our interviewees had a background in the Air Force, but we also interviewed analysts with experience in a number of other Department of Defense and national-level intelligence agencies. The goal for these interviews was not to provide generalizable findings based on a representative sample of IC analysts, but to gather initial design ideas and feedback to keep in mind while developing early versions of TRACE. Each interview followed a semi-structured protocol asking the interviewee to walk us through how they would answer a fictional question. The question was meant to mimic one that intelligence analysts encounter – determining the motivation underpinning a foreign leader’s recent actions – and we included source materials to help the interviewees answer it. During the interview, we asked follow-up questions probing for details, including the interviewees’ analytical process, their use of structured analytic techniques, collaboration, the composition and reception of analytic products, the key challenges that analysts in the IC face, and IC culture.

12. Moon, and Hoffman, “How Might ‘Transformational’ Technologies and Concepts Be Barriers.”

13. Robb, Silberman, Levin, McCain, Rowen, Slocombe, and Studeman, The Commission on the Intelligence Capabilities.

14. Ibid., 408.

15. Duelfer, Report on the Commission on the Intelligence Capabilities.

16. Carrington, Chen, Davies, Kaur, and Neville, “The Effectiveness of a Single Intervention”; Hitchcock, “The Effectiveness of Computer-Assisted Instruction”; Kunsch, Schnarr, and van Tyle, “The Use of Argument Mapping”; and Twardy, “Argument Maps Improve Critical Thinking.”

17. Rowe, Macagno, Reed, and Walton, “Araucaria as a Tool”; and van Gelder, “Cultivating Deliberation for Democracy.”

18. Billman, Convertino, Shrager, Pirolli, and Massar, “Collaborative Intelligence Analysis”; Gustavi, Karasalo, and Mårtenson, “A Tool for Generating”; and Wilson, Brown, and Biddle, “ACH Walkthrough: A Distributed Multi-Device Tool.”

19. Good, Shrager, Stefik, Pirolli, Card, and Heuer, “ACH1.1: A Tool for Analyzing.”

21. Artner, Girven, and Bruce, “Assessing the Value of Structured Analytic Techniques.” Artner et al. reviewed twenty-nine Central Intelligence Agency analytic products posted on the World Intelligence Review during a two-week period in July 2014, fourteen National Intelligence Council analytic products posted on Intelink in July 2014, and a random sample of twenty Defense Intelligence Agency and Central Intelligence Agency products published in 2013 on four specific issues. The authors do not specify the four issues they chose to examine in their random sample.

22. Coulthart, “Why Do Analysts Use Structured Analytic Techniques? An In-depth Study of an American Intelligence Agency.”

23. Moon, and Hoffman, “How Might ‘Transformational’ Technologies and Concepts Be Barriers”; and Wastell, “Cognitive Predispositions and Intelligence Analyst Reasoning.”

24. Wastell, ibid.

25. See note 22 above.

26. Immerman, “Transforming Analysis: The Intelligence Community’s Best Kept Secret”; and Fingar, “Analysis in the US Intelligence Community: Missions, Masters, and Methods.”

27. See note 22 above.

28. Johnston, Analytic Culture in the US Intelligence Community: An Ethnographic Study.

29. See note 22 above.

30. Marrin, “Is Intelligence Analysis an Art or Science?”

31. Dhami, Mandel, Mellers, and Tetlock, “Improving Intelligence Analysis with Decision Science”; and Mandel, and Tetlock, “Correcting Judgment Correctives in National Security Intelligence.”

32. Artner, Girven, and Bruce, “Assessing the Value of Structured Analytic Techniques”; and Coulthart, “An Evidence-Based Evaluation of 12 Core Structured Analytic Techniques.”

33. Chang, Berdini, Mandel, and Tetlock, “Restructuring Structured Analytic Techniques.”

34. See note 13 above.

35. See note 3 above.

36. Ibid.

37. Ibid., 108.

38. Jones, “Critical Epistemology for Analysis of Competing Hypotheses”; and Chang, Berdini, Mandel, and Tetlock, “Restructuring Structured Analytic Techniques.”

39. See note 7 above.

40. See note 31 above.

41. See note 36 above.

42. See note 8 above.

43. Lehner, Adelman, Cheikes, and Brown, “Confirmation Bias in Complex Analyses.”

44. Nickerson, “Confirmation Bias: A Ubiquitous Phenomenon.”

45. Kretz, Simpson, and Graham, “A Game-Based Experimental Protocol.”

46. Convertino, Billman, Pirolli, Massar, and Shrager, “The CACHE Study.”

47. Ibid.

48. Ibid.

49. Whitesmith, “The Efficacy of ACH in Mitigating Serial Position Effects and Confirmation Bias in an Intelligence Analysis Scenario.”

50. Mandel, Karvetski, and Dhami, “Boosting Intelligence Analysts’ Judgment Accuracy: What Works, What Fails?”

51. Dhami, Belton, and Mandel, “The ‘Analysis of Competing Hypotheses’ in Intelligence Analysis.”

52. Ibid.

53. In a lecture to the National Academies of Science (8 December 2009), Heuer noted that measuring judgment accuracy as the criterion for determining the efficacy of ACH is problematic. Accuracy is conditional on current knowledge, and changing circumstances will necessarily alter what counts as accurate. Moreover, accuracy is necessarily probabilistic, so determining whether a probability estimate was accurate is challenging. He argues that assessing ACH and other SATs on the quality of analysis is a better metric for determining efficacy.

54. Bradley, and Vetch, “Supporting Annotation as a Scholarly Tool”; Glover, Xu, and Hardaker, “Online Annotation – Research and Practices”; Gomez, Gomez, Cooper, Lozano, and Mancevice, “Redressing Science Learning through Supporting Language”; Herman, Perkins, Hansen, Gomez, and Gomez, “The Effectiveness of Reading Comprehension Strategies”; Simpson, and Nist, “Textbook Annotation: An Effective and Efficient Study Strategy”; and Zywica, and Gomez, “Annotating to Support Learning.”

55. Boyd, Baliko, and Polyakova-Norwood, “Using Debates to Teach”; Freeley, and Steinberg, Argumentation and Debate; and Mercier, and Landemore, “Reasoning Is for Arguing.”

56. See note 4 above.

57. Thaler, and Sunstein, Nudge: Improving Decisions about Health, Wealth, and Happiness.

58. Hutchins, Distributed Cognition.

59. The pre-registration is available on the Open Science Framework: https://osf.io/2u9nh. All experimental materials, including the case problems and the evaluation approach (codebook and case keys), are available upon request from the first author.

60. Gonzales, and Cunningham, “The Promise of Pre-Registration in Psychological Research.”

61. We used MTurk’s customized qualifications to block participants who had taken part in previous studies and user testing of TRACE. Participants were required to reside in the U.S. and to hold at least a bachelor’s degree. Based on our analyses, there were no other systematic demographic differences or other factors that set participants apart from the general population.

62. We used the R package pwr to calculate the statistical power we would need to ascertain discernible differences between conditions and cross-checked the results using the software G*Power. See: Champely, Ekstrom, Dalgaard, Gill, Weibelzahl, Anandkumar, and Ford, “Package ‘pwr.’”
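
For illustration, the core of such a calculation can be reproduced in a few lines of R. The parameter values below (a medium effect size of Cohen’s f = 0.25, alpha = .05, and 80% power) are assumptions chosen for the example, not the exact inputs used in the study.

    # A minimal power-analysis sketch using the pwr package
    # (install once with install.packages("pwr")).
    library(pwr)

    # Sample size per group for a one-way design with three conditions
    # (unaided, order-based SAT, TRACE), assuming a medium effect size.
    result <- pwr.anova.test(k = 3, f = 0.25, sig.level = 0.05, power = 0.80)
    ceiling(result$n)  # participants needed per condition (about 53)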

63. Neuendorf, The Content Analysis Guidebook; and Krippendorff, Content Analysis: An Introduction.
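
As an illustration of the intercoder reliability checks these guides recommend, Krippendorff’s alpha can be computed in a few lines of R. The irr package and the ratings below are hypothetical stand-ins for the example, not our actual data or tooling.

    # A minimal Krippendorff's alpha sketch using the irr package
    # (install once with install.packages("irr")).
    library(irr)

    # Hypothetical ratings: two coders (rows) assigning one of three
    # nominal codes to ten responses (columns).
    ratings <- matrix(c(1, 2, 2, 3, 1, 1, 3, 2, 2, 1,
                        1, 2, 2, 3, 1, 2, 3, 2, 2, 1),
                      nrow = 2, byrow = TRUE)
    kripp.alpha(ratings, method = "nominal")$value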

64. Brooke, “SUS: A ‘Quick and Dirty’ Usability Scale.”
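
For reference, Brooke’s scoring procedure maps the ten five-point SUS items onto a 0-100 scale: odd-numbered (positively worded) items contribute their response minus one, even-numbered (negatively worded) items contribute five minus their response, and the sum is multiplied by 2.5. A minimal R implementation follows; the example responses are invented.

    # Standard SUS scoring (Brooke).
    sus_score <- function(responses) {
      stopifnot(length(responses) == 10, all(responses %in% 1:5))
      odd  <- responses[c(1, 3, 5, 7, 9)] - 1   # positively worded items
      even <- 5 - responses[c(2, 4, 6, 8, 10)]  # negatively worded items
      (sum(odd) + sum(even)) * 2.5              # scale to 0-100
    }

    sus_score(c(4, 2, 5, 1, 4, 2, 5, 1, 4, 2))  # returns 85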

65. After data collection, we discovered that a few respondents were able to add comments to the source material, a PDF hosted in Google Docs, which changed the nature of the external control. This problem affected 36 cases in the Cambria Escape Route external control condition and 4 cases in the Unusual Subjects condition.

66. Artner, Girven, and Bruce, “Assessing the Value of Structured Analytic Techniques”; Chang, Berdini, Mandel, and Tetlock, “Restructuring Structured Analytic Techniques”; and Dhami, Mandel, Mellers, and Tetlock, “Improving Intelligence Analysis with Decision Science.”

67. We thank a reviewer for pointing out that Richards Heuer and Randolph Pherson’s work on SATs was greatly influenced by prior work by Jack Davis and Roger George on “alternative analysis” techniques. For summaries, see: George, “Fixing the Problem of Analytical Mind-Sets: Alternative Analysis”; and Davis, “Improving CIA Analytic Performance: Strategic Warning.”

Notes on contributors

Jennifer Stromer-Galley

Jennifer Stromer-Galley is Professor and Senior Associate Dean for Academic and Faculty Affairs in the School of Information Studies at Syracuse University. Jenny has published over 50 journal articles, proceedings, and book chapters in the disciplines of communication, political science, information science, and computer science. Her book, Presidential Campaigning in the Internet Age (Oxford University Press), provides a history of presidential campaigns as they have adopted and adapted to digital communication technologies.

Patricia Rossini

Patricia Rossini is an inaugural Derby Fellow in the Department of Communication and Media at the University of Liverpool, UK. Her research on online political talk, uncivil and intolerant discourse, and misinformation has been funded by social media platforms such as WhatsApp, Twitter, and Facebook, and by the Knight Foundation. Some of her recent work has been published in Communication Research, New Media & Society, Political Studies, Social Media + Society, the International Journal of Communication, and the Journal of Information, Technology & Politics.

Kate Kenski

Kate Kenski is a Professor in the Department of Communication at the University of Arizona. She is co-author of The Obama Victory: How Media, Money, and Message Shaped the 2008 Election with Bruce W. Hardy and Kathleen Hall Jamieson (2010, Oxford University Press) and Capturing Campaign Dynamics: The National Annenberg Election Survey with Daniel Romer, Paul Waldman, Christopher Adasiewicz, and Kathleen Hall Jamieson (2004, Oxford University Press). Dr. Kenski has published over 70 book chapters, articles, and research notes in publications such as the Annals of the American Academy of Political and Social Science, Communication Research, the International Journal of Public Opinion Research, the Journal of Applied Social Psychology, and Public Opinion Quarterly. Her current research focuses on social media and incivility, gender and politics, presidential campaigns, and online platform development that aids users’ quality of reasoning.

Brian McKernan

Brian McKernan is a research assistant professor in the School of Information Studies at Syracuse University. He holds a Ph.D. in Sociology from the University at Albany, SUNY. Dr. McKernan’s research interests include the design and study of serious games and other innovative software applications to strengthen human reasoning and address other pressing social issues. Dr. McKernan has published several peer-reviewed journal articles, conference proceedings, and edited chapters on topics pertaining to technology, culture, and reasoning.

Benjamin Clegg

Benjamin Clegg is a Professor of Cognitive Psychology at Colorado State University. He received his Ph.D. in Psychology from the University of Oregon. Dr. Clegg’s research in cognitive psychology, human factors, and engineering psychology encompasses a variety of aspects of human performance, including the impact of automation, and work on enhancing decision making.

James Folkestad

James Folkestad is a Professor in the School of Education, the Director of the Center for the Analytics of Learning & Teaching, and a University Distinguished Teaching Scholar at Colorado State University (CSU). His research expertise includes technology-enhanced learning, analytics for learning and teaching, and dyslexia and innovation. James has worked on and managed several large innovation-related grants that resulted in two patented learning games, a team-based collaboration environment, and a learning platform for sharing alternative digital content to support learning at CSU. Most recently, he has been instrumental in developing U-Behavior™, a process that reimagines the Canvas learning management system (LMS) quizzing system to support students’ understanding of their learning behaviors and cognitive processing.

Carsten Østerlund

Carsten Østerlund is a Professor at the School of Information Studies at Syracuse University. His research explores the organization, creation, and use of documents and other sociomaterial practices in distributed work environments, with an emphasis on learning and knowledge dynamics in new forms of work. Empirically he studies these issues through in-depth qualitative and quantitative studies of everyday work practices in a range of settings including citizen science, crowdsourcing, distributed science teams, game design and healthcare. Methodologically, he focuses on how we can merge qualitative and quantitative techniques to study trace data.

Lael Schooler

Lael Schooler is a Professor of Psychology at Syracuse University. He received his Ph.D. in Cognitive Psychology from Carnegie Mellon University in 1993. Prior to coming to Syracuse University in 2014, he was a Senior Research Scientist at the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin. Lael investigates simple heuristics: decision strategies that use limited information to make effective decisions in an uncertain world. Much of his work is informed by the ACT-R theory of cognition, which supports the development of computer simulations of human behavior and the neural correlates of that behavior.

Olga Boichak

Olga Boichak is a Lecturer in Digital Cultures at the University of Sydney, Australia. She is a member of the Centre for International Security Studies and has a track record of publications on digital war, legitimising state power, transnational mobilisation, and algorithmic surveillance. She holds a Master of Public Administration and a doctorate in Social Science from Syracuse University, and her research interests span networks, discourses, and cultures of activism in the digital age. Her work has appeared in the International Journal of Communication, Big Data & Society, and Media, War & Conflict.

Jordan Canzonetta

Jordan Canzonetta is an Assistant Professor of English Studies at Lewis University in Romeoville, Illinois. She earned her Ph.D. in Composition and Cultural Rhetoric from Syracuse University. Her work has appeared in the Journal of Writing Assessment and in international and national edited collections on digital rhetorics and plagiarism detection technologies. Her expertise is squarely located in rhetorical studies related to the digital sphere, AI and automation, technical communication, academic labor, and writing assessment in higher education.

Rosa Mikeal Martey

Rosa Mikeal Martey is Professor in the Department of Journalism and Media Communication at Colorado State University. Dr. Martey’s research focuses on identity and social interaction in online contexts, from virtual worlds and multiplayer games (MMOs), to Facebook and online information-seeking, including for employment and health information. She focuses on ways that visual interface, user behavior, and self-perceptions influence understanding, use, and impact of such spaces.

Corey Pavlich

Corey Pavlich is a Social Scientist assisting Marines and their families. He earned his Ph.D. from the University of Arizona in 2018. His research interests include nonverbal behavior, deception, interpersonal communication, social influence, and aversive stimulation. His work has been published in journals including Communication Monographs, Communication Research, Human Communication Research, and The International Journal of Intelligence and CounterIntelligence.

Eric Tsetsi

Eric Tsetsi is a lecturer of political communication and journalism at the University of Amsterdam. He completed his Ph.D. in communication at The University of Arizona in 2020. His research interests include communication technologies, political communication, and media effects. His work has been published in journals including Mobile Media & Communication, Communication Monographs, Communication Research, and Human Communication Research.

Nancy McCracken

Nancy McCracken is a Research Consultant in the Center for Computational and Data Sciences in the iSchool. She holds a Ph.D. in Computer Science from Syracuse University and is a retired Associate Research Professor in the iSchool. Dr. McCracken’s research focuses on applications of computational linguistics in analyzing information from natural language text. Recent projects include the use of natural language processing and active machine learning to assist social scientists in content analysis of text and studies of the use of language in social media for negative sentiments and for topics in political campaigns.
