
Think-Aloud Interviews: A Tool for Exploring Student Statistical Reasoning

Pages 100-113 | Published online: 13 May 2022

References

  • Adams, W. K., and Wieman, C. E. (2011), “Development and Validation of Instruments to Measure Learning of Expert-Like Thinking,” International Journal of Science Education, 33, 1289–1312.
  • Bandalos, D. L. (2018), Measurement Theory and Applications for the Social Sciences, New York, NY: Guilford Press.
  • Blair, J., and Conrad, F. G. (2011), “Sample Size for Cognitive Interview Pretesting,” Public Opinion Quarterly, 75, 636–658. DOI: https://doi.org/10.1093/poq/nfr035.
  • Boels, L., Bakker, A., Van Dooren, W., and Drijvers, P. (2019), “Conceptual Difficulties When Interpreting Histograms: A Review,” Educational Research Review, 28, 100291. DOI: https://doi.org/10.1016/j.edurev.2019.100291.
  • Bowen, C. W. (1994), “Think-Aloud Methods in Chemistry Education: Understanding Student Thinking,” Journal of Chemical Education, 71, 184–190. DOI: https://doi.org/10.1021/ed071p184.
  • Branch, J. L. (2000), “Investigating the Information-Seeking Processes of Adolescents: The Value of Using Think Alouds and Think Afters,” Library & Information Science Research, 22, 371–392.
  • Castro Sotos, A. E., Vanhoof, S., Van den Noortgate, W., and Onghena, P. (2007), “Students’ Misconceptions of Statistical Inference: A Review of the Empirical Evidence from Research on Statistics Education,” Educational Research Review, 2, 98–113. DOI: https://doi.org/10.1016/j.edurev.2007.04.001.
  • Chance, B., delMas, R., and Garfield, J. (2004), “Reasoning About Sampling Distributions,” in The Challenge of Developing Statistical Literacy, Reasoning and Thinking, eds. D. Ben-Zvi and J. Garfield, pp. 295–323, Dordrecht: Kluwer Academic Publishers.
  • Cooper, L. L. (2018), “Assessing Students’ Understanding of Variability in Graphical Representations that Share the Common Attribute of Bars,” Journal of Statistics Education, 26, 110–124. DOI: https://doi.org/10.1080/10691898.2018.1473060.
  • Cooper, L. L., and Shore, F. S. (2008), “Students’ Misconceptions in Interpreting Center and Variability of Data Represented via Histograms and Stem-and-Leaf Plots,” Journal of Statistics Education, 16. DOI: https://doi.org/10.1080/10691898.2008.11889559.
  • Cummiskey, K., Adams, B., Pleuss, J., Turner, D., Clark, N., and Watts, K. (2020), “Causal Inference in Introductory Statistics Courses,” Journal of Statistics Education, 28, 2–8. DOI: https://doi.org/10.1080/10691898.2020.1713936.
  • Deane, T., Nomme, K., Jeffery, E., Pollock, C., and Birol, G. (2014), “Development of the Biological Experimental Design Concept Inventory (BEDCI),” CBE-Life Sciences Education, 13, 540–551.
  • Ericsson, K. A., and Simon, H. A. (1998), “How to Study Thinking in Everyday Life: Contrasting Think-Aloud Protocols With Descriptions and Explanations of Thinking,” Mind, Culture, and Activity, 5, 178–186. DOI: https://doi.org/10.1207/s15327884mca0503_3.
  • Evans, C., Reinhart, A., Burckhardt, P., Nugent, R., and Weinberg, G. (2020), “Exploring How Students Reason About Correlation and Causation,” Poster presented at the Electronic Conference on Teaching Statistics (eCOTS). Available at https://www.causeweb.org/cause/ecots/ecots20/posters/2-03.
  • Feldon, D. F. (2007), “The Implications of Research on Expertise for Curriculum and Pedagogy,” Educational Psychology Review, 19, 91–110. DOI: https://doi.org/10.1007/s10648-006-9009-0.
  • Fry, E. (2017), “Introductory Statistics Students’ Conceptual Understanding of Study Design and Conclusions,” PhD thesis, University of Minnesota.
  • GAISE College Report ASA Revision Committee. (2016), “Guidelines for Assessment and Instruction in Statistics Education College Report,” Available at https://www.amstat.org/education/guidelines-for-assessment-and-instruction-in-statistics-education-(gaise)-reports.
  • Garvin-Doxas, K., and Klymkowsky, M. W. (2008), “Understanding Randomness and its Impact on Student Learning: Lessons Learned from Building the Biology Concept Inventory (BCI),” CBE-Life Sciences Education, 7, 227–233.
  • Jorion, N., Gane, B. D., James, K., Schroeder, L., DiBello, L. V., and Pellegrino, J. W. (2015), “An Analytic Framework for Evaluating the Validity of Concept Inventory Claims,” Journal of Engineering Education, 104, 454–496. DOI: https://doi.org/10.1002/jee.20104.
  • Kaczmarczyk, L. C., Petrick, E. R., East, J. P., and Herman, G. L. (2010), “Identifying Student Misconceptions of Programming,” in Proceedings of the 41st ACM Technical Symposium on Computer Science Education, SIGCSE ’10, pp. 107–111, New York, NY, USA: Association for Computing Machinery.
  • Kaplan, J. J., Gabrosek, J. G., Curtiss, P., and Malone, C. (2014), “Investigating Student Understanding of Histograms,” Journal of Statistics Education, 22. DOI: https://doi.org/10.1080/10691898.2014.11889701.
  • Karpierz, K., and Wolfman, S. A. (2014), “Misconceptions and Concept Inventory Questions for Binary Search Trees and Hash Tables,” in Proceedings of the 45th ACM Technical Symposium on Computer Science Education, SIGCSE ’14, pp. 109–114, New York, NY, USA: Association for Computing Machinery.
  • Konold, C. (1989), “Informal Conceptions of Probability,” Cognition and Instruction, 6, 59–98. DOI: https://doi.org/10.1207/s1532690xci0601_3.
  • Lane-Getaz, S. J. (2007), “Development and Validation of a Research-based Assessment: Reasoning about p-values and Statistical Significance,” PhD thesis, University of Minnesota.
  • Leighton, J. P. (2013), “Item Difficulty and Interviewer Knowledge Effects on the Accuracy and Consistency of Examinee Response Processes in Verbal Reports,” Applied Measurement in Education, 26, 136–157. DOI: https://doi.org/10.1080/08957347.2013.765435.
  • Leighton, J. P. (2017), Using Think-Aloud Interviews and Cognitive Labs in Educational Research, New York: Oxford University Press.
  • Leighton, J. P. (2021), “Rethinking Think-Alouds: The Often-Problematic Collection of Response Process Data,” Applied Measurement in Education, 34, 61–74.
  • Lipson, K. (2002), “The Role of Computer Based Technology in Developing Understanding of the Concept of Sampling Distribution,” in Proceedings of the Sixth International Conference on Teaching Statistics.
  • Liu, P., and Li, L. (2015), “An Overview of Metacognitive Awareness and L2 Reading Strategies,” in The Routledge International Handbook of Research on Teaching Thinking, eds. R. Wegerif, L. Li, and J. C. Kaufman, pp. 290–303, New York: Routledge.
  • Lovett, M. (2001), “A Collaborative Convergence on Studying Reasoning Processes: A Case Study in Statistics,” in Cognition and Instruction: Twenty-five Years of Progress, eds. S. M. Carver and D. Klahr, pp. 347–384, Mahwah, NJ: Lawrence Erlbaum Associates Publishers.
  • Lübke, K., Gehrke, M., Horst, J., and Szepannek, G. (2020), “Why We Should Teach Causal Inference: Examples in Linear Regression with Simulated Data,” Journal of Statistics Education, 28, 133–139. DOI: https://doi.org/10.1080/10691898.2020.1752859.
  • McGinness, L. P., and Savage, C. M. (2016), “Developing an Action Concept Inventory,” Physical Review Physics Education Research, 12, 010133. DOI: https://doi.org/10.1103/PhysRevPhysEducRes.12.010133.
  • Meyer, M., Orellana, J., and Reinhart, A. (2020), “Using Cognitive Task Analysis to Uncover Misconceptions in Statistical Inference Courses,” Poster presented at the Electronic Conference on Teaching Statistics (eCOTS). Available at https://www.causeweb.org/cause/ecots/ecots20/posters/2-02.
  • Newman, D. L., Snyder, C. W., Fisk, J. N., and Wright, L. K. (2016), “Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool,” CBE-Life Sciences Education, 15.
  • Nielsen, J., and Landauer, T. K. (1993), “A Mathematical Model of the Finding of Usability Problems,” in Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems, CHI ’93, pp. 206–213, New York, NY, USA: Association for Computing Machinery.
  • Noll, J., and Hancock, S. (2015), “Proper and Paradigmatic Metonymy as a Lens for Characterizing Student Conceptions of Distributions and Sampling,” Educational Studies in Mathematics, 88, 361–383. DOI: https://doi.org/10.1007/s10649-014-9547-1.
  • Nørgaard, M., and Hornbæk, K. (2006), “What do Usability Evaluators Do in Practice? An Explorative Study of Think-Aloud Testing,” in Proceedings of the 6th Conference on Designing Interactive Systems, pp. 209–218.
  • Park, J. (2012), “Developing and Validating an Instrument to Measure College Students’ Inferential Reasoning in Statistics: An Argument-Based Approach to Validation,” PhD thesis, University of Minnesota.
  • Pfannkuch, M., Budgett, S., and Arnold, P. (2015), “Experiment-to-Causation Inference: Understanding Causality in a Probabilistic Setting,” in Reasoning about Uncertainty: Learning and Teaching Informal Inferential Reasoning, eds. A. Zieffler and E. Fry, pp. 95–127, Minneapolis, MN: Catalyst Press.
  • Porter, L., Zingaro, D., Liao, S. N., Taylor, C., Webb, K. C., Lee, C., and Clancy, M. (2019), “BDSI: A Validated Concept Inventory for Basic Data Structures,” in Proceedings of the 2019 ACM Conference on International Computing Education Research, ICER ’19, pp. 111–119, New York, NY, USA: Association for Computing Machinery.
  • Pressley, M., and Afflerbach, P. (1995), Verbal Protocols of Reading: The Nature of Constructively Responsive Reading, New York: Routledge.
  • Roberts, V. L., and Fels, D. I. (2006), “Methods for Inclusion: Employing Think Aloud Protocols in Software Usability Studies with Individuals who are Deaf,” International Journal of Human-Computer Studies, 64, 489–501. DOI: https://doi.org/10.1016/j.ijhcs.2005.11.001.
  • Sabbag, A. (2016), “Examining The Relationship Between Statistical Literacy and Statistical Reasoning,” PhD thesis, University of Minnesota.
  • Sawilowsky, S. S. (2004), “Teaching Random Assignment: Do You Believe it Works?” Journal of Modern Applied Statistical Methods, 3, 221–226. DOI: https://doi.org/10.22237/jmasm/1083370980.
  • Taylor, C., Clancy, M., Webb, K. C., Zingaro, D., Lee, C., and Porter, L. (2020), “The Practical Details of Building a CS Concept Inventory,” in Proceedings of the 51st ACM Technical Symposium on Computer Science Education, SIGCSE ’20, pp. 372–378, New York, NY, USA: Association for Computing Machinery.
  • Theobold, A. S. (2021), “Oral Exams: A More Meaningful Assessment of Students’ Understanding,” Journal of Statistics and Data Science Education, 29, 156–159. DOI: https://doi.org/10.1080/26939169.2021.1914527.
  • Williams, A. M. (1999), “Novice Students’ Conceptual Knowledge of Statistical Hypothesis Testing,” in Making the Difference: Proceedings of the Twenty-second Annual Conference of the Mathematics Education Research Group of Australasia, eds. J. M. Truran, and K. M. Truran, pp. 554–560, Adelaide, South Australia: MERGA.
  • Willis, G. B. (2005), Cognitive Interviewing, Thousand Oaks, CA: SAGE Publications.
  • Woodard, V., and Lee, H. (2021), “How Students use Statistical Computing in Problem Solving,” Journal of Statistics and Data Science Education, 29, S145–S156. DOI: https://doi.org/10.1080/10691898.2020.1847007.
  • Wren, D., and Barbera, J. (2013), “Gathering Evidence for Validity During the Design, Development, and Qualitative Evaluation of Thermochemistry Concept Inventory Items,” Journal of Chemical Education, 90, 1590–1601. DOI: https://doi.org/10.1021/ed400384g.
  • Ziegler, L. A. (2014), “Reconceptualizing Statistical Literacy: Developing an Assessment for the Modern Introductory Statistics Course,” PhD thesis, University of Minnesota.