Original Articles

History Assessments of Thinking: A Validity Study

Pages 118-144 | Received 28 Sep 2017, Accepted 27 Jun 2018, Published online: 10 Sep 2018

References

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
  • Anderson, J. (2004). Cognitive psychology and its implications (6th ed.). New York: Worth.
  • Andrews, T., & Burke, F. (2007). What does it mean to think historically? Perspectives. Retrieved from: http://www.historians.org/publications-and-directories/perspectives-on-history/january-2007/what-does-it-mean-to-think-hisorically
  • Appleby, J., Hunt, L., & Jacob, M. (1994). Telling the truth about history. New York: W. W. Norton & Company.
  • Bain, R. (2006). Rounding up the usual suspects: Facing the authority hidden in the history classroom. Teachers College Record, 108, 2080–2114. doi:10.1111/j.1467-9620.2006.00775.x
  • Bain, R. B., & Shreiner, T. L. (2005). Issues and options in creating a national assessment in world history. History Teacher, 38, 241–271. doi:10.2307/1555722
  • Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80, 139–148. doi:10.1177/003172171009200119
  • Booth, M. (1993). Students’ historical thinking and the national curriculum in England. Theory and Research in Social Education, 21, 105–127. doi:10.1080/00933104.1993.10505695
  • Brante, E. W., & Strømsø, H. I. (2017). Sourcing in text comprehension: A review of interventions targeting sourcing skills. Educational Psychology Review, 38, 1–27. doi:10.1007/s10648-017-9421-7
  • Breakstone, J. (2014). Try, try, try again: The process of designing new history assessments. Theory & Research in Social Education, 42, 453–485. doi:10.1080/00933104.2014.965860
  • Breland, H. M., Danos, D. O., Kahn, H. D., Kubota, M. Y., & Bonner, M. W. (1994). Performance versus objective testing and gender: An exploratory study of an Advanced Placement history examination. Journal of Educational Measurement, 31, 275–293. doi:10.1111/j.1745-3984.1994.tb00447.x
  • Brown, T. (2009). Change by design: How design thinking transforms organizations and inspires innovation. New York: Harper Collins.
  • California Department of Education. (1998). History-social science content standards for California public schools. Sacramento, CA: California Department of Education.
  • Carr, E. H. (1961). What is history? New York: Vintage.
  • Chi, M., Glaser, R., & Farr, M. (1988). The nature of expertise. Hillsdale, NJ: Erlbaum.
  • College Board. (2014). AP United States history: Course and exam description including the curriculum framework effective fall 2014. New York: College Board.
  • College Board. (2017). AP United States history: 2017 free-response questions. New York: College Board. Retrieved from https://secure-media.collegeboard.org/ap/pdf/ap-us-history-frq-2017.pdf
  • Collingwood, R. G. (1946/1993). The idea of history: Revised edition with lectures 1926-1928. Oxford: Clarendon.
  • Cronbach, L. (1988). Five perspectives on validity argument. In H. Wainer & H. Braun (Eds.), Test validity (pp. 3–17). Hillsdale, NJ: Erlbaum.
  • Danto, A. C. (1965). Analytical philosophy of history. Cambridge, England: Cambridge University Press.
  • Davidson, C. N. (2011). Now you see it: How the brain science of attention will transform the way we live, work, and learn. New York: Viking.
  • De La Paz, S., Felton, M., Monte-Sano, C., Croninger, B., Jackson, C., Deogracias, J. S., & Hoffman, B.P. (2014). Developing historical reading and writing: Relationships among professional development, fidelity of implementation, and student learning. Theory and Research in Social Education, 42, 228–274. doi:10.1080/00933104.2014.908754
  • Dickinson, A., & Lee, P. J. (1984). Making sense of history. In A. Dickinson, P. J. Lee, & P. Rogers (Eds.), Learning history (pp. 117–153). London: Heinemann.
  • Eliasson, P., Alvén, F., Yngvéus, C. A., & Rosenlund, D. (2015). Historical consciousness and historical thinking reflected in large-scale assessment in Sweden. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking (pp. 171–182). New York: Routledge.
  • Ercikan, K., & Seixas, P. (2015). New directions in assessing historical thinking. New York: Routledge.
  • Ercikan, K., Seixas, P., Lyons-Thomas, J., & Gibson, L. (2015). Cognitive validity evidence for validating assessments of historical thinking. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking (pp. 206–220). New York: Routledge.
  • Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT Press.
  • Ericsson, K., & Smith, J. (1991). Toward a general theory of expertise: Prospects and limits. New York: Cambridge University Press.
  • Evans, R. J. (2000). In defense of history. New York, NY: Norton.
  • Farr, R., Pritchard, R., & Smitten, B. (1990). A description of what happens when an examinee takes a multiple-choice reading comprehension test. Journal of Educational Measurement, 27, 209–226. doi:10.1111/j.1745-3984.1990.tb00744.x
  • Fischer, D. H. (1970). Historians’ fallacies: Toward a logic of historical thought. New York: Harper Perennial.
  • Frederiksen, N. (1984). The real test bias: Influences of testing on teaching and learning. American Psychologist, 39, 193–202. doi:10.1037/0003-066X.39.3.193
  • Gierl, M. J., Bulut, O., Guo, Q., & Zhang, X. (2017). Developing, analyzing, and using distractors for multiple-choice tests in education: A comprehensive review. Review of Educational Research, 87, 1082–1116. doi:10.3102/0034654317726529
  • Gordon Commission on the Future of Assessment in Education. (2013). A public policy statement. Retrieved from http://gordoncommission.org/rsc/pdfs/gordon_commission_public_policy_report.pdf
  • Gradwell, J. M. (2006). Teaching in spite of, rather than because of, the test. In S. G. Grant (Ed.), Measuring history: Cases of state-level testing across the United States (pp. 157–176). Greenwich, CT: Information Age.
  • Grant, S. G. (2006). Research on history tests. In S. G. Grant (Ed.), Measuring history: Cases of state-level testing across the United States (pp. 29–56). Greenwich, CT: Information Age.
  • Grant, S. G., Gradwell, J. M., & Cimbricz, S. K. (2004). A question of authenticity: The document-based question as an assessment of students’ knowledge of history. Journal of Curriculum and Supervision, 19, 309–337.
  • Greeno, J., Collins, A., & Resnick, L. (1996). Cognition and learning. In D. Berliner & R. Calfee (Eds.), Handbook of educational psychology (pp. 15–47). New York: Simon & Schuster Macmillan.
  • Guilford, J. P. (1959). Three faces of intellect. American Psychologist, 14, 469–479. doi:10.1037/h0046827
  • Haladyna, T. M. (2004). Developing and validating multiple-choice test items (3rd ed.). London: Erlbaum.
  • Haladyna, T. M., & Downing, S. M. (1989). Validity of a taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2, 51–78. doi:10.1207/s15324818ame0201_4
  • Haladyna, T. M., & Rodriguez, M. C. (2013). Developing and validating test items. New York, NY: Routledge.
  • Hamilton, L. S., Nussbaum, E. M., & Snow, R. E. (1997). Interview procedures for validating science assessments. Applied Measurement in Education, 10, 181–200. doi:10.1207/s15324818ame1002_5
  • Hasso Plattner Institute of Design at Stanford. (n.d.). Virtual crash course in design thinking. Retrieved from https://dschool.stanford.edu/dgift/
  • Holt, T. (1990). Thinking historically: Narrative, imagination, and understanding. New York: College Entrance Examination Board.
  • Kaliski, P., Smith, K., & Huff, K. (2015). The importance of construct validity evidence in history assessment: What is often overlooked or misunderstood? In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking (pp. 195–205). New York: Routledge.
  • Kane, M. T. (2006). Validation. In R. L. Brennan (Ed.), Educational measurement (4th ed., pp. 17–64). Westport, CT: Praeger.
  • Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50, 1–73. doi:10.1111/jedm.12000
  • Katz, I. R., Bennett, R. E., & Berger, A. E. (2000). Effects of response format on difficulty of SAT-mathematics items: It’s not the strategy. Journal of Educational Measurement, 37, 39–57. doi:10.1111/j.1745-3984.2000.tb01075.x
  • Kelly, F. J. (1914). Teachers’ marks: Their variability and standardization (Unpublished doctoral dissertation). Teachers College, Columbia University, New York, NY.
  • Kelly, F. J. (1916). The Kansas silent reading tests. Journal of Educational Psychology, 7, 63–80. doi:10.1037/h0073542
  • Körber, A. (2011). German history didactics: From historical consciousness to historical competencies–and Beyond. In H. Bjerg, C. Lenz, & E. Thorstensen (Eds.), Historicizing the uses of the past: Scandinavian perspectives on history culture, historical consciousness and didactics of history related to World War II (pp. 145–164). London: Transaction.
  • Körber, A., & Meyer-Hamme, J. (2015). Historical thinking, competencies, and their measurement. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking (pp. 89–101). New York: Routledge.
  • Koretz, D. (2008). Measuring up. Cambridge, MA: Harvard University Press.
  • Kuusela, H., & Pallab, P. (2000). A comparison of concurrent and retrospective verbal protocol analysis. The American Journal of Psychology, 113, 387–404. doi:10.2307/1423365
  • Leighton, J. P. (2004). Avoiding misconception, misuse, and missed opportunities: The collection of verbal reports in educational achievement testing. Educational Measurement: Issues and Practice, 23, 6–15. doi:10.1111/j.1745-3992.2004.tb00164.x
  • Leinhardt, G., & Young, K. M. (1996). Two texts, three readers: Distance and expertise in reading history. Cognition and Instruction, 14, 441–486. doi:10.1207/s1532690xci1404_2
  • Lévesque, S. (2008). Thinking historically: Educating students for the twenty-first century. Toronto: University of Toronto Press.
  • Levstik, L., & Barton, K. C. (1996). ‘They still use some of their past’: Historical salience in elementary children’s chronological thinking. Journal of Curriculum Studies, 28, 531–576. doi:10.1080/0022027980280502
  • Madaus, G., West, M., Harmon, M., Lomax, R., & Viator, K. (1992). The influence of testing on teaching math and science in grades 4–12. Boston: Boston College, Center for the Study of Testing, Evaluation, and Educational Policy.
  • Mandell, N. (2008). Thinking like a historian: A framework for teaching and learning. OAH Magazine of History, 22, 55–59. doi:10.1093/maghis/22.2.55
  • Martin, D., Maldonado, S. I., Schneider, J., & Smith, M. (2011). A report on the state of history education: State policies and national programs. National History Education Clearinghouse. Retrieved from http://teachinghistory.org/system/files/teachinghistory_special_report_2011.pdf
  • Martin, R. (2009). The design of business: Why design thinking is the next competitive advantage. Boston: Harvard Business School Publishing.
  • Martinez, M. E. (1999). Cognition and the question of test item format. Educational Psychologist, 34, 207–218. doi:10.1207/s15326985ep3404_2
  • Mink, L. O. (1987). Historical understanding. Ithaca, NY: Cornell.
  • Mislevy, R. J., Almond, R. G., & Lukas, J. F. (2004). A brief introduction to evidence-centered design (CSE Report 632). Los Angeles: Center for Research on Evaluation, Standards, and Student Testing.
  • Mislevy, R. J., & Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educational Measurement: Issues and Practice, 25, 6–20. doi:10.1111/j.1745-3992.2006.00075.x
  • Mislevy, R. J., & Riconscente, M. M. (2006). Evidence-centered assessment design: Layers, concepts, and terminology. In S. Downing & T. Haladyna (Eds.), Handbook of test development (pp. 61–90). Mahwah, NJ: Erlbaum.
  • Monte-Sano, C., & De La Paz, S. (2012). Using writing tasks to elicit adolescents’ historical reasoning. Journal of Literacy Research, 44, 273–299. doi:10.1177/1086296X12450445
  • Mosborg, S. (2002). Speaking of history: How adolescents use their knowledge of history in reading the daily news. Cognition and Instruction, 20, 323–358. doi:10.1207/S1532690XCI2003_2
  • National Center for History in the Schools. (n.d.). Historical thinking standards. Retrieved from http://www.nchs.ucla.edu/Standards/historical-thinking-standards-1
  • National Governors Association, & Council of Chief State School Officers. (2010). Common core state standards for English language arts & literacy in history/social studies, science, and technical subjects. Retrieved from http://www.corestandards.org/assets/CCSSI_ELA%20Standards.pdf
  • National Research Council. (2013). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: National Academies Press.
  • Nisbett, R., & Wilson, T. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84, 231–259. doi:10.1037/0033-295X.84.3.231
  • Nokes, J. D., Dole, J. A., & Hacker, D. J. (2007). Teaching high school students to use heuristics while reading historical texts. Journal of Educational Psychology, 99, 492–495. doi:10.1037/0022-0663.99.3.492
  • Ohio Department of Education. (2010). Ohio’s new learning standards: Social studies standards. Retrieved from https://education.ohio.gov/getattachment/Topics/Ohio-s-New-Learning-Standards/Social-Studies/SS-Standards.pdf.aspx
  • Osterlind, S. J. (1998). Constructing test items: Multiple-choice, constructed response, performance, and other formats. Boston: Kluwer.
  • Paxton, R. J. (2003). Don’t know much about history – never did. Phi Delta Kappan, 85, 264–273. doi:10.1177/003172170308500405
  • Pellegrino, J., Chudowsky, N., & Glaser, R. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academies Press.
  • Pellegrino, J., DiBello, L., & Goldman, S. (2016). A framework for conceptualizing and evaluating the validity of instructionally relevant assessments. Educational Psychologist, 51, 59–81. doi:10.1080/00461520.2016.1145550
  • Popham, W. J. (2003). Test better, teach better: The instructional role of assessment. Alexandria, VA: ASCD.
  • Pressley, M., & Afflerbach, P. (1995). Verbal protocols of reading: The nature of constructively responsive reading. Hillsdale, NJ: Erlbaum.
  • Rantala, J. (2012). How Finnish adolescents understand history: Disciplinary thinking in history and its assessment among 16-year-old Finns. Education Sciences, 2, 193–207. doi:10.3390/educsci2040193
  • Reich, G. A. (2009). Testing historical knowledge: Standards, multiple-choice questions and student reasoning. Theory and Research in Social Education, 37, 325–360. doi:10.1080/00933104.2009.10473401
  • Reisman, A. (2012). Reading like a historian: A document-based history curriculum intervention in urban high schools. Cognition and Instruction, 30, 86–112. doi:10.1080/07370008.2011.634081
  • Resnick, L., & Resnick, D. (1992). Assessing the thinking curriculum: New tools for educational reform. In B. Gifford & M. O'Connor (Eds.), Changing assessments: Alternative views of aptitude, achievement, and instruction (pp. 37–55). Boston: Kluwer.
  • Robelen, E. (2011, February 4). What to expect from the revised AP U.S. History program. Education Week. Retrieved from http://blogs.edweek.org/edweek/curriculum/2011/02/what_to_expect_from_the_revise.html
  • Rodriguez, M. C. (2011). Item-writing practice and evidence. In S. N. Elliott, R. J. Kettler, P. A. Bedlow, & A. Kurz (Eds.), Handbook of accessible achievement tests for all students: Bridging the gaps between research, policy, and practice (pp. 201–216). New York, NY: Springer.
  • Rouet, J. F., Favart, M., Britt, M. A., & Perfetti, C. A. (1997). Studying and using multiple documents in history: Effects of discipline expertise. Cognition and Instruction, 15, 85–106. doi:10.1207/s1532690xci1501_3
  • Ruiz-Primo, M. A., Shavelson, R. J., Li, M., & Schultz, S. (2001). On the validity of cognitive interpretations of scores from alternative concept-mapping techniques. Educational Assessment, 7, 99–141. doi:10.1207/S15326977EA0702_2
  • Rüsen, J. (1987). Historical narration: Foundation, types, reason. History and Theory, 26, 87–97. doi:10.2307/2505047
  • Samuelsson, J., & Wendell, J. (2016). Historical thinking about sources in the context of a standards-based curriculum: A Swedish case. The Curriculum Journal, 27, 479–499. doi:10.1080/09585176.2016.1195275
  • Seixas, P. (2017). A model of historical thinking. Educational Philosophy and Theory, 49, 593–605. doi:10.1080/00131857.2015.1101363
  • Seixas, P., Gibson, L., & Ercikan, K. (2015). A design process for assessing historical thinking: The case of a one-hour test. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking (pp. 102–116). New York: Routledge.
  • Seixas, P., Morton, T., Colyer, J., & Fornazzari, S. (2013). The big six: Historical thinking concepts. Toronto: Nelson Education.
  • Seixas, P., & Peck, C. (2004). Teaching historical thinking. In A. Sears & I. Wright (Eds.), Challenges and prospects for Canadian social studies (pp. 109–117). Vancouver: Pacific Education Press.
  • Shanahan, C., Shanahan, T., & Misischia, C. (2011). Analysis of expert readers in three disciplines: History, mathematics, and chemistry. Journal of Literacy Research, 43, 393–429. doi:10.1177/1086296X11424071
  • Shanahan, T., & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78, 40–59. doi:10.17763/haer.78.1.v62444321p602101
  • Shemilt, D. (1983). The devil’s locomotive. History and Theory, 22, 1–18. doi:10.2307/2505213
  • Shepard, L. A. (1992). Will national tests improve student learning? (CSE Technical Report 342). Los Angeles: Center for Research on Evaluation, Standards, and Student Testing.
  • Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29, 4–14. doi:10.3102/0013189X029007004
  • Shreiner, T. L. (2014). Using historical knowledge to reason about contemporary political issues: An expert–novice study. Cognition and Instruction, 32, 313–352. doi:10.1080/07370008.2014.948680
  • Smith, M. D. (2017). Cognitive validity: Can multiple-choice items tap historical thinking processes? American Educational Research Journal, 54, 1256–1287. doi:10.3102/0002831217717949
  • Stiggins, R. (2002). Assessment crisis: The absence of assessment for learning. Phi Delta Kappan, 83, 758–765. doi:10.1177/003172170208301010
  • Taylor, K. L., & Dionne, J. P. (2000). Accessing problem-solving strategy knowledge: The complementary use of concurrent verbal protocols and retrospective debriefing. Journal of Educational Psychology, 92, 413–425. doi:10.1037/0022-0663.92.3.413
  • van Drie, J., & van Boxtel, C. (2008). Historical reasoning: Towards a framework for analyzing students’ reasoning about the past. Educational Psychology Review, 20, 87–110. doi:10.1007/s10648-007-9056-1
  • VanSledright, B. A. (2004). What does it mean to think historically … and how do you teach it? Social Education, 68, 230–233.
  • VanSledright, B. (2014). Assessing historical thinking and understanding: Innovative designs for new standards. New York: Taylor & Francis.
  • von Borries, B. (1997). Concepts of historical thinking and historical learning in the perspective of German students and teachers. International Journal of Educational Research, 27, 211–220. doi:10.1016/S0883-0355(97)89729-7
  • Voss, J., & Wiley, J. (2006). Expertise in history. In K. A. Ericsson, N. Charness, P. Feltovich, & R. R. Hoffman (Eds.), Cambridge handbook of expertise and expert performance. New York: Cambridge.
  • Wainer, H. (2011). Uneducated guesses: Using evidence to uncover misguided education policies. Princeton, NJ: Princeton University Press.
  • Waldis, M., Hodel, J., Thünemann, H., Zülsdorf-Kersting, M., & Ziegler, B. (2015). Material-based and open-ended writing tasks for assessing narrative competence among students. In K. Ercikan & P. Seixas (Eds.), New directions in assessing historical thinking (pp. 117–131). New York: Routledge.
  • White, P. (1980). Limitations on verbal reports of internal events: A refutation of Nisbett and Wilson and of Bem. Psychological Review, 87, 105–112. doi:10.1037/0033-295X.87.1.105
  • Wineburg, S. (1991). Historical problem solving: A study of the cognitive processes used in the evaluation of documentary and pictorial evidence. Journal of Educational Psychology, 83, 73–87. doi:10.1037/0022-0663.83.1.73
  • Wineburg, S. (1998). Reading Abraham Lincoln: An expert/expert study in the interpretation of historical texts. Cognitive Science, 22, 319–346. doi:10.1207/s15516709cog2203_3
  • Wineburg, S. (2001). Historical thinking and other unnatural acts: Charting the future of teaching the past. Philadelphia: Temple.
  • Wineburg, S. (2004). Crazy for history. Journal of American History, 90, 1401–1414. doi:10.2307/3660360
  • Young, K. M., & Leinhardt, G. (1998). Writing from primary documents: A way of knowing history. Written Communication, 15, 25–68. doi:10.1177/0741088398015001002
