Research Article

Unpacking Response Process Issues Encountered When Developing a Mathematics Teachers’ Pedagogical Content Knowledge (PCK) Assessment
