Validating theoretical assumptions about reading with cognitive diagnosis models

Pages 105-129 | Received 30 Mar 2020, Accepted 13 Apr 2021, Published online: 09 Jun 2021
