Educational Psychology
An International Journal of Experimental Educational Psychology
Volume 36, 2016 - Issue 6: Cognitive Diagnostic Assessment

Cognitive diagnostic models for tests with multiple-choice and constructed-response items

Pages 1115-1133 | Received 17 Jul 2014, Accepted 09 Mar 2016, Published online: 18 Apr 2016

References

  • Arieli-Attali, M., & Liu, Y. (2015). Beyond correctness: Development and validation of concept-based categorical scoring rubrics for diagnostic purposes. Educational Psychology. Advance online publication. doi:10.1080/01443410.2015.1031088
  • Ashlock, R. B. (1994). Error patterns in computation (6th ed.). Englewood Cliffs, NJ: Prentice Hall.
  • Attali, Y., & Burstein, J. (2006). Automated essay scoring with e-rater V.2. The Journal of Technology, Learning, and Assessment, 4(3). Retrieved from http://napoleon.bc.edu/ojs/index.php/jtla/article/view/1650
  • Attali, Y., Powers, D., Freedman, M., Harrison, M., & Obetz, S. (2008). Automated scoring of short-answer open-ended GRE subject test items (GRE Board Research Rep. No. GRE-04-02). Princeton, NJ: ETS.
  • Bradshaw, L., & Templin, J. (2014). Combining item response theory and diagnostic classification models: A psychometric model for scaling ability and diagnosing misconceptions. Psychometrika, 79, 403–425. doi:10.1007/s11336-013-9350-4
  • Carlin, B. P., & Louis, T. A. (2000). Bayes and empirical Bayes methods for data analysis. New York, NY: Chapman & Hall.
  • Chen, P., Xin, T., Wang, C., & Chang, H. H. (2012). Online calibration methods for the DINA model with independent attributes in CD-CAT. Psychometrika, 77, 201–222. doi:10.1007/s11336-012-9255-7
  • de la Torre, J. (2009a). A cognitive diagnosis model for cognitively based multiple-choice options. Applied Psychological Measurement, 33, 163–183. doi:10.1177/0146621608320523
  • de la Torre, J. (2009b). DINA model and parameter estimation: A didactic. Journal of Educational and Behavioral Statistics, 34, 115–130. doi:10.3102/1076998607309474
  • de la Torre, J. (2011). The generalized DINA model framework. Psychometrika, 76, 179–199. doi:10.1007/s11336-011-9207-7
  • de la Torre, J., & Douglas, J. (2004). Higher-order latent trait models for cognitive diagnosis. Psychometrika, 69, 333–353. doi:10.1007/BF02295640
  • de la Torre, J., & Lee, Y.-S. (2010). A note on the invariance of the DINA model parameters. Journal of Educational Measurement, 47, 115–127. doi:10.1111/j.1745-3984.2009.00102.x
  • de la Torre, J., van der Ark, L. A., & Rossi, G. (2015). Analysis of clinical data from cognitive diagnosis modeling framework. Measurement and Evaluation in Counseling and Development. Advance online publication. doi:10.1177/0748175615569110
  • Doornik, J. A. (2002). Object-oriented matrix programming using Ox (Version 3.1). [Computer software]. London: Timberlake Consultants Press.
  • Ercikan, K., Schwarz, R. D., Julian, M. W., Burket, G. R., Weber, M. M., & Link, V. (1998). Calibration and scoring of tests with multiple-choice and constructed-response item types. Journal of Educational Measurement, 35, 137–154. doi:10.1111/j.1745-3984.1998.tb00531.x
  • Feasel, K., Henson, R., & Jones, L. (2004). Analysis of the gambling research instrument (GRI). Unpublished manuscript.
  • Golke, S., Dörfler, T., & Artelt, C. (2015). The impact of elaborated feedback on text comprehension within a computer-based assessment. Learning and Instruction, 39, 123–136. doi:10.1016/j.learninstruc.2015.05.009
  • Haertel, E. H. (1989). Using restricted latent class models to map the skill structure of achievement items. Journal of Educational Measurement, 26, 301–321. doi:10.1111/j.1745-3984.1989.tb00336.x
  • Harks, B., Rakoczy, K., Hattie, J., Besser, M., & Klieme, E. (2014). The effects of feedback on achievement, interest and self-evaluation: The role of feedback’s perceived usefulness. Educational Psychology, 34, 269–290. doi:10.1080/01443410.2013.785384
  • Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77, 81–112. doi:10.3102/003465430298487
  • Huang, T.-W., & Wu, P.-C. (2013). Classroom-based cognitive diagnostic model for a teacher-made fraction-decimal test. Educational Technology & Society, 16, 347–361.
  • Junker, B. W., & Sijtsma, K. (2001). Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 25, 258–272. doi:10.1177/01466210122032064
  • Ketterlin-Geller, L. R., & Yovanoff, P. (2009). Diagnostic assessments in mathematics to support instructional decision making. Practical Assessment, Research and Evaluation, 14(16), 1–11.
  • Lee, H. (2016). Which feedback is more effective for pursuing multiple goals of differing importance? The interaction effects of goal importance and performance feedback type on self-regulation and task achievement. Educational Psychology, 36, 297–322. doi:10.1080/01443410.2014.995596
  • Lesh, R., Post, T., & Behr, M. (1988). Proportional reasoning. In J. Hiebert & M. Behr (Eds.), Number concepts and operations in the middle grades (pp. 93–118). Reston, VA: Lawrence Erlbaum & National Council of Teachers of Mathematics.
  • Lord, F. M. (1980). Application of item response theory to practical testing problems. Hillsdale, NJ: Lawrence Erlbaum. doi:10.4324/9780203056615
  • Lu, C.-H. (2014). The discussion of problem-solving strategies and math performance in the direct proportion unit of the grade school students (Unpublished master’s thesis). National Taichung University of Education, Taichung, Taiwan.
  • Maris, E. (1999). Estimating multiple classification latent class models. Psychometrika, 64, 187–212. doi:10.1007/BF02294535
  • Modestou, M., & Gagatsis, A. (2007). Students’ improper proportional reasoning: A result of the epistemological obstacle of “linearity”. Educational Psychology, 27, 75–92. doi:10.1080/01443410601061462
  • Neidorf, T. S., Binkley, M., Gattis, K., & Nohara, D. (2006). Comparing mathematics content in the National Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study (TIMSS), and Program for International Student Assessment (PISA) 2003 assessments (NCES 2006-029). Washington, DC: National Center for Education Statistics, U.S. Department of Education.
  • Pekrun, R., Cusack, A., Murayama, K., Elliot, A. J., & Thomas, K. (2014). The power of anticipated feedback: Effects on students’ achievement goals and achievement emotions. Learning and Instruction, 29, 115–124. doi:10.1016/j.learninstruc.2013.09.002
  • Roberts, M. R., Alves, C. B., Chu, M.-W., Thompson, M., Bahry, L. M., & Gotzmann, A. (2014). Testing expert-based versus student-based cognitive models for a grade 3 diagnostic mathematics assessment. Applied Measurement in Education, 27, 173–195. doi:10.1080/08957347.2014.905787
  • Rossi, G., Sloore, H., & Derksen, J. (2008). The adaptation of the MCMI-III in two non-English speaking countries: State of the art of the Dutch language version. In T. Millon & C. Bloom (Eds.), The Millon inventories: A practitioner’s guide to personalized clinical assessment (2nd ed., pp. 369–386). New York, NY: Guilford Press.
  • Rupp, A., & Templin, J. (2008). The effects of q-matrix misspecification on parameter estimates and classification accuracy in the DINA model. Educational and Psychological Measurement, 68, 78–96. doi:10.1177/0013164407301545
  • Rupp, A., Templin, J., & Henson, R. A. (2010). Diagnostic measurement: Theory, methods, and applications. New York, NY: Guilford Press.
  • Rust, C. (2007). Towards a scholarship of assessment. Assessment and Evaluation in Higher Education, 32, 229–237. doi:10.1080/02602930600805192
  • Sykes, R. C., & Hou, L. (2003). Weighting constructed-response items in IRT-based exams. Applied Measurement in Education, 16, 257–275. doi:10.1207/S15324818AME1604_1
  • Tatsuoka, K. K. (1983). Rule space: An approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 20, 345–354. doi:10.1111/j.1745-3984.1983.tb00212.x
  • Templin, J. L., & Henson, R. A. (2006). Measurement of psychological disorders using cognitive diagnosis models. Psychological Methods, 11, 287–305. doi:10.1037/1082-989X.11.3.287
  • Timmers, C. F., Walraven, A., & Veldkamp, B. P. (2015). The effect of regulation feedback in a computer-based formative assessment on information problem solving. Computers & Education, 87, 1–9. doi:10.1016/j.compedu.2015.03.012
  • Van der Kleij, F. M., Feskens, R. C. W., & Eggen, T. J. H. M. (2015). Effects of feedback in a computer-based learning environment on students’ learning outcomes: A meta-analysis. Review of Educational Research, 85, 475–511. doi:10.3102/0034654314564881
  • Williamson, D. M., Bejar, I. I., & Hone, A. S. (1999). ‘Mental model’ comparison of automated and human scoring. Journal of Educational Measurement, 36, 158–184. doi:10.1111/j.1745-3984.1999.tb00552.x
  • Williamson, D. M., Bejar, I. I., & Sax, A. (2004). Automated tools for subject matter expert evaluation of automated scoring. Applied Measurement in Education, 17, 323–357. doi:10.1207/s15324818ame1704_1
  • Yang, C.-W., Kuo, B.-C., & Liao, C.-H. (2011). A HO-IRT based diagnostic assessment system with constructed-response items. Turkish Online Journal of Educational Technology, 10, 46–51.
  • Zenisky, A. L., & Sireci, S. G. (2002). Technological innovations in large-scale assessment. Applied Measurement in Education, 15, 337–362. doi:10.1207/S15324818AME1504_02
