Educational Psychology
An International Journal of Experimental Educational Psychology
Volume 36, 2016 - Issue 6: Cognitive Diagnostic Assessment
Editorial

Cognitive diagnostic assessments, when appropriately designed, can measure students’ knowledge, attributes, or processing skills, and can provide information about students’ cognitive strengths and weaknesses. With recent advances in cognitive diagnosis modelling techniques and developments in instructional technology, diagnostic measurement in complex learning environments has become more feasible. This special issue covers various aspects of cognitively diagnostic assessments, and includes articles pertaining to innovative approaches to research, practical issues related to the construction of cognitive diagnostic assessments, theories and applications of cognitive diagnosis modelling, and uses of digitally-mediated diagnostic assessments. Collectively, the contributors to this special issue have demonstrated how a variety of topics relevant to cognitive diagnostic assessments can be explored.

‘Exploring reading comprehension skill relationships through the G-DINA model’: In Chen and Chen (Citation2016), the generalised deterministic inputs, noisy ‘and’ gate (G-DINA; de la Torre, Citation2011) model was applied to analyse PISA English reading items measuring five reading comprehension skills, using a sample of British secondary school students. The study found that highly interactive and hierarchical reading comprehension skills can be explained by the G-DINA model.

‘Statistical classification for cognitive diagnostic assessment: An artificial neural network approach’: Cui, Gierl, and Guo (Citation2016) demonstrated the application of artificial neural networks to analyse cognitive diagnostic assessment data. The performance of two types of artificial neural networks was evaluated under different conditions, and the results showed that artificial neural networks are a feasible nonparametric classification approach for cognitive diagnostic assessments.

‘Beyond correctness: Development and validation of concept-based categorical scoring rubrics for diagnostic purposes’: Arieli-Attali and Liu (Citation2016) introduced a methodology for developing concept-based rubrics that classify the responses of diagnostic assessments into concept-based nominal categories. In addition to providing more information about student performance than conventional scoring methods, this scoring method can also be used in conjunction with existing cognitive diagnosis models, such as the MC-DINA (de la Torre, Citation2009) and G-DINA (de la Torre, Citation2011) models, as well as the models proposed in this special issue (Kuo, Chen, Yang, & Mok, Citation2016).

‘Exploratory and confirmatory factor analyses in reading-related cognitive component among grade four students in Thailand’: Liao, Kuo, Deenang, and Mok (Citation2016) developed a cognitive diagnostic assessment of reading components in the Thai language. A three-factor structure (i.e. measuring morphological awareness, decoding-related skill and rapid naming skill) was uncovered using exploratory and confirmatory factor analyses. This represents a promising development, as assessments from this work can provide rich information for diagnosing dyslexia symptoms in Thai language learning.

‘Cognitive diagnostic models for the test with multiple-choice and constructed-response items’: Traditionally, cognitive diagnosis models have been applied to classify the status of students’ concepts or skills. Kuo et al. (Citation2016) proposed several models that can also be used to identify students’ misconceptions in tests with multiple-choice and constructed-response items. Analyses of data from an experiment demonstrated the viability of the proposed methods.

As a whole, these papers highlight but a few of the various research directions that can be taken to enhance the development and use of cognitive diagnostic assessments. To move the field forward, it is important to be cognisant of the fact that cognitive diagnostic assessments are only useful to the extent that their development and analysis can be synchronised, and their intended use clarified at the outset. Furthermore, for cognitive diagnostic assessments to be of real value in practical classroom settings, teachers, who work with students on a daily basis, need to be equipped with tools that will allow them to link information from these assessments to their instructional practice. Finally, recent technological developments can and should be harnessed, not only to transform and improve the practice of diagnostic assessment, but also to facilitate a more seamless integration of assessment and instruction.

We would like to thank all the reviewers for their high-quality reviews and the professional advice they gave to the authors. We are confident that the five papers in this issue will enrich our view of cognitive diagnostic assessments, and we hope that readers will be inspired to carry out further research in this area.

Bor-Chen Kuo
National Taichung University of Education, Taiwan
Jimmy de la Torre
Rutgers, The State University of New Jersey, United States

References

  • Arieli-Attali, M., & Liu, Y. (2016). Beyond correctness: Development and validation of concept-based categorical scoring rubrics for diagnostic purposes. Educational Psychology, 36(6), 1096–1119.
  • Chen, H., & Chen, J. (2016). Exploring reading comprehension skill relationships through the G-DINA model. Educational Psychology, 36(6), 1052–1071.
  • Cui, Y., Gierl, M., & Guo, Q. (2016). Statistical classification for cognitive diagnostic assessment: An artificial neural network approach. Educational Psychology, 36(6), 1072–1095.
  • Kuo, B.-C., Chen, C.-H., Yang, C.-W., & Mok, M. M. C. (2016). Cognitive diagnostic models for tests with multiple-choice and constructed-response items. Educational Psychology, 36(6), 1136–1158.
  • Liao, C.-H., Kuo, B.-C., Deenang, E., & Mok, M. M. C. (2016). Exploratory and confirmatory factor analyses in reading-related cognitive component among grade four students in Thailand. Educational Psychology, 36(6), 1120–1135.
  • de la Torre, J. (2009). A cognitive diagnosis model for cognitively based multiple-choice options. Applied Psychological Measurement, 32, 163–183. doi:10.1177/0146621608320523
  • de la Torre, J. (2011). The generalized DINA model framework. Psychometrika, 76, 179–199. doi:10.1007/s11336-011-9207-7
