
Applying machine learning in science assessment: a systematic review

Xiaoming Zhai, Yue Yin, James W. Pellegrino, Kevin C. Haudek & Lehong Shi
Pages 111-151 | Received 02 Aug 2019, Accepted 05 Feb 2020, Published online: 18 Mar 2020

ABSTRACT

Machine learning (ML) is an emerging computerised technology in which algorithms are built by 'learning' from training data rather than by explicit 'instruction'; it holds great potential to revolutionise science assessment. This study systematically reviewed 49 articles on ML-based science assessment through a triangular framework whose three vertices represent technical, validity, and pedagogical features. We found that a majority of the studies focused on the validity vertex rather than on the other two. The existing studies primarily involve text recognition, classification, and scoring, with an emphasis on constructing scientific explanations, and they report a wide range of human-machine agreement measures. To establish these agreement measures, most of the studies employed cross-validation rather than self- or split-validation. ML allows teachers to use complex assessments without the burden of human scoring, saving both time and cost. Most studies used supervised ML rather than semi-supervised or unsupervised ML; supervised approaches achieve automatic scoring by extracting attributes from student work that humans have first coded. We found that 24 studies explicitly embedded assessment in science learning activities, such as scientific inquiry and argumentation, to provide feedback or learning guidance. This study identifies existing research gaps and suggests that all three vertices of the ML triangle be addressed in future assessment studies, with an emphasis on the pedagogical and technical features.
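
To make the reviewed workflow concrete, the following minimal sketch (in Python, assuming the scikit-learn library; not taken from any reviewed study) illustrates supervised ML scoring as the abstract describes it: attributes are extracted from student responses that human raters scored first, a classifier is trained on those attributes, and human-machine agreement is estimated by cross-validation using Cohen's kappa. All responses, scores, and model choices below are hypothetical.

# Hypothetical sketch of supervised ML scoring with cross-validation;
# illustrative only, not the pipeline of any reviewed study.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score, make_scorer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical human-coded data: short student explanations and the
# scores human raters assigned (1 = normative, 0 = non-normative).
responses = [
    "The ice melts because thermal energy flows from the warm air into the ice.",
    "Heat transfers from the surroundings to the ice, so it melts.",
    "Energy from the room raises the temperature of the ice until it melts.",
    "The ice melts because the coldness leaves it.",
    "The ice just gets warm on its own and turns to water.",
    "Cold comes out of the ice, which makes it melt.",
]
human_scores = [1, 1, 1, 0, 0, 0]

# Attribute extraction (lexical features) feeding a simple classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())

# Cross-validation: in each fold the machine scores held-out responses,
# and agreement with the human scores is measured by Cohen's kappa.
kappa = make_scorer(cohen_kappa_score)
per_fold_agreement = cross_val_score(model, responses, human_scores, cv=3, scoring=kappa)
print("Human-machine agreement (Cohen's kappa) per fold:", per_fold_agreement)

By contrast, split-validation would reserve a single fixed test set, and self-validation would score the same responses the model was trained on, which typically overstates agreement.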

Disclosure Statement

No potential conflict of interest was reported by the authors.

Notes

1. We originally found 47 papers: 34 papers from the literature search, 8 papers from citation search, and 5 papers solicited from authors. After completing the first-round analysis, we solicited one more paper from authors, which in turn yielded one more paper through citation search. In sum, there are 49 papers (47 + 1 + 1).

Additional information

Funding

The work was supported by the National Science Foundation under Grant No. DUE 1323162 and the faculty research fund from the College of Education, University of Illinois at Chicago.

Notes on contributors

Xiaoming Zhai

Xiaoming Zhai is an Assistant Professor in Science Education. He is interested in developing and applying innovative assessment for science teaching and learning.

Yue Yin

Yue Yin is an Associate Professor in Educational Psychology whose research interests and expertise include assessment, science education, research design, survey design, and applied measurement/statistics.

James W. Pellegrino

James W. Pellegrino is Liberal Arts and Sciences Distinguished Professor and Distinguished Professor of Education at the University of Illinois at Chicago. He is interested in cognition, instruction, and assessment. He is an AERA Fellow, a lifetime National Associate of the National Academy of Sciences, and a member of the National Academy of Education.

Kevin C. Haudek

Kevin C. Haudek is an Assistant Professor. He is interested in the application of computerised tools to evaluate student writing, focusing on short, content-rich responses in science education.

Lehong Shi

Lehong Shi is an educator and researcher in education.
