ABSTRACT
This study presents a partial-credit scoring (PCS) approach for scoring students’ performance on multiple-choice items in science education. The approach is built on fundamental ideas, the critical pieces of understanding and knowledge students need to solve science problems. We link each option of an item to specific fundamental ideas so that the mastery pattern associated with an option is captured when that option is selected. By using these mastery patterns to order the options of each item and assign credit accordingly, we measure students’ cognitive proficiency without adding extra measures (e.g. number of attempts) to the test or requiring extra support (e.g. technology). Using many-facet Rasch analysis, we find that the ordered options students selected were aligned with their ability measures. The PCS yields robust psychometric quality; compared with dichotomous scoring of multiple-choice items, it produces better item fit and separation parameters. In addition, this PCS approach helps establish construct validity by modelling student responses at the option level to reflect students’ mastery of fundamental ideas.
Acknowledgments
The authors are grateful to the other group members, Drs. Maria Araceli Ruiz-Primo and Jim Minstrel, and to graduate students Dongsheng Dong, Philip Hernandez, and Klint Kanopka. The authors also appreciate Lehong Shi, who proofread and edited this manuscript multiple times.
Disclosure statement
No potential conflict of interest was reported by the author(s).