ABSTRACT
Using a mixed-methods approach, we developed and evaluated an Oceanography Concept Inventory (OCI) to test student achievement of 11 learning goals for an introductory-level oceanography course. The OCI was designed with expert input, grounded in research on student (mis)conceptions, written with minimal jargon, tested on 464 students, and evaluated for validity, reliability, and generalizability. The result is a valid, reliable, and semicustomizable instrument, available in a longer 23-item version and a shorter 16-item version, with flexible grading using either classical one-point-per-item scoring or item-difficulty–weighted scoring. This article will be useful both to potential end users of the OCI (e.g., practitioners and researchers) and to test developers considering constructing a concept inventory.
Acknowledgments
We would like to thank Derek Briggs, Katherine Perkins, and Wendy Smith for their guidance and advice on test construction; the instructors who permitted the administration of this survey in their classes; the students who participated in the OCI exercises and surveys; and the experts and novices who provided regular feedback on individual items as they were developed.
Notes
4. Supplemental File 1: Quantitative Methods Used to Evaluate the OCI is available online at http://dx.doi.org/10.5408/14-061s1.