Exploring task features that predict psychometric quality of test items: the case for the Dutch driving theory exam

Pages 80-104 | Received 12 Nov 2020, Accepted 19 Feb 2021, Published online: 27 May 2021

