
When Should Individual Ability Estimates Be Reported if Rapid Guessing Is Present?


References

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
  • American Institutes for Research. (2016). Recommending performance standards for Ohio’s state tests. https://oh.portal.airast.org/core/fileparse.php/3094/urlt/OST_Standard_Setting_Technical_Report_ELA_Math.pdf
  • Coe, R. (1998). Can feedback improve teaching? A review of the social science literature with a view to identifying the conditions under which giving feedback to teachers will result in improved performance. Research Papers in Education, 13(1), 43–66. doi:10.1080/0267152980130104
  • DeMars, C. E. (2007). Changes in rapid-guessing behavior over a series of assessments. Educational Assessment, 12(1), 23–45. doi:10.1080/10627190709336946
  • DePascale, C., & Gong, B. (2020). Comparability of individual students’ scores on the “same test.” In A. I. Berman, E. H. Haertel, & J. W. Pellegrino (Eds.), Comparability of large‐scale educational assessments: Issues and recommendations (pp. 25–48). Washington, DC: National Academy of Education.
  • Deribo, T., Kroehne, U., & Goldhammer, F. (2021). Model‐based treatment of rapid guessing. Journal of Educational Measurement, 58(2), 281–303. doi:10.1111/jedm.12290
  • EdSource. (2016, August 24). California’s Smarter Balanced assessments: A primer. https://edsource.org/2015/california-smarter-balanced-math-english-results-common-core-faq/86181
  • Foelber, K. J. (2017). Using multiple imputation to mitigate effects of low examinee motivation on estimates of student learning [Unpublished doctoral dissertation]. James Madison University.
  • Guo, H., & Ercikan, K. (2020). Differential rapid responding across language and cultural groups. Educational Research and Evaluation, 26(5–6), 302–327. doi:10.1080/13803611.2021.1963941
  • Hauser, C., & Kingsbury, G. (2009, April). Individual score validity in a modest-stakes adaptive educational testing setting. Paper presented at the annual meeting of the National Council on Measurement in Education, San Diego, CA.
  • Huang, J. L., Liu, M., & Bowling, N. A. (2015). Insufficient effort responding: Examining an insidious confound in survey data. Journal of Applied Psychology, 100(3), 828–845. doi:10.1037/a0038510
  • Kong, X. J., Wise, S. L., & Bhola, D. S. (2007). Setting the response time threshold parameter to differentiate solution behavior from rapid-guessing behavior. Educational and Psychological Measurement, 67(4), 606–619. doi:10.1177/0013164406294779
  • Lee, Y. H., & Jia, Y. (2014). Using response time to investigate students’ test-taking behaviors in a NAEP computer-based study. Large-Scale Assessments in Education, 2(1), 1–24.
  • Lim, H., & Wells, C. S. (2020). irtplay: An R package for online item calibration, scoring, evaluation of model fit, and useful functions for unidimensional IRT. Applied Psychological Measurement, 44(7–8), 563–565. doi:10.1177/0146621620921247
  • Meade, A. W., & Craig, S. B. (2012). Identifying careless responses in survey data. Psychological Methods, 17(3), 437–455. doi:10.1037/a0028085
  • Ohio Department of Education. (2018). Understanding Ohio’s state tests score reports: 2017–2018. https://oh.portal.airast.org/core/fileparse.php/3094/urlt/Understanding_State_Tests_Reports_2017-2018.pdf
  • Osborne, J. W., & Blanchard, M. R. (2011). Random responding from participants is a threat to the validity of social science research results. Frontiers in Psychology, 1, 220. doi:10.3389/fpsyg.2010.00220
  • Pintrich, P. R., & Schunk, D. H. (2002). Motivation in education: Theory, research, and applications. New York, NY: Prentice Hall.
  • Rios, J. A. (2021a). Is differential noneffortful responding associated with type I error in measurement invariance testing? Educational and Psychological Measurement, 81(5), 957–979. doi:10.1177/0013164421990429
  • Rios, J. A. (2021b). Assessing the accuracy of parameter estimates in the presence of rapid guessing misclassifications. Educational and Psychological Measurement, 82(1), 122–150. doi:10.1177/00131644211003640
  • Rios, J. A., & Deng, J. (2021). Does the choice of response time threshold procedure substantially affect inferences concerning the identification and exclusion of rapid guessing responses? A meta-analysis. Large-Scale Assessments in Education, 9(1), 1–25. doi:10.1186/s40536-021-00110-8
  • Rios, J. A., Deng, J., & Ihlenfeldt, S. (2022). To what degree does rapid guessing underestimate test performance? A meta-analytic investigation. Educational Assessment. Advance online publication.
  • Rios, J. A., Guo, H., Mao, L., & Liu, O. L. (2017). Evaluating the impact of careless responding on aggregated-scores: To filter unmotivated examinees or not? International Journal of Testing, 17(1), 74–104. doi:10.1080/15305058.2016.1231193
  • Rios, J. A., Liu, O. L., & Bridgeman, B. (2014). Identifying low‐effort examinees on student learning outcomes assessment: A comparison of two approaches. New Directions for Institutional Research, 2014(161), 69–82.
  • Rios, J. A., & Soland, J. (2021a). Parameter estimation accuracy of the effort-moderated item response theory model under multiple assumption violations. Educational and Psychological Measurement, 81(3), 569–594. doi:10.1177/0013164420949896
  • Rios, J. A., & Soland, J. (2021b). Investigating the impact of noneffortful responses on individual-level scores: Can the effort-moderated IRT model serve as a solution? Applied Psychological Measurement, 45(6), 391–406. doi:10.1177/01466216211013896
  • Schnipke, D. L., & Scrams, D. J. (1997). Modeling item response times with a two‐state mixture model: A new method of measuring speededness. Journal of Educational Measurement, 34(3), 213–232. doi:10.1111/j.1745-3984.1997.tb00516.x
  • Schuster, C., & Yuan, K. H. (2011). Robust estimation of latent ability in item response models. Journal of Educational and Behavioral Statistics, 36(6), 720–735. doi:10.3102/1076998610396890
  • Smith, J. K., Given, L. M., Julien, H., Ouellette, D., & DeLong, K. (2013). Information literacy proficiency: Assessing the gap in high school students’ readiness for undergraduate academic work. Library & Information Science Research, 35(2), 88–96. doi:10.1016/j.lisr.2012.12.001
  • Ulitzsch, E., Penk, C., von Davier, M., & Pohl, S. (2021). Model meets reality: Validating a new behavioral measure for test-taking effort. Educational Assessment, 26(2), 104–124. doi:10.1080/10627197.2020.1858786
  • van Barneveld, C. (2007). The effect of examinee motivation on test construction within an IRT framework. Applied Psychological Measurement, 31(1), 31–46. doi:10.1177/0146621606286206
  • Wang, C., & Xu, G. (2015). A mixture hierarchical model for response times and response accuracy. British Journal of Mathematical and Statistical Psychology, 68(3), 456–477. doi:10.1111/bmsp.12054
  • Wise, S. L. (2014). The utility of adaptive testing in addressing the problem of unmotivated examinees. Journal of Computerized Adaptive Testing, 2(3), 1–17.
  • Wise, S. L. (2017). Rapid‐guessing behavior: Its identification, interpretation, and implications. Educational Measurement: Issues and Practice, 36(4), 52–61. doi:10.1111/emip.12165
  • Wise, S. L. (2020). The impact of test-taking disengagement on item content representation. Applied Measurement in Education, 33(2), 83–94. doi:10.1080/08957347.2020.1732386
  • Wise, S. L., & DeMars, C. E. (2005). Low examinee effort in low-stakes assessment: Problems and potential solutions. Educational Assessment, 10(1), 1–17. doi:10.1207/s15326977ea1001_1
  • Wise, S. L., & DeMars, C. E. (2006). An application of item response time: The effort‐moderated IRT model. Journal of Educational Measurement, 43(1), 19–38. doi:10.1111/j.1745-3984.2006.00002.x
  • Wise, S. L., & DeMars, C. E. (2010). Examinee noneffort and the validity of program assessment results. Educational Assessment, 15(1), 27–41. doi:10.1080/10627191003673216
  • Wise, S. L., & Kingsbury, G. G. (2016). Modeling student test‐taking motivation in the context of an adaptive achievement test. Journal of Educational Measurement, 53(1), 86–105. doi:10.1111/jedm.12102
  • Wise, S. L., & Kong, X. (2005). Response time effort: A new measure of examinee motivation in computer-based tests. Applied Measurement in Education, 18(2), 163–183. doi:10.1207/s15324818ame1802_2
  • Wise, S. L., & Kuhfeld, M. R. (2021). Using retest data to evaluate and improve effort‐moderated scoring. Journal of Educational Measurement, 58(1), 130–149. doi:10.1111/jedm.12275
  • Wise, S. L., Pastor, D. A., & Kong, X. J. (2009). Correlates of rapid-guessing behavior in low-stakes testing: Implications for test development and measurement practice. Applied Measurement in Education, 22(2), 185–205. doi:10.1080/08957340902754650
  • Wise, S. L., & Smith, L. F. (2011). A model of examinee test taking effort. In J. A. Bovaird, K. F. Geisinger, & C. W. Buckendahl (Eds.), High-stakes testing in education: Science and practice in K-12 settings (pp. 139–153). Washington, DC: American Psychological Association. doi:10.1037/12330-009
  • Wright, D. B. (2016). Treating all rapid responses as errors (TARRE) improves estimates of ability (slightly). Psychological Test and Assessment Modeling, 58(1), 15–31.
  • Yildirim-Erbasli, S. N., & Bulut, O. (2020). The impact of students’ test-taking effort on growth estimates in low-stakes educational assessments. Educational Research and Evaluation, 26(7–8), 368–386. doi:10.1080/13803611.2021.1977152
