References
- Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. https://doi.org/10.1016/0749-5978(91)90020-T
- Azzolini, D., Bazoli, N., Lievore, I., Schizzerotto, A., & Vergolini, L. (2019). Beyond achievement: A comparative look into 15-year-olds’ school engagement, effort, and perseverance in the European Union. European Commission. https://doi.org/10.2766/98129
- Barry, C. L., Horst, S. J., Finney, S. J., Brown, A. R., & Kopp, J. P. (2010). Do examinees have similar test-taking effort? A high-stakes question for low-stakes testing. International Journal of Testing, 10(4), 342–363. https://doi.org/10.1080/15305058.2010.508569
- Baumert, J., & Demmrich, A. (2001). Test motivation in the assessment of student skills: The effects of incentives on motivation and performance. European Journal of Psychology of Education, 16(3), 441–462. https://doi.org/10.1007/BF03173192
- Bolsinova, M., de Boeck, P., & Tijmstra, J. (2017). Modelling conditional dependence between response time and accuracy. Psychometrika, 82(4), 1126–1148. https://doi.org/10.1007/s11336-016-9537-6
- Caliço, T. (2020, July 13–17). So you think you can process: Ontological and methodological barriers to incorporating event data in psychometric models [Paper presentation]. Virtual International Meeting of the Psychometric Society.
- Debeer, D., Buchholz, J., Hartig, J., & Janssen, R. (2014). Student, school, and country differences in sustained test-taking effort in the 2009 PISA reading assessment. Journal of Educational and Behavioral Statistics, 39(6), 502–523. https://doi.org/10.3102/1076998614558485
- DeMars, C. E. (2000). Test stakes and item format interactions. Applied Measurement in Education, 13(1), 55–77. https://doi.org/10.1207/s15324818ame1301_3
- Eklöf, H. (2006). Motivational beliefs in the TIMSS 2003 context: Theory, measurement and relation to test performance [Doctoral dissertation, Umeå University]. http://www.diva-portal.org/smash/get/diva2:144535/FULLTEXT01.pdf
- Eklöf, H. (2010, July 1–3). Student motivation and effort in the Swedish TIMSS advanced field study [Paper presentation]. 4th IEA International Research Conference, Gothenburg, Sweden. https://www.iea.nl/sites/default/files/2019-04/IRC2010_Eklof.pdf
- Eklöf, H. (2015). Swedish students’ reported motivation and effort in PISA, over time and in comparison with other countries. In To respond or not to respond: The motivation of Swedish students in taking the PISA test (pp. 11–60). Swedish National Agency for Education.
- Eklöf, H., & Knekta, E. (2017). Using large-scale educational data to test motivation theories: A synthesis of findings from Swedish studies on test-taking motivation. International Journal of Quantitative Research in Education, 4(1–2), 52–71. https://doi.org/10.1504/IJQRE.2017.086499
- Gathercole, S. E., Pickering, S. J., Knight, C., & Stegmann, Z. (2004). Working memory skills and educational attainment: Evidence from national curriculum assessments at 7 and 14 years of age. Applied Cognitive Psychology, 18(1), 1–16. https://doi.org/10.1002/acp.934
- Geiser, C. (2013). Data analysis with Mplus. The Guilford Press.
- Gobert, J. D., Baker, R. S., & Wixon, M. B. (2015). Operationalizing and detecting disengagement within online science microworlds. Educational Psychologist, 50(1), 43–57. https://doi.org/10.1080/00461520.2014.999919
- Goldhammer, F., Martens, T., Christoph, G., & Lüdtke, O. (2016). Test-taking engagement in PIAAC (OECD Education Working Papers, No. 133). OECD Publishing.
- Goldhammer, F., Naumann, J., Rölke, H., Stelter, A., & Tóth, K. (2017). Relating product data to process data from computer-based competency assessment. In D. Leutner, J. Fleischer, J. Grünkorn, & E. Klieme (Eds.), Competence assessment in education: Research, models and instruments (pp. 407–425). Springer.
- Greiff, S., Niepel, C., Scherer, R., & Martin, R. (2016). Understanding students’ performance in a computer-based assessment of complex problem solving: An analysis of behavioral data from computer-generated log files. Computers in Human Behavior, 61, 36–46. https://doi.org/10.1016/j.chb.2016.02.095
- Greiff, S., Wüstenberg, S., & Avvisati, F. (2015). Computer-generated log-file analyses as a window into students' minds? A showcase study based on the PISA 2012 assessment of problem solving. Computers & Education, 91, 92–105. https://doi.org/10.1016/j.compedu.2015.10.018
- Hopfenbeck, T. N., & Kjærnsli, M. (2016). Students’ test motivation in PISA: The case of Norway. The Curriculum Journal, 27(3), 406–422. https://doi.org/10.1080/09585176.2016.1156004
- Kay, J., Maisonneuve, N., Yacef, K., & Zaïane, O. (2006). Mining patterns of events in students’ teamwork data. In Proceedings of the Workshop on Educational Data Mining at the 8th International Conference on Intelligent Tutoring Systems (pp. 45–52). https://www.educationaldatamining.org/ITS2006EDM/Kay_Yacef.pdf
- Knekta, E., & Eklöf, H. (2015). Modeling the test-taking motivation construct through investigation of psychometric properties of an expectancy-value-based questionnaire. Journal of Psychoeducational Assessment, 33(7), 662–673. https://doi.org/10.1177/0734282914551956
- Lindner, M. A., Lüdtke, O., & Nagy, G. (2019). The onset of rapid-guessing behavior over the course of testing time: A matter of motivation and cognitive resources. Frontiers in Psychology, 10, Article 1533. https://doi.org/10.3389/fpsyg.2019.01533
- Liu, Y., & Hau, K.-T. (2020). Measuring motivation to take low-stakes large-scale test: New model based on analyses of “participant-own-defined” missingness. Educational and Psychological Measurement, 80(6), 1115–1144. https://doi.org/10.1177/0013164420911972
- Michaelides, M. P., Brown, G. T. L., Eklöf, H., & Papanastasiou, E. (2019). Motivational profiles in TIMSS mathematics: Exploring student clusters across countries and time. IEA Research for Education and Springer Open.
- Michaelides, M. P., Ivanova, M., & Nicolaou, C. (2020). The relationship between response-time effort and accuracy in PISA science multiple choice items. International Journal of Testing, 20(3), 187–205. https://doi.org/10.1080/15305058.2019.1706529
- Muthén, L. K., & Muthén, B. O. (2017). Mplus (Version 8.3) [Computer software]. Muthén & Muthén.
- Nagy, G., Lüdtke, O., & Köller, O. (2016). Modeling test context effects in longitudinal achievement data: Examining position effects in the longitudinal German PISA 2012 assessment. Psychological Test and Assessment Modeling, 58(4), 641–670.
- Nagy, G., Nagengast, B., Becker, M., Rose, N., & Frey, A. (2018). Item position effects in a reading comprehension test: An IRT study of individual differences and individual correlates. Psychological Test and Assessment Modeling, 60(2), 165–187.
- Organisation for Economic Co-operation and Development (2009). PISA data analysis manual: SPSS (2nd ed.). https://doi.org/10.1787/9789264056275-en
- Organisation for Economic Co-operation and Development (2016a). PISA 2015 assessment and analytical framework: Science, reading, mathematic and financial literacy. https://doi.org/10.1787/9789264255425-en
- Organisation for Economic Co-operation and Development (2016b). PISA 2015 results (Volume II): Policies and practices for successful schools. https://doi.org/10.1787/9789264267510-en
- Organisation for Economic Co-operation and Development (2017). PISA 2015 technical report. https://www.oecd.org/pisa/sitedocument/PISA-2015-technical-report-final.pdf
- Paek, P. L. (2008, July 6–13). Some factors contributing to gender differences in the mathematics performance of United States high school students [Paper presentation]. 11th International Congress on Mathematical Education, Monterrey, Mexico.
- Ramalingam, D. (2017). Using data from computer-delivered assessments to improve construct validity and measurement precision [Doctoral dissertation, The University of Melbourne]. https://minerva-access.unimelb.edu.au/handle/11343/197541
- Rutkowski, D., & Wild, J. (2015). Stakes matter: Student motivation and the validity of student assessments for teacher evaluation. Educational Assessment, 20(3), 165–179. https://doi.org/10.1080/10627197.2015.1059273
- Sahin, F., & Colvin, K. F. (2020). Enhancing response time thresholds with response behaviors for detecting disengaged examinees. Large-scale Assessments in Education, 8(1), Article 5. https://doi.org/10.1186/s40536-020-00082-1
- Setzer, J. C., Wise, S. L., van den Heuvel, J. R., & Ling, G. (2013). An investigation of examinee test-taking effort on a large-scale assessment. Applied Measurement in Education, 26(1), 34–49. https://doi.org/10.1080/08957347.2013.739453
- Silm, G., Pedaste, M., & Täht, K. (2020). The relationship between performance and test-taking effort when measured with self-report or time-based instruments: A meta-analytic review. Educational Research Review, 31, Article 100335. https://doi.org/10.1016/j.edurev.2020.100335
- Stenlund, T., Eklöf, H., & Lyrén, P.-E. (2017). Group differences in test-taking behaviour: An example from a high-stakes testing program. Assessment in Education: Principles, Policy & Practice, 24(1), 4–20. https://doi.org/10.1080/0969594X.2016.1142935
- Ventura, M., & Shute, V. (2013). The validity of a game-based assessment of persistence. Computers in Human Behavior, 29(6), 2568–2572. https://doi.org/10.1016/j.chb.2013.06.033
- Weirich, S., Hecht, M., Penk, C., Roppelt, A., & Böhme, K. (2017). Item position effects are moderated by changes in test-taking effort. Applied Psychological Measurement, 41(2), 115–129. https://doi.org/10.1177/0146621616676791
- Wise, S. L. (2006). An investigation of the differential effort received by items on a low-stakes computer-based test. Applied Measurement in Education, 19(2), 95–114. https://doi.org/10.1207/s15324818ame1902_2
- Wise, S. L. (2009). Strategies for managing the problem of unmotivated examinees in low-stakes testing programs. The Journal of General Education, 58(3), 152–166. https://doi.org/10.1353/jge.0.0042
- Wise, S. L. (2015). Effort analysis: Individual score validation of achievement test data. Applied Measurement in Education, 28(3), 237–252. https://doi.org/10.1080/08957347.2015.1042155
- Wise, S. L. (2017). Rapid-guessing behavior: Its identification, interpretation, and implications. Educational Measurement: Issues and Practice, 36(4), 52–61. https://doi.org/10.1111/emip.12165
- Wise, S. L., & DeMars, C. E. (2005). Low examinee effort in low-stakes assessment: Problems and potential solutions. Educational Assessment, 10(1), 1–17. https://doi.org/10.1207/s15326977ea1001_1
- Wise, S. L., & Gao, L. (2017). A general approach to measuring test-taking effort on computer-based tests. Applied Measurement in Education, 30(4), 343–354. https://doi.org/10.1080/08957347.2017.1353992
- Wise, S. L., & Kingsbury, G. G. (2016). Modeling student test-taking motivation in the context of an adaptive achievement test. Journal of Educational Measurement, 53(1), 86–105. https://doi.org/10.1111/jedm.12102
- Wise, S. L., & Kong, X. (2005). Response time effort: A new measure of examinee motivation in computer-based tests. Applied Measurement in Education, 18(2), 163–183. https://doi.org/10.1207/s15324818ame1802_2
- Wise, S. L., Pastor, D. A., & Kong, X. J. (2009). Correlates of rapid-guessing behavior in low-stakes testing: Implications for test development and measurement practice. Applied Measurement in Education, 22(2), 185–205. https://doi.org/10.1080/08957340902754650
- Yavuz, H. C. (2019). The effects of log data on students’ performance. Eğitimde ve Psikolojide Ölçme ve Değerlendirme Dergisi, 10(4), 378–390. https://doi.org/10.21031/epod.564232
- Zamarro, G., Hitt, C., & Mendez, I. (2019). When students don’t care: Reexamining international differences in achievement and student effort. Journal of Human Capital, 13(4), 519–552. https://doi.org/10.1086/705799