School Effectiveness and School Improvement
An International Journal of Research, Policy and Practice
Volume 34, 2023 - Issue 2
Research Articles

A best-evidence meta-analysis of the effects of digital monitoring tools for teachers on student achievement

Pages 169-188 | Received 24 Jul 2020, Accepted 27 Oct 2022, Published online: 16 Nov 2022

References

References marked with an asterisk (*) indicate studies included in the meta-analysis.

  • Balduzzi, S., Rücker, G., & Schwarzer, G. (2019). How to perform a meta-analysis with R: A practical tutorial. Evidence-Based Mental Health, 22(4), 153–160. https://doi.org/10.1136/ebmental-2019-300117
  • Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/0969595980050102
  • Blanc, S., Christman, J. B., Liu, R., Mitchell, C., Travers, E., & Bulkley, K. E. (2010). Learning to learn from data: Benchmarks and instructional communities. Peabody Journal of Education, 85(2), 205–225. https://doi.org/10.1080/01619561003685379
  • *Carlson, D., Borman, G. D., & Robinson, M. (2011). A multistate district-level cluster randomized trial of the impact of data-driven reform on reading and mathematics achievement. Educational Evaluation and Policy Analysis, 33(3), 378–398. https://doi.org/10.3102/0162373711412765
  • Cheung, A. C. K., & Slavin, R. E. (2016). How methodological features affect effect sizes in education. Educational Researcher, 45(5), 283–292. https://doi.org/10.3102/0013189X16656615
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Routledge. https://doi.org/10.4324/9780203771587
  • Cordray, D., Pion, G., Brandt, C., Molefe, A., & Toby, M. (2012). The impact of the Measures of Academic Progress (MAP) program on student reading achievement (NCEE 2013–4000). National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
  • Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational Researcher, 38(3), 181–199. https://doi.org/10.3102/0013189X08331140
  • Faber, J. M., Glas, C. A. W., & Visscher, A. J. (2018). Differentiated instruction in a data-based decision-making context. School Effectiveness and School Improvement, 29(1), 43–63. https://doi.org/10.1080/09243453.2017.1366342
  • *Faber, J. M., Luyten, H., & Visscher, A. J. (2017). The effects of a digital formative assessment tool on mathematics achievement and student motivation: Results of a randomized experiment. Computers & Education, 106, 83–96. https://doi.org/10.1016/j.compedu.2016.12.001
  • *Faber, J. M., & Visscher, A. J. (2018). The effects of a digital formative assessment tool on spelling achievement: Results of a randomized experiment. Computers & Education, 122, 1–8. https://doi.org/10.1016/j.compedu.2018.03.008
  • *Förster, N., & Souvignier, E. (2014). Learning progress assessment and goal setting: Effects on reading achievement, reading motivation and reading self-concept. Learning and Instruction, 32, 91–100. https://doi.org/10.1016/j.learninstruc.2014.02.002
  • Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487
  • Hellrung, K., & Hartig, J. (2013). Understanding and using feedback – A review of empirical studies concerning feedback from external evaluations to teachers. Educational Research Review, 9, 174–190. https://doi.org/10.1016/j.edurev.2012.09.001
  • Jansen in de Wal, J. (2016). Secondary school teachers’ motivation for professional learning [Doctoral dissertation, Open University of the Netherlands]. https://research.tue.nl/en/publications/secondary-school-teachers-motivation-for-professional-learning
  • Kennedy, A. (2014). Understanding continuing professional development: The need for theory to impact on policy and practice. Professional Development in Education, 40(5), 688–697. https://doi.org/10.1080/19415257.2014.955122
  • Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254–284. https://doi.org/10.1037/0033-2909.119.2.254
  • Koedinger, K. R., McLaughlin, E. A., & Heffernan, N. T. (2010). A quasi-experimental evaluation of an on-line formative assessment and tutoring system. Journal of Educational Computing Research, 43(4), 489–510. https://doi.org/10.2190/EC.43.4.d
  • *Konstantopoulos, S., Miller, S. R., & van der Ploeg, A. (2013). The impact of Indiana’s system of interim assessments on mathematics and reading achievement. Educational Evaluation and Policy Analysis, 35(4), 481–499. https://doi.org/10.3102/0162373713498930
  • *Konstantopoulos, S., Miller, S. R., van der Ploeg, A., & Li, W. (2016). Effects of interim assessments on student achievement: Evidence from a large-scale experiment. Journal of Research on Educational Effectiveness, 9(sup1), 188–208. https://doi.org/10.1080/19345747.2015.1116031
  • Kraft, M. A. (2019). Interpreting effect sizes of education interventions (EdWorkingPaper No. 19-10). Annenberg Institute at Brown University. https://files.eric.ed.gov/fulltext/ED602384.pdf
  • Kreft, I. G., & de Leeuw, J. (2002). Introducing multilevel modeling. SAGE Publications.
  • Lipsey, M. W., & Wilson, D. B. (2000). Applied social research methods: Vol. 49. Practical meta-analysis. SAGE Publications.
  • Locke, E. A., & Latham, G. P. (2002). Building a practically useful theory of goal setting and task motivation: A 35-year odyssey. American Psychologist, 57(9), 705–717. https://doi.org/10.1037/0003-066X.57.9.705
  • Lüdecke, D. (2019). esc: Effect size computation for meta analysis (Version 0.5.1). https://CRAN.R-project.org/package=esc
  • Maas, C. J. M., & Hox, J. J. (2004). The influence of violations of assumptions on multilevel parameter estimates and their standard errors. Computational Statistics & Data Analysis, 46(3), 427–440. https://doi.org/10.1016/j.csda.2003.08.006
  • Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71–85. https://doi.org/10.1080/00461520.2012.667064
  • *May, H., & Robinson, M. A. (2007). A randomized evaluation of Ohio’s Personalized Assessment Reporting System (PARS). Consortium for Policy Research in Education. https://doi.org/10.12698/cpre.2007.pars
  • Nabors Oláh, L., Lawrence, N. R., & Riggan, M. (2010). Learning to learn from benchmark assessment data: How teachers analyze results. Peabody Journal of Education, 85(2), 226–245. https://doi.org/10.1080/01619561003688688
  • *Nunnery, J. A., & Ross, S. M. (2007). The effects of the School Renaissance program on student achievement in reading and mathematics. Research in the Schools, 14(1), 40–59.
  • R Core Team. (2021). R: A language and environment for statistical computing. R Foundation for Statistical Computing. https://www.R-project.org/
  • *Ritzema, E. S. (2015). Professional development in data use: The effects of primary school teacher training on teaching practices and students’ mathematical proficiency [Doctoral dissertation, University of Groningen]. https://research.rug.nl/en/publications/professional-development-in-data-use-the-effects-of-primary-schoo
  • *Roschelle, J., Feng, M., Murphy, R. F., & Mason, C. A. (2016). Online mathematics homework increases student achievement. AERA Open, 2(4). https://doi.org/10.1177/2332858416673968
  • Rücker, G., Schwarzer, G., Carpenter, J. R., Binder, H., & Schumacher, M. (2011). Treatment-effect estimates adjusted for small-study effects via a limit meta-analysis. Biostatistics, 12(1), 122–142. https://doi.org/10.1093/biostatistics/kxq046
  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Houghton Mifflin Company.
  • Sims, S., & Fletcher-Wood, H. (2021). Identifying the characteristics of effective teacher professional development: A critical review. School Effectiveness and School Improvement, 32(1), 47–63. https://doi.org/10.1080/09243453.2020.1772841
  • Slavin, R. E. (2008). Perspectives on evidence-based research in education – What works? Issues in synthesizing educational program evaluations. Educational Researcher, 37(1), 5–14. https://doi.org/10.3102/0013189X08314117
  • *Slavin, R. E., Cheung, A., Holmes, G., Madden, N. A., & Chamberlain, A. (2013). Effects of a data-driven district reform model on state assessment outcomes. American Educational Research Journal, 50(2), 371–396. https://doi.org/10.3102/0002831212466909
  • Staman, L., Timmermans, A. C., & Visscher, A. J. (2017). Effects of a data-based decision making intervention on student achievement. Studies in Educational Evaluation, 55, 58–67. https://doi.org/10.1016/j.stueduc.2017.07.002
  • *van der Scheer, E. A., & Visscher, A. J. (2018). Effects of a data-based decision-making intervention for teachers on students’ mathematical achievement. Journal of Teacher Education, 69(3), 307–320. https://doi.org/10.1177/0022487117704170
  • van Geel, M., Keuning, T., Visscher, A., & Fox, J.-P. (2017). Changes in educators’ data literacy during a data-based decision making intervention. Teaching and Teacher Education, 64, 187–198. https://doi.org/10.1016/j.tate.2017.02.015
  • Vanhoof, J., Verhaeghe, G., Verhaeghe, J. P., Valcke, M., & Van Petegem, P. (2011). The influence of competences and support on school performance feedback use. Educational Studies, 37(2), 141–154. https://doi.org/10.1080/03055698.2010.482771
  • *van Kuijk, M. F., Deunk, M. I., Bosker, R. J., & Ritzema, E. S. (2016). Goals, data use, and instruction: The effect of a teacher professional development program on reading achievement. School Effectiveness and School Improvement, 27(2), 135–156. https://doi.org/10.1080/09243453.2015.1026268
  • Verhaeghe, G., Vanhoof, J., Valcke, M., & Van Petegem, P. (2010). Using school performance feedback: Perceptions of primary school principals. School Effectiveness and School Improvement, 21(2), 167–188. https://doi.org/10.1080/09243450903396005
  • Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36(3), 1–48. https://doi.org/10.18637/jss.v036.i03
  • Visscher, A. J., & Coe, R. (2003). School performance feedback systems: Conceptualisation, analysis, and reflection. School Effectiveness and School Improvement, 14(3), 321–349. https://doi.org/10.1076/sesi.14.3.321.15842
  • Wayman, J. C., Shaw, S., & Cho, V. (2011). Second-year results from an efficacy study of the Acuity data system. https://www.waymandatause.com/wp-content/uploads/2013/11/Wayman_Shaw_and_Cho_Year_2_Acuity_report.pdf
  • Wayman, J. C., Stringfield, S., & Yakimowski, M. (2004). Software enabling school improvement through analysis of student data (CRESPAR Technical Report No. 67). Center for Research on the Education of Students Placed at Risk, Johns Hopkins University. https://files.eric.ed.gov/fulltext/ED489950.pdf