Research Article

Remote monitoring of implementation fidelity using log-file data from multiple online learning platforms

Received 19 May 2023, Accepted 02 Jan 2024, Published online: 22 Jan 2024

References

  • Adair, J. G., Sharpe, D., & Huynh, C. L. (1989). Hawthorne control procedures in educational experiments: A reconsideration of their use and effectiveness. Review of Educational Research, 59(2), 215–228. https://doi.org/10.3102/00346543059002215
  • Al-Ubaydli, O., List, J. A., & Suskind, D. (2019). The science of using science: Towards an understanding of the threats to scaling experiments. National Bureau of Economic Research, 1–33. https://doi.org/10.3386/w25848
  • Bos, S. E., Powell, S. R., Maddox, S. A., & Doabler, C. T. (2023). A synthesis of the conceptualization and measurement of implementation fidelity in mathematics intervention research. Journal of Learning Disabilities, 56(2), 95–115. https://doi.org/10.1177/00222194211065498
  • Botelho, A. F., Varatharaj, A., Patikorn, T., Doherty, D., Adjei, S. A., & Beck, J. E. (2019). Developing early detectors of student attrition and wheel spinning using deep learning. IEEE Transactions on Learning Technologies, 12(2), 158–170. https://doi.org/10.1109/TLT.2019.2912162
  • Breitenstein, S. M., Gross, D., Garvey, C. A., Hill, C., Fogg, L., & Resnick, B. (2010). Implementation fidelity in community-based interventions. Research in Nursing & Health, 33(2), 164–173. https://doi.org/10.1002/nur.20373
  • Carroll, C., Patterson, M., Wood, S., Booth, A., Rick, J., & Balain, S. (2007). A conceptual framework for implementation fidelity. Implementation Science: IS, 2(1), 40. https://doi.org/10.1186/1748-5908-2-40
  • Clow, D. (2012). The learning analytics cycle: Closing the loop effectively [Paper presentation]. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 134–138). https://doi.org/10.1145/2330601.2330636
  • Cook, D. L. (1962). The Hawthorne effect in educational research. The Phi Delta Kappan, 44, 116–122. https://www.jstor.org/stable/20342865
  • Crawford, L., Carpenter, D. M., Wilson, M. T., Schmeister, M., & McDonald, M. (2012). Testing the relation between fidelity of implementation and student outcomes in math. Assessment for Effective Intervention, 37(4), 224–235. https://doi.org/10.1177/1534508411436111
  • Crooks, R. (2017). Representationalism at work: Dashboards and data analytics in urban education. Educational Media International, 54(4), 289–303. https://doi.org/10.1080/09523987.2017.1408267
  • Cross, W., West, J., Wyman, P. A., Schmeelk-Cone, K., Xia, Y., Tu, X., Teisl, M., Brown, C. H., & Forgatch, M. (2015). Observational measures of implementer fidelity for a school-based preventive intervention: Development, reliability and validity. Prevention Science: The Official Journal of the Society for Prevention Research, 16(1), 122–132. https://doi.org/10.1007/s11121-014-0488-9
  • Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45. https://doi.org/10.1016/S0272-7358(97)00043-3
  • Dart, E. H., Cook, C. R., Collins, T. A., Gresham, F. M., & Chenier, J. S. (2012). Test driving interventions to increase treatment integrity and student outcomes. School Psychology Review, 41(4), 467–481. https://doi.org/10.1080/02796015.2012.12087500
  • Decker-Woodrow, L. E., Mason, C. A., Lee, J.-E., Chan, J. Y.-C., Sales, A., Liu, A., & Tu, S. (2023). The impacts of three educational technologies on algebraic understanding in the context of COVID-19. AERA Open, 9, 23328584231165919. https://doi.org/10.1177/23328584231165919
  • Dieter, K. C., Studwell, J., & Vanacore, K. P. (2020). Differential responses to personalized learning recommendations revealed by event-related analysis [Paper presentation]. International Conference on Educational Data Mining (EDM). https://eric.ed.gov/?id=ED607826
  • Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3-4), 327–350. https://doi.org/10.1007/s10464-008-9165-0
  • Finch, W. H., Bolin, J. E., & Kelley, K. (2019). Multilevel modeling using R (2nd ed.). CRC Press. https://doi.org/10.1201/9781351062268
  • Fox, N. S., Brennan, J. S., & Chasen, S. T. (2008). Clinical estimation of fetal weight and the Hawthorne effect. European Journal of Obstetrics, Gynecology, and Reproductive Biology, 141(2), 111–114. https://doi.org/10.1016/j.ejogrb.2008.07.023
  • Gleason, A. M. (2021). Remote monitoring of a work-from-home employee to identify stress: A case report. Workplace Health & Safety, 69(9), 419–422. https://doi.org/10.1177/2165079921997322
  • Gresham, F. M., Dart, E. H., & Collins, T. A. (2017). Generalizability of multiple measures of treatment integrity: Comparisons among direct observation, permanent products, and self-report. School Psychology Review, 46(1), 108–121. https://doi.org/10.1080/02796015.2017.12087606
  • Gupta, S., Supplee, L. H., Suskind, D., & List, J. A. (2021). Failed to scale: Embracing the challenge of scaling in early childhood. In The scale-up effect in early childhood and public policy: Why interventions lose impact at scale and what we can do about it (pp. 1–21). Routledge.
  • Gurung, A., Botelho, A. F., & Heffernan, N. T. (2021). Examining student effort on help through response time decomposition [Paper presentation]. LAK21: 11th International Learning Analytics and Knowledge Conference, ACM, Irvine CA USA (pp. 292–301). https://doi.org/10.1145/3448139.3448167
  • Helsabeck, N. P., Justice, L. M., & Logan, J. A. R. (2022). Assessing fidelity of implementation to a technology-mediated early intervention using process data. Journal of Computer Assisted Learning, 38(2), 409–421. https://doi.org/10.1111/jcal.12621
  • Hrastinski, S. (2019). What do we mean by blended learning? TechTrends, 63(5), 564–569. https://doi.org/10.1007/s11528-019-00375-5
  • Hurwitz, L. B. (2019). Getting a read on ready to learn media: A meta-analytic review of effects on literacy. Child Development, 90, 1754–1771.
  • Hurwitz, L. B., & Macaruso, P. (2021). Supporting struggling middle school readers: Impact of the Lexia® PowerUp Literacy® program. Journal of Applied Developmental Psychology, 77(1), 101329. https://doi.org/10.1016/j.appdev.2021.101329
  • Kaur, B. (2015). What matters? From a small scale to a school-wide intervention. ZDM, 47(1), 105–116. https://doi.org/10.1007/s11858-014-0645-4
  • Kulikowski, K., Przytuła, S., & Sułkowski, L. (2023). When publication metrics become the fetish: The research evaluation systems’ relationship with academic work engagement and burnout. Research Evaluation, 32(1), 4–18. https://doi.org/10.1093/reseval/rvac032
  • Kulo, V. A., & Cates, W. M. (2013). The heart of implementation fidelity: Instructional design and intent. Educational Technology, 53(3). http://www.jstor.org/stable/44430114
  • Lee, J. E., Ottmar, E., Chan, J. Y. C., Decker-Woodrow, L., & Booker, B. (under review). In-person vs. virtual: Learning modality selections and movement during COVID-19 and their influence on student learning.
  • Lee, C. Y. S., August, G. J., Realmuto, G. M., Horowitz, J. L., Bloomquist, M. L., & Klimes-Dougan, B. (2008). Fidelity at a distance: Assessing implementation fidelity of the early risers prevention program in a going-to-scale intervention trial. Prevention Science, 9(3), 215–229. https://doi.org/10.1007/s11121-008-0097-6
  • List, J. A., Suskind, D., Supplee, L. H., Macaruso, P., Marshall, V., & Hurwitz, L. B. (2021). The scale-up effect in early childhood and public policy: Why interventions lose impact at scale and what we can do about it. Routledge. https://doi.org/10.4324/9780367822972
  • Macaruso, P., Marshall, V., & Hurwitz, L. B. (2019). Longitudinal blended learning in a low SES elementary school. In Proceedings of Global Learn 2019-Global Conference on Learning and Technology (pp. 253–262). https://www.learntechlib.org/primary/p/210313/
  • McCambridge, J., Witton, J., & Elbourne, D. R. (2014). Systematic review of the Hawthorne effect: New concepts are needed to study research participation effects. Journal of Clinical Epidemiology, 67(3), 267–277. https://doi.org/10.1016/j.jclinepi.2013.08.015
  • McMurray, S. (2013). An evaluation of the use of Lexia Reading software with children in Year 3, Northern Ireland (6- to 7-year olds). Journal of Research in Special Educational Needs, 13(1), 15–25. https://doi.org/10.1111/j.1471-3802.2012.01238.x
  • Ocumpaugh, J., Baker, R., Gowda, S., Heffernan, N., & Heffernan, C. (2014). Population validity for educational data mining models: A case study in affect detection. British Journal of Educational Technology, 45(3), 487–501. https://doi.org/10.1111/bjet.12156
  • Ottmar, E., Lee, J.-E., Vanacore, K., Pradhan, S., Decker-Woodrow, L., & Mason, C. A. (2023). Data from the efficacy study of from here to there! A dynamic technology for improving algebraic understanding. Journal of Open Psychology Data, 11(1), 5. https://doi.org/10.5334/jopd.87
  • Paquette, L., & Baker, R. S. (2019). Comparing machine learning to knowledge engineering for student behavior modeling: A case study in gaming the system. Interactive Learning Environments, 27(5-6), 585–597. https://doi.org/10.1080/10494820.2019.1610450
  • Pishghadam, R., Adamson, B., Sadafian, S. S., & Kan, F. L. F. (2014). Conceptions of assessment and teacher burnout. Assessment in Education: Principles, Policy & Practice, 21(1), 34–51. https://doi.org/10.1080/0969594X.2013.817382
  • Popham, W. J. (2001). Teaching to the test? Educational Leadership, 58(6), 16–21. https://olms.ctejhu.org/data/ck/file/TeachingtotheTest-Popham.pdf
  • Rubin, D. B. (1976). Inference and missing data. Biometrika, 63(3), 581–592. https://doi.org/10.1093/biomet/63.3.581
  • Rutherford, T., Farkas, G., Duncan, G., Burchinal, M., Kibrick, M., Graham, J., Richland, L., Tran, N., Schneider, S., Duran, L., & Martinez, M. E. (2014). A randomized trial of an elementary school mathematics software intervention: Spatial-temporal math. Journal of Research on Educational Effectiveness, 7(4), 358–383. https://doi.org/10.1080/19345747.2013.856978
  • Schechter, R. L., Kazakoff, E. R., Bundschuh, K., Prescott, J. E., & Macaruso, P. (2017). Exploring the impact of engaged teachers on implementation fidelity and reading skill gains in a blended learning reading program. Reading Psychology, 38(6), 553–579. https://doi.org/10.1080/02702711.2017.1306602
  • Schoenwald, S. K., Garland, A. F., Chapman, J. E., Frazier, S. L., Sheidow, A. J., & Southam-Gerow, M. A. (2011). Toward the effective and efficient measurement of implementation fidelity. Administration and Policy in Mental Health, 38(1), 32–43. https://doi.org/10.1007/s10488-010-0321-0
  • Scrucca, L., Fop, M., Murphy, T. B., & Raftery, A. E. (2016). mclust 5: Clustering, classification and density estimation using Gaussian finite mixture models. The R Journal, 8(1), 289–317. https://doi.org/10.32614/RJ-2016-021
  • Shamir, H., Pocklington, D., Feehan, K., & Yoder, E. (2019). Evidence for dosage and long-term effects of computer-assisted instruction. International Journal of Learning and Teaching, 5(3), 220–226. https://doi.org/10.18178/ijlt.5.3.220-226
  • Smith, J. D., Schneider, B. H., Smith, P. K., & Ananiadou, K. (2004). The effectiveness of whole-school antibullying programs: A synthesis of evaluation research. School Psychology Review, 33(4), 547–560. https://doi.org/10.1080/02796015.2004.12086267
  • Spector, J. M. (2013). Emerging educational technologies and research directions. Journal of Educational Technology & Society, 16(2), 21–30. https://www.jstor.org/stable/jeductechsoci.16.2.21
  • Tobler, N. S. (1986). Meta-analysis of 143 adolescent drug prevention programs: Quantitative outcome results of program participants compared to a control or comparison group. Journal of Drug Issues, 16(4), 537–567. https://doi.org/10.1177/002204268601600405
  • Vanacore, K., Dieter, K., Hurwitz, L., & Studwell, J. (2021). Longitudinal clusters of online educator portal access: Connecting educator behavior to student outcomes [Paper presentation]. LAK21: 11th International Learning Analytics and Knowledge Conference (pp. 540–545). https://doi.org/10.1145/3448139.3448195
  • Wang, H., & Woodworth, K. (2011). A randomized controlled trial of two online mathematics curricula. Society for Research on Educational Effectiveness. https://eric.ed.gov/?id=ED528686
  • Wardenaar, K. J. (2021). Latent profile analysis in R: A tutorial and comparison to Mplus. PsyArXiv. https://doi.org/10.31234/osf.io/wzftr
  • Wise, A. F., Knight, S., & Ochoa, X. (2021). What makes learning analytics research matter. Journal of Learning Analytics, 8(3), 1–9. https://doi.org/10.18608/jla.2021.7647
  • Wistrom, C. A. (2017). Perceptions of school leaders regarding the benefits of leadership dashboards (Order No. 10812880). Available from ProQuest One Academic; Publicly Available Content Database. http://ezproxy.wpi.edu/login?url=https://www.proquest.com/dissertations-theses/perceptions-school-leaders-regarding-benefits/docview/2039026967/se-2
  • Wolgemuth, J. R., Abrami, P. C., Helmer, J., Savage, R., Harper, H., & Lea, T. (2014). Examining the impact of ABRACADABRA on early literacy in Northern Australia: An implementation fidelity analysis. The Journal of Educational Research, 107(4), 299–311. https://doi.org/10.1080/00220671.2013.823369
  • Wu, K. S., Lee, S. S. J., Chen, J. K., Chen, Y. S., Tsai, H. C., Chen, Y. J., Huang, Y. H., & Lin, H. S. (2018). Identifying heterogeneity in the Hawthorne effect on hand hygiene observation: A cohort study of overtly and covertly observed results. BMC Infectious Diseases, 18(1), 1–8. https://doi.org/10.1186/s12879-018-3292-5
  • Wurpts, I. C., & Geiser, C. (2014). Is adding more indicators to a latent class analysis beneficial or detrimental? Results of a Monte-Carlo study. Frontiers in Psychology, 5, 920. https://doi.org/10.3389/fpsyg.2014.00920