Toward a New Approach to the Evaluation of a Digital Curriculum Using Learning Analytics

Pages 89–104 | Received 08 Nov 2013, Accepted 05 Dec 2013, Published online: 24 Mar 2015

References

  • Andergassen, M., Mödritscher, F., & Neumann, G. (2014). Practice and repetition during exam preparation in blended learning courses: Correlations with learning results. Journal of Learning Analytics, 1(1), 48–74.
  • Baker, R. S., Corbett, A. T., Koedinger, K. R., & Wagner, A. Z. (2004). Off-task behavior in the cognitive tutor classroom: When students game the system. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 383–390). New York, NY: ACM.
  • Berman, P., & McLaughlin, M. W. (1976, March). Implementation of educational innovation. The Educational Forum, 40(3), 345–370. http://dx.doi.org/10.1080/00131727609336469
  • Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, DC: SRI International.
  • Bybee, R. W., Taylor, J. A., Gardner, A., Van Scotter, P., Powell, J. C., Westbrook, A., & Landes, N. (2006). The BSCS 5E instructional model: Origins and effectiveness. Colorado Springs, CO: BSCS.
  • Century, J., Cassata, A., Rudnick, M., & Freeman, C. (2012). Measuring enactment of innovations and the factors that affect implementation and sustainability: Moving toward common language and shared conceptual understanding. Journal of Behavioral Health Services and Research, 39(4), 343–361. http://dx.doi.org/10.1007/s11414-012-9287-x
  • Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of implementation: A foundation for shared language and accumulation of knowledge. American Journal of Evaluation, 31(2), 199–218. http://dx.doi.org/10.1177/1098214010366173
  • Chumley-Jones, H. S., Dobbie, A., & Alford, C. L. (2002). Web-based learning: Sound educational method or hype? A review of the evaluation literature. Academic Medicine, 77(10), S86–S93. http://dx.doi.org/10.1097/00001888-200210001-00028
  • Cocea, M., Hershkovitz, A., & Baker, R. (2009). The impact of off-task and gaming behaviors on learning: Immediate or aggregate? In V. Dimitrova, R. Mizoguchi, B. Du Boulay, & A. Graesser (Eds.), Proceedings of the 2009 Conference on Artificial Intelligence in Education: Building learning systems that care: From knowledge representation to affective modelling. Frontiers in artificial intelligence and applications (pp. 507–514). Amsterdam, the Netherlands: IOS Press.
  • Corbi, A., & Burgos, D. (2014). Review of current student-monitoring techniques used in eLearning-focused recommender systems and learning analytics: The Experience API & LIME model case study. International Journal of Interactive Multimedia and Artificial Intelligence, 2(7), 44–52. http://dx.doi.org/10.9781/ijimai.2014.276
  • Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18, 23–45. http://dx.doi.org/10.1016/S0272-7358(97)00043-3
  • Dietz-Uhler, B., & Hurn, J. (2013). Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning, 12(1), 17–26.
  • D’Mello, S. K., Lehman, B., & Graesser, A. C. (2011). A motivationally supportive affect-sensitive AutoTutor. In R. A. Calvo & S. K. D’Mello (Eds.), New perspectives on affect and learning technologies (pp. 113–126). New York, NY: Springer.
  • D'Mello, S., Olney, A., & Person, N. (2010). Mining collaborative patterns in tutorial dialogues. Journal of Educational Data Mining, 2(1), 2–37.
  • Doan, T. A., Zhang, J., Tjhi, W. C., & Lee, B. S. (2011). Analyzing students’ usage of e-learning systems in the cloud for course management. In F.-Y. Yu, T. Hirashima, T. Supnithi, & G. Biswas (Eds.), Proceedings of the 19th International Conference on Computers in Education: ICCE 2011 (pp. 297–301). Bangkok, Thailand: National Electronics and Computer Technology Center.
  • Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350. http://dx.doi.org/10.1007/s10464-008-9165-0
  • Enders, C. K., & Tofighi, D. (2007). Centering predictor variables in cross-sectional multilevel models: A new look at an old issue. Psychological Methods, 12, 121–138. http://dx.doi.org/10.1037/1082-989X.12.2.121
  • Franks, R. P., & Schroeder, J. (2013). Implementation science: What do we know and where do we go from here? In T. Halle, A. Metz, & I. Martinez-Beck (Eds.), Applying implementation science in early childhood programs and systems (pp. 5–20). Baltimore, MD: Brookes.
  • Fullan, M., & Pomfret, A. (1977). Research on curriculum and instruction implementation. Review of Educational Research, 47, 335–397. http://dx.doi.org/10.3102/00346543047002335
  • Hagermoser Sanetti, L. M., & Kratochwill, T. R. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38(4), 445–459.
  • Heffernan, N., Militello, M., Heffernan, C., & Decoteau, M. (2012). Effective and meaningful use of educational technology: Three cases from the classroom. In C. Dede & J. Richards (Eds.), Digital teaching platforms (pp. 88–102). New York, NY: Teachers College Press.
  • Hulleman, C. S., & Cordray, D. S. (2009). Moving from the lab to the field: The role of fidelity and achieved relative intervention strength. Journal of Research on Educational Effectiveness, 2(1), 88–110. http://dx.doi.org/10.1080/19345740802539325
  • Khribi, M. K., Jemni, M., & Nasraoui, O. (2015). Recommendation systems for personalized technology-enhanced learning. In Kinshuk & R. Huang (Eds.), Ubiquitous learning environments and technologies (pp. 159–180). Berlin, Germany: Springer.
  • Koedinger, K. R., McLaughlin, E. A., & Heffernan, N. T. (2010). A quasi-experimental evaluation of an on-line formative assessment and tutoring system. Journal of Educational Computing Research, 43(4), 489–510. http://dx.doi.org/10.2190/EC.43.4.d
  • Loftus, W. (2012). Demonstrating success: Web analytics and continuous improvement. Journal of Web Librarianship, 6, 45–55. http://dx.doi.org/10.1080/19322909.2012.651416
  • McLaughlin, M. (1976). Implementation as mutual adaptation: Change in classroom organization. Teachers College Record, 77(3), 339–351.
  • McQuiggan, S. W., Mott, B. W., & Lester, J. C. (2008). Modeling self-efficacy in intelligent tutoring systems: An inductive approach. User Modeling and User-Adapted Interaction, 18(1–2), 81–123. http://dx.doi.org/10.1007/s11257-007-9040-y
  • Means, B., Anderson, K., & Thomas, S. (2012). Expanding evidence approaches for learning in a digital world. Washington, DC: U.S. Department of Education.
  • Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education.
  • Mendicino, M., Razzaq, L., & Heffernan, N. T. (2009). A comparison of traditional homework to computer-supported homework. Journal of Research on Technology in Education, 41(3), 331–359. http://dx.doi.org/10.1080/15391523.2009.10782534
  • Monroy, C., Rangel, V. S., & Whitaker, R. (2013). STEMscopes: Contextualizing learning analytics in a K–12 science curriculum. Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 210–219). New York, NY: ACM Press.
  • Monroy, C., Rangel, V. S., & Whitaker, R. (2014). A strategy for incorporating learning analytics into the design and evaluation of a K–12 science curriculum. Journal of Learning Analytics, 1(2), 94–125.
  • Mossberger, K., Tolbert, C., & Gilbert, M. (2006). Race, place and information technology. Urban Affairs Review, 41(4), 1–38.
  • Muñoz-Merino, P. J., Valiente, J. A. R., & Kloos, C. D. (2013, April). Inferring higher level learning information from low level data for the Khan Academy platform. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 112–116). New York, NY: ACM.
  • Nelson, M. C., Cordray, D. S., Hulleman, C. S., Darrow, C. L., & Sommer, E. C. (2012). A procedure for assessing intervention fidelity in experiments testing educational and behavioral interventions. Journal of Behavioral Health Services and Research, 39(4), 374–396. http://dx.doi.org/10.1007/s11414-012-9295-x
  • Olmos, M., & Corrin, L. (2012). Learning analytics: A case study of the process of design of visualizations. Journal of Asynchronous Learning Networks, 16(3), 39–49.
  • Özyurt, Ö., Özyurt, H., Baki, A., & Güven, B. (2013). Integration into mathematics classrooms of an adaptive and intelligent individualized e-learning environment: Implementation and evaluation of UZWEBMAT. Computers in Human Behavior, 29(3), 726–738. http://dx.doi.org/10.1016/j.chb.2012.11.013
  • Pardos, Z. A., Baker, R. S. J. D., San Pedro, M. O. C. Z., Gowda, S. M., & Gowda, S. M. (2014). Affective states and state tests: Investigating how affect and engagement during the school year predict end-of-year learning outcomes. Journal of Learning Analytics, 1(1), 107–128.
  • Penuel, W. R., & Means, B. (2004). Implementation variation and fidelity in an inquiry science program: Analysis of GLOBE data reporting patterns. Journal of Research in Science Teaching, 41(3), 294–315. http://dx.doi.org/10.1002/tea.20002
  • Quint, J. C., Balu, R., DeLaurentis, M., Rappaport, S., Smith, T. J., & Zhu, P. (2013). Early findings from the Investing in Innovation (i3) scale-up. New York, NY: MDRC.
  • Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage.
  • Rowan, M., & Dehlinger, J. (2014). Privacy incongruity: An analysis of a survey of mobile end-users. In Proceedings of the 13th International Conference on Security and Management. Detroit, MI: University of Detroit Mercy.
  • Scheirer, M. A., & Rezmovic, E. L. (1983). Measuring the degree of program implementation. Evaluation Review, 7(5), 599–633. http://dx.doi.org/10.1177/0193841X8300700502
  • Siemens, G. (2012, April). Learning analytics: Envisioning a research discipline and a domain of practice. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 4–8). New York, NY: ACM.
  • Smith, V. C., Lange, A., & Huston, D. R. (2012). Predictive modeling to forecast student outcomes and drive effective interventions in online community college courses. Journal of Asynchronous Learning Networks, 16(3), 51–61.
  • Snodgrass Rangel, V., Bell, E. R., Monroy, C., & Whitaker, J. R. (2013, April). A year in review: Year 1 results from an evaluation of an online blended science curriculum model in an urban district. Poster presented at the American Educational Research Association Annual Meeting, San Francisco, CA.
  • Solove, D. J. (2013). Introduction: Privacy self-management and the consent dilemma. Harvard Law Review, 126, 1880.
  • Song, M., & Herman, R. (2010). Critical issues and common pitfalls in designing and conducting impact studies in education: Lessons learned from the What Works Clearinghouse. Educational Evaluation and Policy Analysis, 32(2), 351–371. http://dx.doi.org/10.3102/0162373710373389
  • Turner, S. J. (2010). Website statistics 2.0: Using Google Analytics to measure library website effectiveness. Technical Services Quarterly, 27(3), 261–278. http://dx.doi.org/10.1080/07317131003765910
  • Wang, H., & Woodworth, K. (2011, September). A randomized controlled trial of two online mathematics curricula. Paper presented at the Society for Research on Educational Effectiveness Fall 2011 Conference, Washington, DC.
  • Warschauer, M., & Matuchniak, T. (2010). New technology and digital worlds: Analyzing evidence of equity in access, use, and outcomes. Review of Research in Education, 34(1), 179–225. http://dx.doi.org/10.3102/0091732X09349791
  • Wayman, J. C., Cho, V., & Shaw, S. M. (2009a). First-year results from an efficacy study of the Acuity data system. Austin, TX: University of Texas at Austin.
  • Wayman, J. C., Cho, V., & Shaw, S. (2009b). Survey of educator data use. Austin, TX: University of Texas at Austin.
  • Wayman, J. C., Shaw, S. M., & Cho, V. (2011). Second-year results from an efficacy study of the Acuity data system. Austin, TX: University of Texas at Austin.
  • Willis, J. E., III, Campbell, J., & Pistilli, M. (2013). Ethics, big data, and analytics: A model for application. Educause Review Online. Retrieved from http://apo.org.au/research/ethics-big-data-and-analytics-model-application
  • Winne, P. H., & Baker, R. S. (2013). The potentials of educational data mining for researching metacognition, motivation and self-regulated learning. Journal of Educational Data Mining, 5(1), 1–8.
  • Xie, C., Zhang, H., Nourian, S., Pallant, A., & Bailey, S. (2014). On the instructional sensitivity of CAD logs. International Journal of Engineering Education, 30(4), 760–778.
  • Xie, C., Zhang, H., Nourian, S., Pallant, A., & Hazzard, E. (2014). A time series analysis method for assessing engineering design processes using a CAD tool. International Journal of Engineering Education, 30(1), 218–230.
  • Yu, T., & Jo, I.-H. (2014). Educational technology approach toward learning analytics: Relationship between student online behaviour and learning performance in higher education. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 269–270). New York, NY: ACM. http://dx.doi.org/10.1145/2567574.2567594
  • Zuiker, S., & Whitaker, J. R. (2014). Refining inquiry with multi-form assessment: Formative and summative assessment functions for flexible inquiry. International Journal of Science Education, 36(6), 1037–1059. http://dx.doi.org/10.1080/09500693.2013.834489
