Abstract
Implementation planning is a consultative educator support strategy with preliminary evidence of effectiveness from single-case design research. This meta-analysis investigates the impacts of implementation planning on educator implementation and student outcomes and explores moderators of these relationships. Principles of open science and protections against publication bias are applied to encourage confidence, enhance transparency, and support reproducibility. Seven multiple-baseline studies, including published journal articles and unpublished dissertations, were identified using all possible combinations of the terms: (a) implementation planning, action planning, or coping planning, and (b) fidelity or integrity. Log response ratio effect sizes were calculated for each of the 31 included cases across all relevant dependent variables. A random effects meta-regression model was estimated for each dependent variable and moderator analyses were conducted to determine the impacts of implementer role, case type, intervention type, delivery modality, baseline type, and publication type. Results indicate that implementation planning had a significant effect on adherence and non-significant effects on quality, disruptive behavior, and academic engagement. Moderator analyses were significant for intervention type; implementation planning was more effective for behavior support plans than academic interventions or classroom management plans. Limitations and implications for open science research and school psychology practice are discussed.
Impact Statement
This meta-analysis evaluates the impacts of a consultation-based support strategy, implementation planning, on educator implementation and student outcomes. Data from seven studies with 31 unique cases are summarized. Findings suggest that implementation planning has a significant effect on educators' delivery of student interventions, particularly when delivering individualized behavior support plans.
Notes on contributors
Alexandra M. Pierce
Alexandra M. Pierce, PhD is an assistant research professor for the Institute for Collaboration on Health, Intervention, and Policy (InCHIP) at the University of Connecticut. Her research interests include implementation science, positive behavior intervention, consultation, and treatment fidelity measurement.
Lisa M. H. Sanetti
Lisa M. H. Sanetti, PhD is a professor of school psychology in the Neag School of Education at the University of Connecticut. Her primary research interests include implementation science and educator well-being.
Melissa A. Collier-Meek
Melissa A. Collier-Meek, PhD, BCBA is an associate professor of school psychology at Teachers College, Columbia University. Her research interests include school-based implementation, implementation support, teacher consultation and coaching, multitiered systems of support, and function-based behavior intervention.
Austin H. Johnson
Austin H. Johnson, PhD, is an associate professor in the School of Education at the University of California, Riverside, where he also serves as associate dean of undergraduate education. He received his PhD in Educational Psychology, with a concentration in School Psychology, from the University of Connecticut in 2014.