Methodological Studies

Making Sense of Effect Sizes: Systematic Differences in Intervention Effect Sizes by Outcome Measure Type

Pages 134–161 | Received 28 May 2021, Accepted 31 Mar 2022, Published online: 01 Aug 2022

References

  • Baird, M. D., & Pane, J. F. (2019). Translating standardized effects of education programs into more interpretable metrics. Educational Researcher, 48(4), 217–228. https://doi.org/10.3102/0013189X19848729
  • Baye, A., Lake, C., Inns, A., & Slavin, R. (2019). A synthesis of quantitative research on reading programs for secondary students. Reading Research Quarterly, 54(2), 133–166.
  • Bloom, H. S., Hill, C. J., Black, A. R., & Lipsey, M. W. (2008). Performance trajectories and performance gaps as achievement effect-size benchmarks for educational interventions. Journal of Research on Educational Effectiveness, 1(4), 289–328. https://doi.org/10.1080/19345740802400072
  • Cheung, A., & Slavin, R. (2016). How methodological features affect effect sizes in education. Educational Researcher, 45(5), 283–292. https://doi.org/10.3102/0013189X16656615
  • Coburn, K., & Vevea, J. (2019). weightr: Estimating weight-function models for publication bias. R package version 2.0.2. https://CRAN.R-project.org/package=weightr
  • de Boer, H., Donker, A., & van der Werf, M. (2014). Effects of the attributes of educational interventions on students’ academic performance: A meta-analysis. Review of Educational Research, 84(4), 509–545. https://doi.org/10.3102/0034654314540006
  • Dietrichson, J., Bøg, M., Filges, T., & Klint Jørgensen, A.-M. (2017). Academic interventions for elementary and middle school students with low socioeconomic status: A systematic review and meta-analysis. Review of Educational Research, 87(2), 243–282. https://doi.org/10.3102/0034654316687036
  • Fryer, R. G., Jr. (2017). The production of human capital in developed countries: Evidence from 196 randomized field experiments. In A. V. Banerjee & E. Duflo (Eds.), Handbook of economic field experiments (Vol. 2, pp. 95–322). North-Holland.
  • Gersten, R., Haymond, K., Newman-Gonchar, R., Dimino, J., & Jayanthi, M. (2020). Meta-analysis of the impact of reading interventions for students in the primary grades. Journal of Research on Educational Effectiveness, 13(2), 401–427. https://doi.org/10.1080/19345747.2019.1689591
  • Hedges, L. (2007). Effect sizes in cluster-randomized designs. Journal of Educational and Behavioral Statistics, 32(4), 341–370. https://doi.org/10.3102/1076998606298043
  • Hedges, L., Tipton, E., & Johnson, M. (2010). Robust variance estimation in meta-regression with dependent effect size estimates. Research Synthesis Methods, 1(1), 39–65. https://doi.org/10.1002/jrsm.5
  • Hill, C. J., Bloom, H. S., Black, A. R., & Lipsey, M. W. (2008). Empirical benchmarks for interpreting effect sizes in research. Child Development Perspectives, 2(3), 172–177. https://doi.org/10.1111/j.1750-8606.2008.00061.x
  • Institute of Education Sciences. (2021). Standards for excellence in education research. https://ies.ed.gov/seer/index.asp
  • John, L., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532.
  • Kraft, M. A. (2020). Interpreting effect sizes of education interventions. Educational Researcher, 49(4), 241–253. https://doi.org/10.3102/0013189X20912798
  • Kulik, J. A., & Fletcher, J. D. (2016). Effectiveness of intelligent tutoring systems: A meta-analytic review. Review of Educational Research, 86(1), 42–78. https://doi.org/10.3102/0034654315581420
  • Li, Q., & Ma, X. (2010). A meta-analysis of the effects of computer technology on school students’ mathematics learning. Educational Psychology Review, 22(3), 215–243. https://doi.org/10.1007/s10648-010-9125-8
  • Lipsey, M. W. (2009). The primary factors that characterize effective interventions with juvenile offenders: A meta-analytic overview. Victims & Offenders, 4(2), 124–147. https://doi.org/10.1080/15564880802612573
  • Lipsey, M. W., Puzio, K., Yun, C., Hebert, M. A., Steinka-Fry, K., Cole, M. W., Roberts, M., Anthony, K. S., & Busick, M. D. (2012). Translating the statistical representation of the effects of education interventions into more readily interpretable forms. National Center for Special Education Research.
  • Lortie-Forgues, H., & Inglis, M. (2019). Rigorous large-scale RCTs are often uninformative: Should we be concerned? Educational Researcher, 48(3), 158–166. https://doi.org/10.3102/0013189X19832850
  • Lynch, K., Hill, H. C., Gonzalez, K. E., & Pollard, C. (2019). Strengthening the research base that informs STEM instructional improvement efforts: A meta-analysis. Educational Evaluation and Policy Analysis, 41(3), 260–293. https://doi.org/10.3102/0162373719849044
  • McBee, M., Makel, M., Peters, S., & Matthews, M. (2017). A manifesto for open science in giftedness research. https://osf.io/qhwg3
  • Olkin, I., & Gleser, L. (2009). Stochastically dependent effect sizes. In H. Cooper, L. Hedges, & J. Valentine (Eds.), The handbook of research synthesis and meta-analysis (pp. 357–376). Russell Sage Foundation.
  • Pellegrini, M., Inns, A., Lake, C., & Slavin, R. (2019). Effects of researcher-made versus independent measures on outcomes of experiments in education [Paper presentation]. Annual Meeting of the Society for Research on Educational Effectiveness, Washington, DC, USA.
  • Petrosino, A., & Soydan, H. (2005). The impact of program developers as evaluators on criminal recidivism: Results from meta-analyses of experimental and quasi-experimental research. Journal of Experimental Criminology, 1(4), 435–450. https://doi.org/10.1007/s11292-005-3540-8
  • Polanin, J. R., Tanner-Smith, E. E., & Hennessy, E. A. (2016). Estimating the difference between published and unpublished effect sizes: A meta-review. Review of Educational Research, 86(1), 207–236. https://doi.org/10.3102/0034654315582067
  • Pustejovsky, J. (2019). clubSandwich: Cluster-robust (sandwich) variance estimators with small-sample corrections. R package version 0.3.5. https://CRAN.R-project.org/package=clubSandwich
  • R Core Team. (2018). R: A language and environment for statistical computing. R Foundation for Statistical Computing. https://www.R-project.org/
  • Rossi, P. H., Lipsey, M. W., & Henry, G. T. (2019). Evaluation: A systematic approach (8th ed.). Sage.
  • Ruiz‐Primo, M. A., Shavelson, R. J., Hamilton, L., & Klein, S. (2002). On the evaluation of systemic science education reform: Searching for instructional sensitivity. Journal of Research in Science Teaching, 39(5), 369–393. https://doi.org/10.1002/tea.10027
  • Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366.
  • Slavin, R. (2020). Meta-analysis or muddle-analysis? Robert Slavin’s Blog. https://robertslavinsblog.wordpress.com/2020/10/08/meta-analysis-or-muddle-analysis/
  • Slavin, R., & Lake, C. (2008). Effective programs in elementary mathematics: A best-evidence synthesis. Review of Educational Research, 78(3), 427–515. https://doi.org/10.3102/0034654308317473
  • Slavin, R., Lake, C., & Groff, C. (2009). Effective programs in middle and high school mathematics: A best-evidence synthesis. Review of Educational Research, 79(2), 839–911. https://doi.org/10.3102/0034654308330968
  • Song, M., & Herman, R. (2010). Critical issues and common pitfalls in designing and conducting impact studies in education: Lessons learned from the What Works Clearinghouse (Phase I). Educational Evaluation and Policy Analysis, 32(3), 351–371. https://doi.org/10.3102/0162373710373389
  • Statistics, Website, and Training (SWAT). (2020). How education leaders use the What Works Clearinghouse website (WWC report) [Manuscript awaiting peer review]. American Institutes for Research (AIR).
  • Statistics, Website, and Training (SWAT) Measurement Small Group. (2020). Preliminary analysis of effect sizes associated with researcher-developed measures (WWC report) [Manuscript awaiting peer review]. American Institutes for Research (AIR).
  • Vevea, J. L., & Hedges, L. V. (1995). A general linear model for estimating effect size in the presence of publication bias. Psychometrika, 60(3), 419–435. https://doi.org/10.1007/BF02294384
  • Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36(3), 1–48. https://doi.org/10.18637/jss.v036.i03
  • Wasserstein, R. L., Schirm, A. L., & Lazar, N. A. (2019). Moving to a world beyond “p < 0.05”. The American Statistician, 73(sup1), 1–19.
  • What Works Clearinghouse. (2020). Using the WWC to find ESSA tiers of evidence. https://ies.ed.gov/ncee/wwc/essa
  • Wiernik, B. M., & Dahlke, J. A. (2020). Obtaining unbiased results in meta-analysis: The importance of correcting for statistical artifacts. Advances in Methods and Practices in Psychological Science, 3(1), 94–123. https://doi.org/10.1177/2515245919885611
  • Williams, R., Citkowicz, M., Lindsay, J., Miller, D., & Walters, K. (2022). Heterogeneity in mathematics intervention effects: Results from a meta-analysis of 191 randomized experiments. Journal of Research on Educational Effectiveness, 1–51. https://doi.org/10.1080/19345747.2021.2009072
  • Wilson, D., Gottfredson, D., & Najaka, S. (2001). School-based prevention of problem behaviors: A meta-analysis. Journal of Quantitative Criminology, 17(3), 247–272. https://doi.org/10.1023/A:1011050217296
  • Wilson, D., & Lipsey, M. (2001). The role of method in treatment effectiveness research: Evidence from meta-analysis. Psychological Methods, 6(4), 413–429.
  • Wolf, R., Morrison, J., Inns, A., Slavin, R., & Risman, K. (2020). Average effect sizes in developer-commissioned and independent evaluations. Journal of Research on Educational Effectiveness, 13(2), 428–447. https://doi.org/10.1080/19345747.2020.1726537
  • What Works Clearinghouse. (2020). Standards handbook, version 4.1. U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. https://ies.ed.gov/ncee/wwc/Docs/referenceresources/WWC-Standards-Handbook-v4-1-508.pdf
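
Several of the entries above are software citations. The following is an illustrative sketch, not the authors' actual analysis code, of how these cited R tools are commonly combined in a meta-analysis of this kind: metafor (Viechtbauer, 2010) fits a random-effects meta-regression, clubSandwich (Pustejovsky, 2019) supplies cluster-robust standard errors for dependent effect sizes (Hedges et al., 2010; Olkin & Gleser, 2009), and weightr (Coburn & Vevea, 2019) fits the Vevea and Hedges (1995) weight-function model for publication bias. The toy data frame and its column names are hypothetical placeholders.

    # Illustrative sketch only; the data below are hypothetical placeholders,
    # not values from the study.
    library(metafor)       # Viechtbauer (2010)
    library(clubSandwich)  # Pustejovsky (2019)
    library(weightr)       # Coburn & Vevea (2019)

    # Toy data: effect sizes (yi), sampling variances (vi), a study/cluster ID,
    # and the moderator of interest (outcome measure type).
    dat <- data.frame(
      yi = c(0.31, 0.12, 0.45, 0.08, 0.22, 0.40),
      vi = c(0.02, 0.03, 0.05, 0.01, 0.04, 0.02),
      study = c(1, 1, 2, 2, 3, 3),
      measure_type = c("researcher", "independent", "researcher",
                       "independent", "researcher", "independent")
    )

    # Random-effects meta-regression of effect size on measure type
    fit <- rma(yi, vi, mods = ~ measure_type, data = dat)

    # Cluster-robust (CR2) standard errors with small-sample correction,
    # treating effect sizes from the same study as dependent
    coef_test(fit, vcov = "CR2", cluster = dat$study)

    # Weight-function model as a publication-bias sensitivity check
    weightfunct(effect = dat$yi, v = dat$vi)

With so few toy rows the small-sample degrees of freedom will be tiny; the point is only to show how the three cited packages interlock, not to reproduce any reported estimate.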
