Theory, Contexts, and Mechanisms

Examining Factors Contributing to Variation in Effect Size Estimates of Teacher Outcomes from Studies of Science Teacher Professional Development

Pages 430-458 | Received 20 Nov 2018, Accepted 21 Jan 2020, Published online: 06 Apr 2020

References

  • Blank, R. K., & de las Alas, N. (2009). The effects of teacher professional development on gains in student achievement: How meta-analysis provides scientific evidence useful to education leaders. Washington, DC: Council of Chief State School Officers.
  • Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2009). Introduction to meta-analysis. Chichester, UK: John Wiley & Sons.
  • Borko, H., Jacobs, J., & Koellner, K. (2010). Contemporary approaches to teacher professional development. In P. Peterson, E. Baker, & B. McGaw (Eds.), International encyclopedia of education (Vol. 7, pp. 548–556). Oxford: Elsevier.
  • Capps, D. K., Crawford, B. A., & Constas, M. A. (2012). A review of empirical literature on inquiry professional development: Alignment with best practices and a critique of the findings. Journal of Science Teacher Education, 23(3), 291–318.
  • Caro, D. H. (2015). Causal mediation in educational research: An illustration using international assessment data. Journal of Research on Educational Effectiveness, 8(4), 577–597. doi:10.1080/19345747.2015.1086913
  • Chetty, R., Friedman, J. N., & Rockoff, J. E. (2014). Measuring the impacts of teachers II: Teacher value-added and student outcomes in adulthood. American Economic Review, 104(9), 2633–2679. doi:10.1257/aer.104.9.2633
  • Cheung, A. C., & Slavin, R. E. (2016). How methodological features affect effect sizes in education. Educational Researcher, 45(5), 283–292. doi:10.3102/0013189X16656615
  • Cheung, A., Slavin, R. E., Kim, E., & Lake, C. (2017). Effective secondary science programs: A best-evidence synthesis. Journal of Research in Science Teaching, 54(1), 58–81.
  • Cicchetti, D. V. (1994). Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment, 6(4), 284–290. doi:10.1037/1040-3590.6.4.284
  • Cochrane Collaboration. (2014). Results should not be reported as statistically significant or statistically non-significant. Retrieved from http://editorial-unit.cochrane.org/blog/results-should-not-be-reported-statistically-significant-or-statistically-non-significant
  • Cohen, D. K., & Hill, H. C. (2000). Instructional policy and classroom performance: The mathematics reform in California. Teachers College Record, 102(2), 294–343. doi:10.1111/0161-4681.00057
  • Cooper, H., Hedges, L. V., & Valentine, J. C. (Eds.). (2009). The handbook of research synthesis and meta-analysis. New York, NY: Russell Sage Foundation.
  • Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational Researcher, 38(3), 181–199. doi:10.3102/0013189X08331140
  • Desimone, L. M., & Garet, M. S. (2015). Best practices in teachers’ professional development in the United States. Psychology, Society, & Education, 7(3), 252–263. doi:10.25115/psye.v7i3.515
  • Fisher, Z., & Tipton, E. (2015). Robumeta: An R-package for robust variance estimation in meta-analysis. arXiv:1503.02220. Retrieved from https://arxiv.org/pdf/1503.02220v1
  • Fisher, Z., Tipton, E., & Hou, Z. (2017). Robumeta: Robust variance meta-regression. R package (Version 2.0) [Computer software]. Retrieved from https://CRAN.R-project.org/package=robumeta
  • Fullan, M., & Stiegelbauer, S. (1991). The new meaning of educational change. New York, NY: Teachers College Press.
  • Garet, M., Wayne, A., Stancavage, F., Taylor, J., Eaton, M., Walters, K., & Doolittle, F. (2011). Middle school mathematics professional development impact study: Findings after the second year of implementation (NCEE 2011-4024). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
  • Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., … Silverberg, M. (2008). The impact of two professional development interventions on early reading instruction and achievement. Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
  • Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal, 38(4), 915–945. doi:10.3102/00028312038004915
  • Garet, M. S., Wayne, A. J., Stancavage, F., Taylor, J., Walters, K., Song, M., & Doolittle, F. (2010). Middle school mathematics professional development impact study: Findings after the first year of implementation (NCEE 2010-4009). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
  • Gersten, R., Dimino, J., Jayanthi, M., Kim, J. S., & Santoro, L. E. (2010). Teacher study group: Impact of the professional development model on reading instruction and student outcomes in first grade classrooms. American Educational Research Journal, 47(3), 694–739. doi:10.3102/0002831209361208
  • Hedges, L. V., & Hedberg, E. C. (2007). Intraclass correlation values for planning group-randomized trials in education. Educational Evaluation and Policy Analysis, 29(1), 60–87. doi:10.3102/0162373707299706
  • Hedges, L. V., Tipton, E., & Johnson, M. C. (2010a). Robust variance estimation in meta‐regression with dependent effect size estimates. Research Synthesis Methods, 1(1), 39–65. doi:10.1002/jrsm.5
  • Hedges, L. V., Tipton, E., & Johnson, M. C. (2010b). Erratum: Robust variance estimation in meta‐regression with dependent effect size estimates. Research Synthesis Methods, 1(2), 164–165. doi:10.1002/jrsm.17
  • Hewitt, C. E., Mitchell, N., & Torgerson, D. J. (2008). Listen to the data when results are not significant. BMJ, 336(7634), 23–25. doi:10.1136/bmj.39379.359560.AD
  • Hill, C. J., Bloom, H. S., Black, A. R., & Lipsey, M. W. (2008). Empirical benchmarks for interpreting effect sizes in research. Child Development Perspectives, 2(3), 172–177. doi:10.1111/j.1750-8606.2008.00061.x
  • Hill, H. C., Beisiegel, M., & Jacob, R. (2013). Professional development research: Consensus, crossroads, and challenges. Educational Researcher, 42(9), 476–487. doi:10.3102/0013189X13512674
  • Jacob, R., Hill, H., & Corey, D. (2017). The impact of a professional development program on teachers’ mathematical knowledge for teaching, instruction, and student achievement. Journal of Research on Educational Effectiveness, 10(2), 379–407. doi:10.1080/19345747.2016.1273411
  • Kane, T. J., McCaffrey, D. F., Miller, T., & Staiger, D. O. (2013). Have we identified effective teachers? Validating measures of effective teaching using random assignment. [Research Paper]. MET Project. Bill & Melinda Gates Foundation. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.638.2716
  • Lee, O., Llosa, L., Jiang, F., Haas, A., O’Connor, C., & Van Booven, C. D. (2016). Elementary teachers’ science knowledge and instructional practices: Impact of an intervention focused on English language learners. Journal of Research in Science Teaching, 53(4), 579–597. doi:10.1002/tea.21314
  • Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Thousand Oaks, CA: SAGE.
  • Llosa, L., Lee, O., Jiang, F., Haas, A., O’Connor, C., Van Booven, C. D., & Kieffer, M. J. (2016). Impact of a large-scale science intervention focused on English language learners. American Educational Research Journal, 53(2), 395–424. doi:10.3102/0002831216637348
  • Lynch, K., Hill, H. C., Gonzalez, K. E., & Pollard, C. (2019). Strengthening the research base that informs STEM instructional improvement efforts: A meta-analysis. Educational Evaluation and Policy Analysis, 41(3), 260–293. doi:10.3102/0162373719849044
  • National Academies of Sciences, Engineering, and Medicine. (2015). Science teachers’ learning: Enhancing opportunities, creating supportive contexts. Washington, DC: The National Academies Press.
  • National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
  • Nye, B., Konstantopoulos, S., & Hedges, L. V. (2004). How large are teacher effects? Educational Evaluation and Policy Analysis, 26(3), 237–257. doi:10.3102/01623737026003237
  • Penuel, W. R., Fishman, B. J., Yamaguchi, R., & Gallagher, L. P. (2007). What makes professional development effective? Strategies that foster curriculum implementation. American Educational Research Journal, 44(4), 921–958. doi:10.3102/0002831207308221
  • Penuel, W. R., Gallagher, L. P., & Moorthy, S. (2011). Preparing teachers to design sequences of instruction in earth systems science: A comparison of three professional development programs. American Educational Research Journal, 48(4), 996–1025. doi:10.3102/0002831211410864
  • Podgursky, M. J., & Springer, M. G. (2007). Teacher performance pay: A review. Journal of Policy Analysis and Management, 26(4), 909–950. doi:10.1002/pam.20292
  • Polanin, J. R., Tanner-Smith, E. E., & Hennessy, E. A. (2016). Estimating the difference between published and unpublished effect sizes: A meta-review. Review of Educational Research, 86(1), 207–236. doi:10.3102/0034654315582067
  • Pustejovsky, J. (2018). clubSandwich: Cluster-robust (sandwich) variance estimators with small-sample corrections. R package (Version 0.3.0) [Computer software]. Retrieved from https://github.com/jepusto/clubSandwich
  • Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.
  • Roth, K. J., Wilson, C. D., Taylor, J. A., Stuhlsatz, M. A., & Hvidsten, C. (2019). Comparing the effects of analysis-of-practice and content-based professional development on teacher and student outcomes in science. American Educational Research Journal, 56(4), 1217–1253. doi:10.3102/0002831218814759
  • Scher, L., & O’Reilly, F. (2009). Professional development for K–12 math and science teachers: What do we really know? Journal of Research on Educational Effectiveness, 2(3), 209–249. doi:10.1080/19345740802641527
  • Schochet, P. Z. (2008). Statistical power for random assignment evaluations of education programs. Journal of Educational and Behavioral Statistics, 33(1), 62–87. doi:10.3102/1076998607302714
  • Slavin, R. E., & Madden, N. A. (2011). Measures inherent to treatments in program effectiveness reviews. Journal of Research on Educational Effectiveness, 4(4), 370–380. doi:10.1080/19345747.2011.558986
  • Spybrook, J., Bloom, H., Congdon, R., Hill, C., Martinez, A., Raudenbush, S., & To, A. (2011). Optimal design plus empirical evidence: Documentation for the “Optimal Design” software. New York, NY: William T. Grant Foundation.
  • Tanner-Smith, E. E., Tipton, E., & Polanin, J. R. (2016). Handling complex meta-analytic data structures using robust variance estimates: A tutorial in R. Journal of Developmental and Life-Course Criminology, 2(1), 85–112. doi:10.1007/s40865-016-0026-5
  • Taylor, J. A., Kowalski, S. M., Polanin, J. R., Askinas, K., Stuhlsatz, M. A., Wilson, C. D., … Wilson, S. J. (2018). Investigating science education effect sizes: Implications for power analyses and programmatic decisions. AERA Open, 4(3), 233285841879199. doi:10.1177/2332858418791991
  • Tipton, E. (2015). Small sample adjustments for robust variance estimation with meta-regression. Psychological Methods, 20(3), 375–393. doi:10.1037/met0000011
  • Tipton, E., & Pustejovsky, J. E. (2015). Small-sample adjustments for tests of moderators and model fit using robust variance estimation in meta-regression. Journal of Educational and Behavioral Statistics, 40(6), 604–634. doi:10.3102/1076998615606099
  • Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36(3), 1–48. doi:10.18637/jss.v036.i03
  • Wallace, B. C., Small, K., Brodley, C. E., Lau, J., & Trikalinos, T. A. (2012, January). Deploying an interactive machine learning system in an evidence-based practice center: Abstrackr. Proceedings of the 2nd ACM International Health Informatics Symposium (pp. 819–824). New York, NY: ACM.
  • Westine, C., Unlu, F., Taylor, J., Spybrook, J., Zhang, Q., & Anderson, B. (under review). Design parameter values for impact evaluations of science and mathematics interventions involving teacher outcomes.
  • Wilson, D. B. (n.d.). Practical meta-analysis effect size calculator [Online calculator]. Retrieved from https://www.campbellcollaboration.org/research-resources/research-for-resources/effect-size-calculator.html
