
Re-conceptualizing SMART designs as a hybrid of randomized and regression discontinuity designs: opportunities, cautions

Pages 140-155 | Received 10 Feb 2021, Accepted 18 May 2023, Published online: 04 Jun 2023

