INTEGRATING ATD AND CCD

Meta-analysis of single-case experimental designs: How can alternating treatments and changing criterion designs be included?


References

  • Allison, D. B., Faith, M. S., & Franklin, R. D. (1995). Antecedent exercise in the treatment of disruptive behavior: A meta-analytic review. Clinical Psychology: Science and Practice, 2(3), 279–304. https://doi.org/10.1111/j.1468-2850.1995.tb00045.x
  • Allison, D. B., & Gorman, B. S. (1993). Calculating effect sizes for meta-analysis: The case of the single case. Behaviour Research and Therapy, 31(6), 621–631. https://doi.org/10.1016/0005-7967(93)90115-B
  • Angell, M. E., Nicholson, J. K., Watts, E. H., & Blum, C. (2011). Using a multicomponent adapted power card strategy to decrease latency during interactivity transitions for three children with developmental disabilities. Focus on Autism and Other Developmental Disabilities, 26(4), 206–217. https://doi.org/10.1177/1088357611421169
  • Barlow, D. H., & Hayes, S. C. (1979). Alternating treatments design: One strategy for comparing the effects of two treatments in a single subject. Journal of Applied Behavior Analysis, 12(2), 199–210. https://doi.org/10.1901/jaba.1979.12-199
  • Barton, E. E., Pustejovsky, J. E., Maggin, D. M., & Reichow, B. (2017). Technology-aided instruction and intervention for students with ASD: A meta-analysis using novel methods of estimating effect sizes for single-case research. Remedial and Special Education, 38(6), 371–386. https://doi.org/10.1177/0741932517729508
  • Becraft, J. L., Borrero, J. C., Sun, S., & McKenzie, A. A. (2020). A primer for using multilevel models to meta‐analyze single case design data with AB phases. Journal of Applied Behavior Analysis, 53(3), 1799–1821. https://doi.org/10.1002/jaba.698
  • Browder, D. M., & Xin, Y. P. (1998). A meta-analysis and review of sight word research and its implications for teaching functional reading to individuals with moderate and severe disabilities. The Journal of Special Education, 32(3), 130–153. https://doi.org/10.1177/002246699803200301
  • Burke, M. D., Boon, R. T., Hatton, H., & Bowman-Perrott, L. (2015). Reading interventions for middle and secondary students with emotional and behavioral disorders: A quantitative review of single-case studies. Behavior Modification, 39(1), 43–68. https://doi.org/10.1177/0145445514547958
  • Burns, M. K., Zaslofsky, A. F., Kanive, R., & Parker, D. C. (2012). Meta-analysis of incremental rehearsal using Phi coefficients to compare single-case and group designs. Journal of Behavioral Education, 21(3), 185–202. https://doi.org/10.1007/s10864-012-9160-2
  • Busk, P. L., & Serlin, R. C. (1992). Meta-analysis for single-case research. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case research designs and analysis: New directions for psychology and education (pp. 187–212). Erlbaum.
  • Carr, M. E., Moore, D. W., & Anderson, A. (2014). Self-management interventions on students with autism: A meta-analysis of single-subject research. Exceptional Children, 81(1), 28–44. https://doi.org/10.1177/0014402914532235
  • Carr, J. E., Severtson, J. M., & Lepper, T. L. (2009). Noncontingent reinforcement is an empirically supported treatment for problem behavior exhibited by individuals with developmental disabilities. Research in Developmental Disabilities, 30(1), 44–57. https://doi.org/10.1016/j.ridd.2008.03.002
  • Carter, M. (2013). Reconsidering overlap-based measures for quantitative synthesis of single-subject data: What they tell us and what they don’t. Behavior Modification, 37(3), 378–390. https://doi.org/10.1177/0145445513476609
  • Craig, A. R., & Fisher, W. W. (2019). Randomization tests as alternative analysis methods for behavior‐analytic data. Journal of the Experimental Analysis of Behavior, 111(2), 309–328. https://doi.org/10.1002/jeab.500
  • Dallery, J., & Raiff, B. R. (2014). Optimizing behavioral health interventions with single-case designs: From development to dissemination. Translational Behavioral Medicine, 4(3), 290–303. https://doi.org/10.1007/s13142-014-0258-z
  • Darlington, R. B., & Hayes, A. F. (2000). Combining independent p values: Extensions of the Stouffer and binomial methods. Psychological Methods, 5(4), 496–515. https://doi.org/10.1037/1082-989X.5.4.496
  • Declercq, L., Cools, W., Beretvas, S. N., Moeyaert, M., Ferron, J. M., & Van den Noortgate, W. (2020). MultiSCED: A tool for (meta-)analyzing single-case experimental data with multilevel modeling. Behavior Research Methods, 52(1), 177–192. https://doi.org/10.3758/s13428-019-01216-2
  • Delfs, C. H., & Campbell, J. M. (2010). A quantitative synthesis of developmental disability research: The impact of functional assessment methodology on treatment effectiveness. The Behavior Analyst Today, 11(1), 4–19. https://doi.org/10.1037/h0100685
  • Edgington, E. S. (1972). An additive method for combining probability values from independent experiments. The Journal of Psychology, 80(2), 351–363. https://doi.org/10.1080/00223980.1972.9924813
  • Edgington, E. S. (1975). Randomization tests for one-subject operant experiments. The Journal of Psychology, 90(1), 57–68. https://doi.org/10.1080/00223980.1975.9923926
  • Faith, M. S., Allison, D. B., & Gorman, D. B. (1996). Meta-analysis of single-case research. In R. D. Franklin, D. B. Allison, & B. S. Gorman (Eds.), Design and analysis of single-case research (pp. 245–277). Erlbaum.
  • Ferron, J., Rohrer, L. L., & Levin, J. R. (2019). Randomization procedures for changing criterion designs. Behavior Modification. Advance online publication. https://doi.org/10.1177/0145445519847627
  • Flippin, M., Reszka, S., & Watson, L. R. (2010). Effectiveness of the picture exchange communication system (PECS) on communication and speech for children with autism spectrum disorder: A meta-analysis. American Journal of Speech-Language Pathology, 19(2), 178–195. https://doi.org/10.1044/1058-0360(2010/09-0022)
  • Gage, N. A., & Lewis, T. J. (2014). Hierarchical linear modeling meta-analysis of single-subject design research. The Journal of Special Education, 48(1), 3–16. https://doi.org/10.1177/0022466912443894
  • Ganz, J. B., & Ayres, K. M. (2018). Methodological standards in single-case experimental design: Raising the bar. Research in Developmental Disabilities, 79(1), 3–9. https://doi.org/10.1016/j.ridd.2018.03.003
  • Garwood, J. D., Brunsting, N. C., & Fox, L. C. (2014). Improving reading comprehension and fluency outcomes for adolescents with emotional-behavioral disorders: Recent research synthesized. Remedial and Special Education, 35(3), 181–194. https://doi.org/10.1177/0741932513514856
  • Graham, J. E., Karmarkar, A. M., & Ottenbacher, K. J. (2012). Small sample research designs for evidence-based rehabilitation: Issues and methods. Archives of Physical Medicine and Rehabilitation, 93(8), S111–S116. https://doi.org/10.1016/j.apmr.2011.12.017
  • Hammond, D., & Gast, D. L. (2010). Descriptive analysis of single subject research designs: 1983-2007. Education and Training in Autism and Developmental Disabilities, 45(2), 187–202.
  • Hart, S. L., & Banda, D. R. (2010). Picture exchange communication system with individuals with developmental disabilities: A meta-analysis of single subject studies. Remedial and Special Education, 31(6), 476–488. https://doi.org/10.1177/0741932509338354
  • Hartmann, D. P., & Hall, R. V. (1976). The changing criterion design. Journal of Applied Behavior Analysis, 9(4), 527–532. https://doi.org/10.1901/jaba.1976.9-527
  • Heinicke, M. R., & Carr, J. E. (2014). Applied behavior analysis in acquired brain injury rehabilitation: A meta-analysis of single-case design intervention research. Behavioral Interventions, 29(2), 77–105. http://doi.org/10.1002/bin
  • Hernan, C. J., Collins, T. A., Morrison, J. Q., & Kroeger, S. D. (2019). Decreasing inappropriate use of mobile devices in urban high school classrooms: Comparing an antecedent intervention with and without the good behavior game. Behavior Modification, 43(3), 439–463. https://doi.org/10.1177/0145445518764343
  • Heyvaert, M., Moeyaert, M., Ugille, M., Van Den Noortgate, W., & Onghena, P. (2014). Comparing hierarchical linear models and randomization tests in the analysis of multiple baseline data. Paper presented at the Annual Meeting of the American Educational Research Association.
  • Heyvaert, M., Saenen, L., Maes, B., & Onghena, P. (2015). Comparing the percentage of non-overlapping data approach and the hierarchical linear modeling approach for synthesizing single-case studies in autism research. Research in Autism Spectrum Disorders, 11, 112–125. https://doi.org/10.1016/j.rasd.2014.12.002
  • Holcombe, A., Wolery, M., & Gast, D. L. (1994). Comparative single subject research: Description of designs and discussion of problems. Topics in Early Childhood Special Education, 14(1), 168–190. https://doi.org/10.1177/027112149401400111
  • Huitema, B. E., & McKean, J. W. (2000). Design specification issues in time-series intervention models. Educational and Psychological Measurement, 60(1), 38–58. https://doi.org/10.1177/00131640021970358
  • Jamshidi, L., Declercq, L., Fernández-Castilla, B., Ferron, J. M., Moeyaert, M., Beretvas, S. N., & Van den Noortgate, W. (2021). Bias adjustment in multilevel meta-analysis of standardized single-case experimental data. The Journal of Experimental Education, 89(2), 344–361. https://doi.org/10.1080/00220973.2019.1658568
  • Jamshidi, L., Heyvaert, M., Declercq, L., Fernández Castilla, B., Ferron, J. M., Moeyaert, M., Beretvas, S. N., Onghena, P., & Van den Noortgate, W. (2018). Methodological quality of meta-analyses of single-case experimental studies. Research in Developmental Disabilities, 79, 97–115. https://doi.org/10.1016/j.ridd.2017.12.016
  • Joo, S.-H., Wang, Y., Ferron, J., Beretvas, S. N., Moeyaert, M., & Van Den Noortgate, W. (2021). Comparison of within- and between-series effect estimates in the meta-analysis of multiple baseline studies. Journal of Educational and Behavioral Statistics. Advance online publication. https://doi.org/10.3102/10769986211035507
  • Kazdin, A. E. (1989). Behavior modification in applied settings (4th ed.). Brooks/Cole Publishing Company.
  • Klein, L. A., Houlihan, D., Vincent, J. L., & Panahon, C. J. (2017). Best practices in utilizing the changing criterion design. Behavior Analysis in Practice, 10(1), 52–61. https://doi.org/10.1007/s40617-014-0036-x
  • Konrad, M., Fowler, C. H., Walker, A. R., Test, D. W., & Wood, W. M. (2007). Effects of self-determination interventions on the academic skills of students with learning disabilities. Learning Disability Quarterly, 30(2), 89–113. https://doi.org/10.2307/30035545
  • Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. What Works Clearinghouse. https://ies.ed.gov/ncee/wwc/Docs/ReferenceResources/wwc_scd.pdf
  • Kratochwill, T. R., & Levin, J. R. (1980). On the applicability of various data analysis procedures to the simultaneous and alternating treatment designs in behavior therapy research. Behavioral Assessment, 2(4), 353–360.
  • Lane, J. D., & Gast, D. L. (2014). Visual analysis in single case experimental design studies: Brief review and guidelines. Neuropsychological Rehabilitation, 24(3–4), 445–463. https://doi.org/10.1080/09602011.2013.815636
  • Lanovaz, M., Cardinal, P., & Francis, M. (2019). Using a visual structured criterion for the analysis of alternating-treatment designs. Behavior Modification, 43(1), 115–131. https://doi.org/10.1177/0145445517739278
  • Ledford, J. R., Barton, E. E., Severini, K. E., & Zimmerman, K. N. (2019). A primer on single-case research designs: Contemporary use and analysis. American Journal on Intellectual and Developmental Disabilities, 124(1), 35–56. https://doi.org/10.1352/1944-7558-124.1.35
  • Ledford, J. R., & Gast, D. L. (2018). Combination and other designs. In J. R. Ledford & D. L. Gast (Eds.), Single case research methodology: Applications in special education and behavioral sciences (3rd ed., pp. 335–364). Routledge.
  • Ledford, J. R., Lane, J. D., & Severini, K. E. (2018). Systematic use of visual analysis for assessing outcomes in single case design studies. Brain Impairment, 19(1), 4–17. https://doi.org/10.1017/BrImp.2017.16
  • Levin, J. R., Ferron, J. M., & Gafurov, B. S. (2018). Comparison of randomization-test procedures for single-case multiple-baseline designs. Developmental Neurorehabilitation, 21(5), 290–311. https://doi.org/10.1080/17518423.2016.1197708
  • Levin, J. R., Ferron, J. M., & Kratochwill, T. R. (2012). Nonparametric statistical tests for single-case systematic and randomized ABAB … AB and alternating treatment intervention designs: New developments, new directions. Journal of School Psychology, 50(5), 599–624. https://doi.org/10.1016/j.jsp.2012.05.001
  • Ma, H.-H. (2006). An alternative method for quantitative synthesis of single-subject researches. Behavior Modification, 30(5), 598–617. https://doi.org/10.1177/0145445504272974
  • Maggin, D. M., Briesch, A. M., Chafouleas, S. M., Ferguson, T. D., & Clark, C. (2014). A comparison of rubrics for identifying empirically supported practices with single-case research. Journal of Behavioral Education, 23(2), 287–311. https://doi.org/10.1007/s10864-013-9187-z
  • Manolov, R. (2018). Quantifying overall differences for several AB comparisons [Web-based application]. Retrieved January 9, 2020, from https://manolov.shinyapps.io/SeveralAB/
  • Manolov, R., & Onghena, P. (2018). Analyzing data from single-case alternating treatments designs. Psychological Methods, 23(3), 480–504. https://doi.org/10.1037/met0000133
  • Manolov, R., Solanas, A., & Sierra, V. (2020). Changing criterion designs: Integrating methodological and data analysis recommendations. The Journal of Experimental Education, 88(2), 335–350. https://doi.org/10.1080/00220973.2018.1553838
  • Manolov, R., & Vannest, K. (2019). A visual aid and objective rule encompassing the data features of visual analysis. Behavior Modification. Advance online publication. https://doi.org/10.1177/0145445519854323
  • McDaniel, S. C., & Bruhn, A. L. (2016). Using a changing-criterion design to evaluate the effects of check-in/check-out with goal modification. Journal of Positive Behavior Interventions, 18(4), 197–208. https://doi.org/10.1177/1098300715588263
  • McDougall, D. (2005). The range‐bound changing criterion design. Behavioral Interventions, 20(2), 129–137. https://doi.org/10.1002/bin.189
  • McDougall, D., Hawkins, J., Brady, M., & Jenkins, A. (2006). Recent innovations in the changing criterion design: Implications for research and practice in special education. The Journal of Special Education, 40(1), 2–15. https://doi.org/10.1177/00224669060400010101
  • Miller, F. G. (2011). Do functional behavioral assessments improve intervention effectiveness for students with ADHD? A single-subject meta-analysis [Doctoral dissertation]. Pennsylvania State University. https://etda.libraries.psu.edu/files/final_submissions/1936
  • Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2013). The three-level synthesis of standardized single-subject experimental data: A Monte Carlo simulation study. Multivariate Behavioral Research, 48(5), 719–748. https://doi.org/10.1080/00273171.2013.816621
  • Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2014). The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-case experimental designs research. Behavior Modification, 38(5), 665–704. https://doi.org/10.1177/0145445514535243
  • Moeyaert, M., Ugille, M., Ferron, J. M., Onghena, P., Heyvaert, M., Beretvas, S. N., & Van den Noortgate, W. (2015). Estimating intervention effects across different types of single-subject experimental designs: Empirical illustration. School Psychology Quarterly, 30(1), 50–63. https://doi.org/10.1037/spq0000068
  • Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151(4), 264–269. https://doi.org/10.7326/0003-4819-151-4-200908180-00135
  • Morgan, P. L., & Sideridis, G. D. (2006). Contrasting the effectiveness of fluency interventions for students with or at risk for learning disabilities: A multilevel random coefficient modeling meta-analysis. Learning Disabilities Research & Practice, 21(4), 191–210. https://doi.org/10.1111/j.1540-5826.2006.00218.x
  • Morgan, P. L., Sideridis, G., & Hua, Y. (2012). Initial and over-time effects of fluency interventions for students with or at risk for disabilities. The Journal of Special Education, 46(2), 94–116. https://doi.org/10.1177/0022466910398016
  • Odom, S. L., Barton, E. E., Reichow, B., Swaminathan, H., & Pustejovsky, J. E. (2018). Between-case standardized effect size analysis of single case designs: Examination of the two methods. Research in Developmental Disabilities, 79(1), 88–96. https://doi.org/10.1016/j.ridd.2018.05.009
  • Olive, M. L., & Franco, J. H. (2008). (Effect) size matters: And so does the calculation. The Behavior Analyst Today, 9(1), 5–10. https://doi.org/10.1037/h0100642
  • Onghena, P. (1992). Randomization tests for extensions and variations of ABAB single-case experimental designs: A rejoinder. Behavioral Assessment, 14(2), 153–171.
  • Onghena, P., & Edgington, E. S. (1994). Randomization tests for restricted alternating treatments designs. Behaviour Research and Therapy, 32(7), 783–786. https://doi.org/10.1016/0005-7967(94)90036-1
  • Onghena, P., & Edgington, E. S. (2005). Customization of pain treatments: Single-case design and analysis. Clinical Journal of Pain, 21(1), 56–68. https://doi.org/10.1097/00002508-200501000-00007
  • Onghena, P., Michiels, B., Jamshidi, L., Moeyaert, M., & Van den Noortgate, W. (2018). One by one: Accumulating evidence by using meta-analytical procedures for single-case experiments. Brain Impairment, 19(1), 33–58. https://doi.org/10.1017/BrImp.2017.25
  • Onghena, P., Tanious, R., De, T. K., & Michiels, B. (2019). Randomization tests for changing criterion designs. Behaviour Research and Therapy, 117, 18–27. https://doi.org/10.1016/j.brat.2019.01.005
  • Owens, C. M., & Ferron, J. M. (2012). Synthesizing single-case studies: A Monte Carlo examination of a three-level meta-analytic model. Behavior Research Methods, 44(3), 795–805. https://doi.org/10.3758/s13428-011-0180-y
  • Parker, R. I., Cryer, J., & Byrns, G. (2006). Controlling baseline trend in single-case research. School Psychology Quarterly, 21(4), 418–443. https://doi.org/10.1037/h0084131
  • Parker, R. I., Hagan-Burke, S., & Vannest, K. J. (2007). Percentage of all non-overlapping data: An alternative to PND. The Journal of Special Education, 40(4), 194–204. https://doi.org/10.1177/00224669070400040101
  • Parker, R. I., & Vannest, K. J. (2009). An improved effect size for single-case research: Nonoverlap of all pairs. Behavior Therapy, 40(4), 357–367. https://doi.org/10.1016/j.beth.2008.10.006
  • Parker, R. I., Vannest, K. J., & Brown, L. (2009). The improvement rate difference for single-case research. Exceptional Children, 75(2), 135–150. https://doi.org/10.1177/001440290907500201
  • Parker, R. I., Vannest, K. J., Davis, J. L., & Sauber, S. B. (2011). Combining nonoverlap and trend for single-case research: Tau-U. Behavior Therapy, 42(2), 284–299. https://doi.org/10.1016/j.beth.2010.08.006
  • Preston, D., & Carter, M. (2009). A review of the efficacy of the picture exchange communication system intervention. Journal of Autism and Developmental Disorders, 39(10), 1471–1486. https://doi.org/10.1007/s10803-009-0763-y
  • Pustejovsky, J. E. (2018). Using response ratios for meta-analyzing single-case designs with behavioral outcomes. Journal of School Psychology, 68, 99–112. https://doi.org/10.1016/j.jsp.2018.02.003
  • Pustejovsky, J. E., & Ferron, J. M. (2017). Research synthesis and meta-analysis of single-case designs. In J. M. Kauffman, D. P. Hallahan, & P. C. Pullen (Eds.), Handbook of special education (2nd ed., pp. 168–186). Routledge.
  • Pustejovsky, J. E., Swan, D. M., & English, K. W. (2019). An examination of measurement procedures and characteristics of baseline outcome data in single-case research. Behavior Modification. Advance online publication. https://doi.org/10.1177/0145445519864264
  • Reichow, B., Barton, E. E., & Maggin, D. M. (2018). Development and applications of the single-case design risk of bias tool for evaluating single-case design research study reports. Research in Developmental Disabilities, 79(1), 53–64. https://doi.org/10.1016/j.ridd.2018.05.008
  • Riley-Tillman, T. C., Burns, M. K., & Kilgus, S. P. (2020). Evaluating educational interventions: Single-case design for measuring response to intervention (2nd ed.). The Guilford Press.
  • Rogers, L. A., & Graham, S. (2008). A meta-analysis of single subject design writing intervention research. Journal of Educational Psychology, 100(4), 879–906. https://doi.org/10.1037/0022-0663.100.4.879
  • Romeiser-Logan, L., Hickman, R. R., Harris, S. R., & Heriza, C. B. (2008). Single-subject research design: Recommendations for levels of evidence and quality rating. Developmental Medicine & Child Neurology, 50(2), 99–103. https://doi.org/10.1111/j.1469-8749.2007.02005.x
  • Rosenthal, R. (1978). Combining results of independent studies. Psychological Bulletin, 85(1), 185–193. https://doi.org/10.1037/0033-2909.85.1.185
  • Roth, M. E., Gillis, J. M., & DiGennaro Reed, F. D. (2014). A meta-analysis of behavioral interventions for adolescents and adults with autism spectrum disorders. Journal of Behavioral Education, 23(2), 258–286. https://doi.org/10.1007/s10864-013-9189-x
  • Schlosser, R. W., Belfiore, P. J., Sigafoos, J., Briesch, A. M., & Wendt, O. (2018). Appraisal of comparative single-case experimental designs for instructional interventions with non-reversible target behaviors: Introducing the CSCEDARS (“cedars”). Research in Developmental Disabilities, 79, 33–52. https://doi.org/10.1016/j.ridd.2018.04.028
  • Schlosser, R. W., Lee, D. L., & Wendt, O. (2008). Application of the percentage of non-overlapping data (PND) in systematic reviews and meta-analyses: A systematic review of reporting characteristics. Evidence-Based Communication Assessment and Intervention, 2(3), 163–187. https://doi.org/10.1080/17489530802505412
  • Scotti, J. R., Evans, I. M., Meyer, L. H., & Walker, P. (1991). A meta-analysis of intervention research with problem behavior: Treatment validity and standards of practice. American Journal on Mental Retardation, 96(3), 233–256.
  • Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987). The quantitative synthesis of single-subject research: Methodology and validation. Remedial and Special Education, 8(2), 24–33. https://doi.org/10.1177/074193258700800206
  • Scruggs, T. E., Mastropieri, M. A., Forness, S. R., & Kavale, K. A. (1988). Early language intervention: A quantitative synthesis of single-subject research. The Journal of Special Education, 22(3), 259–283. https://doi.org/10.1177/002246698802200301
  • Shadish, W. R., Hedges, L. V., & Pustejovsky, J. E. (2014). Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: A primer and applications. Journal of School Psychology, 52(2), 123–147. https://doi.org/10.1016/j.jsp.2013.11.005
  • Shadish, W. R., Kyse, E. N., & Rindskopf, D. M. (2013). Analyzing data from single-case designs using multilevel models: New applications and some agenda items for future research. Psychological Methods, 18(3), 385–405. https://doi.org/10.1037/a0032964
  • Shadish, W. R., & Sullivan, K. J. (2011). Characteristics of single-case designs used to assess intervention effects in 2008. Behavior Research Methods, 43(4), 971–980. https://doi.org/10.3758/s13428-011-0111-y
  • Shadish, W. R., Zelinsky, N. A. M., Vevea, J. L., & Kratochwill, T. R. (2016). A survey of publication practices of single-case design researchers when treatments have small or large effects. Journal of Applied Behavior Analysis, 49(3), 656–673. https://doi.org/10.1002/jaba.308
  • Sharp, W. G., Jaquess, D. L., Morton, J. F., & Herzinger, C. V. (2010). Pediatric feeding disorders: A quantitative synthesis of treatment outcomes. Clinical Child and Family Psychology Review, 13(4), 348–365. https://doi.org/10.1007/s10567-010-0079-7
  • Shea, B. J., Grimshaw, J. M., Wells, G. A., Boers, M., Andersson, N., Hamel, C., Porter, A. C., Tugwell, P., Moher, D., & Bouter, L. M. (2007). Development of AMSTAR: A measurement tool to assess the methodological quality of systematic reviews. BMC Medical Research Methodology, 7(1), 10. https://doi.org/10.1186/1471-2288-7-10
  • Shepley, C., Ault, M. J., Ortiz, K., Vogler, J. C., & McGee, M. (2020). An exploratory analysis of quality indicators in adapted alternating treatments designs. Topics in Early Childhood Special Education, 39(4), 226–237. https://doi.org/10.1177/0271121418820429
  • Smith, J. D. (2012). Single-case experimental designs: A systematic review of published research and current standards. Psychological Methods, 17(4), 510–550. https://doi.org/10.1037/a0029312
  • Strube, M. J., & Miller, R. H. (1986). Comparison of power rates for combined probability procedures: A simulation study. Psychological Bulletin, 99(3), 407–415. https://doi.org/10.1037/0033-2909.99.3.407
  • Tanious, R., & Onghena, P. (2021). A systematic review of applied single-case research published between 2016 and 2018: Study designs, randomization, data aspects, and data analysis. Behavior Research Methods, 53(4), 1371–1384. https://doi.org/10.3758/s13428-020-01502-4
  • Tate, R. L., Perdices, M., Rosenkoetter, U., Wakim, D., Godbee, K., Togher, L., & McDonald, S. (2013). Revision of a method quality rating scale for single-case experimental designs and n-of-1 trials: The 15-item risk of bias in N-of-1 trials (RoBiNT) scale. Neuropsychological Rehabilitation, 23(5), 619–638. https://doi.org/10.1080/09602011.2013.824383
  • Ugille, M., Moeyaert, M., Beretvas, S. N., Ferron, J., & Van den Noortgate, W. (2012). Multilevel meta-analysis of single-subject experimental designs: A simulation study. Behavior Research Methods, 44(4), 1244–1254. https://doi.org/10.3758/s13428-012-0213-1
  • Ugille, M., Moeyaert, M., Beretvas, S. N., Ferron, J., & Van den Noortgate, W. (2014). Bias corrections for standardized effect size estimates used with single-subject experimental designs. The Journal of Experimental Education, 82(3), 358–374. https://doi.org/10.1080/00220973.2013.813366
  • Van den Noortgate, W., & Onghena, P. (2003). Hierarchical linear models for the quantitative integration of effect sizes in single-case research. Behavior Research Methods, Instruments, & Computers, 35(1), 1–10. https://doi.org/10.3758/BF03195492
  • Van den Noortgate, W., & Onghena, P. (2008). A multilevel meta-analysis of single-subject experimental design studies. Evidence-Based Communication Assessment and Intervention, 2(3), 142–151. https://doi.org/10.1080/17489530802505362
  • Walker, V. L., & Snell, M. E. (2013). Effects of augmentative and alternative communication on challenging behavior: A meta-analysis. Augmentative and Alternative Communication, 29(2), 117–131. https://doi.org/10.3109/07434618.2013.785020
  • Wery, J. J. (2013). Interventions employed with students with emotional disturbance: A meta-analysis [Doctoral dissertation]. North Carolina State University. https://repository.lib.ncsu.edu/bitstream/handle/1840.16/8292/etd.pdf
  • What Works Clearinghouse. (2020). What Works Clearinghouse standards handbook, Version 4.1. U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
  • Wolery, M., Gast, D. L., & Hammond, D. (2010). Comparative intervention designs. In D. L. Gast (Ed.), Single subject research methodology in behavioral sciences (pp. 329–381). Routledge.
  • Wolery, M., Gast, D. L., & Ledford, J. R. (2018). Comparative designs. In J. R. Ledford & D. L. Gast (Eds.), Single case research methodology: Applications in special education and behavioral sciences (3rd ed., pp. 283–334). Routledge.
  • Zhang, J., & Wheeler, J. J. (2011). A meta-analysis of peer-mediated interventions for young children with autism spectrum disorders. Education and Training in Autism and Developmental Disabilities, 46(1), 62–77.
