Research Article

Impact of within-case variability on Tau-U indices and the hierarchical linear modeling approach for multiple-baseline design data: A Monte Carlo simulation study


REFERENCES

  • Baek, E. K., & Ferron, J. M. (2013). Multilevel models for multiple-baseline data: Modeling across-participant variation in autocorrelation and residual variance. Behavior Research Methods, 45(1), 65–74. https://doi.org/10.3758/s13428-012-0231-z
  • Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1(1), 91–97. https://doi.org/10.1901/jaba.1968.1-91
  • Barlow, D. H., Nock, M. K., & Hersen, M. (2009). Single-case experimental designs: Strategies for studying behavior change. Allyn and Bacon.
  • Brossart, D. F., Laird, V. C., & Armstrong, T. W. (2018). Interpreting Kendall’s Tau and Tau-U for single-case experimental designs. Cogent Psychology, 5(1), 1–26. https://doi.org/10.1080/23311908.2018.1518687
  • Busk, P. L., & Marascuilo, L. A. (1988). Autocorrelation in single-subject research: A counterargument to the myth of no autocorrelation. Behavioral Assessment, 10(3), 229–242. https://psycnet.apa.org/record/1989-21152-001
  • Center, B. A., Skiba, R. J., & Casey, A. (1985/1986). A methodology for the quantitative synthesis of intra-subject design research. The Journal of Special Education, 19(4), 387–400. https://doi.org/10.1177/002246698501900404
  • Declercq, L., Jamshidi, L., Fernández-Castilla, B., Beretvas, S., Moeyaert, M., Ferron, J., & Van den Noortgate, W. (2019). Analysis of single-case experimental count data using the linear mixed effects model: A simulation study. Behavior Research Methods, 51(6), 2477–2497. https://doi.org/10.3758/s13428-018-1091-y
  • Ferron, J. (2002). Reconsidering the use of the general linear model with single-case data. Behavior Research Methods, Instruments, & Computers, 34(3), 324–331. https://doi.org/10.3758/BF03195459
  • Ferron, J., & Scott, H. (2005). Multiple baseline designs. In B. S. Everitt & D. C. Howell (Eds.), Encyclopedia of statistics in behavioral science (Vol. 3, pp. 1306–1309). John Wiley & Sons.
  • Ferron, J. M., Bell, B. A., Hess, M. F., Rendina-Gobioff, G., & Hibbard, S. T. (2009). Making treatment effect inferences from multiple-baseline data: The utility of multilevel modeling approaches. Behavior Research Methods, 41(2), 372–384.
  • Ferron, J. M., Joo, S. H., & Levin, J. R. (2017). A Monte Carlo evaluation of masked visual analysis in response-guided versus fixed-criteria multiple-baseline designs. Journal of Applied Behavior Analysis, 50(4), 701–716. https://doi.org/10.1002/jaba.410
  • Ferron, J. M., Moeyaert, M., Van den Noortgate, W., & Beretvas, S. N. (2014). Estimating causal effects from multiple-baseline studies: Implications for design and analysis. Psychological Methods, 19(4), 493–510. https://doi.org/10.1037/a0037038
  • Fingerhut, J., Xu, X., & Moeyaert, M. (2021). Development and application of a decision flowchart. Evidence-Based Communication Assessment and Intervention.
  • Gage, N. A., Grasley-Boy, N. M., & MacSuga-Gage, A. S. (2018). Professional development to increase teacher behavior-specific praise: A single-case design replication. Psychology in the Schools, 55(3), 264–277. https://doi.org/10.1002/pits.22106
  • Gast, D. L., & Ledford, J. R. (2014). Single case research methodology: Applications in special education and behavioral sciences. Routledge.
  • Grissom, R. J., & Kim, J. J. (2001). Review of assumptions and problems in the appropriate conceptualization of effect size. Psychological Methods, 6(2), 135–146. https://doi.org/10.1037/1082-989X.6.2.135
  • Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2012). A standardized mean difference effect size for single case designs. Research Synthesis Methods, 3(3), 224–239. https://doi.org/10.1002/jrsm.1052
  • Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2013). A standardized mean difference effect size for multiple-baseline designs across individuals. Research Synthesis Methods, 4(4), 324–341. https://doi.org/10.1002/jrsm.1086
  • Heyvaert, M., Moeyaert, M., Verkempynck, P., Van den Noortgate, W., Vervloet, M., Ugille, M., & Onghena, P. (2017). Testing the intervention effect in single-case experiments: A Monte Carlo simulation study. Journal of Experimental Education, 85(2), 175–196. https://doi.org/10.1080/00220973.2015.1123667
  • Hong, E. R., Gong, L. Y., Ninci, J., Morin, K., Davis, J. L., Kawaminami, S., Shi, Y., & Noro, F. (2017). A meta-analysis of single-case research on the use of tablet-mediated interventions for persons with ASD. Research in Developmental Disabilities, 70, 198–214. https://doi.org/10.1016/j.ridd.2017.09.013
  • Huitema, B. E. (1985). Autocorrelation in applied behavior analysis: A myth. Behavioral Assessment, 7(2), 107–118. https://psycnet.apa.org/record/1986-08129-001
  • Jamshidi, L., Heyvaert, M., Declercq, L., Fernández-Castilla, B., Ferron, J., Moeyaert, M., Beretvas, S. N., Onghena, P., & Van den Noortgate, W. (2021). A systematic review of single-case experimental design meta-analyses: Characteristics of study designs, data, and analyses. Evidence-Based Communication Assessment and Intervention.
  • Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings (2nd ed.). Oxford University Press.
  • Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. Annals of Mathematical Statistics, 22(1), 79–86. https://doi.org/10.1214/aoms/1177729694
  • Kuznetsova, A., Brockhoff, P., & Christensen, R. H. B. (2017). lmerTest package: Tests in linear mixed effects models (Version 3.1-1) [R package]. https://cran.r-project.org/web/packages/lmerTest/index.html
  • Lambert, M. C., Cartledge, G., Heward, W. L., & Lo, Y. (2006). Effects of response cards on disruptive behavior and academic responding during math lessons by fourth-grade urban students. Journal of Positive Behavior Interventions, 8(2), 88–99. https://doi.org/10.1177/10983007060080020701
  • Ledford, J., & Gast, D. L. (2018). Combination and other designs. In J. R. Ledford & D. L. Gast (Eds.), Single case research methodology (3rd ed., pp. 239–281). Routledge.
  • Lee, J. B., & Cherney, L. R. (2018). Tau: A quantitative approach for analysis of single-case experimental data in aphasia. American Journal of Speech-Language Pathology, 27(1S), 495–503. https://doi.org/10.1044/2017_AJSLP-16-0197
  • Maggin, D. M., Swaminathan, H., Rogers, H. J., O’Keeffe, B. V., Sugai, G., & Horner, R. H. (2011). A generalized least squares regression approach for computing effect sizes in single-case research: Application examples. Journal of School Psychology, 49(3), 301–321. https://doi.org/10.1016/j.jsp.2011.03.004
  • Manolov, R., & Moeyaert, M. (2017). Recommendations for choosing single-case data analytical techniques. Behavior Therapy, 48(1), 97–114. https://doi.org/10.1016/j.beth.2016.04.008
  • McKinney, T., & Vasquez, E., III. (2014). There’s a bug in your ear!: Using technology to increase the accuracy of DTT implementation. Education and Training in Autism and Developmental Disabilities, 49(4), 594–601. https://www.jstor.org/stable/24582354
  • McKnight, S. D., McKean, J. W., & Huitema, B. E. (2000). A double bootstrap method to analyze linear models with autoregressive error terms. Psychological Methods, 5(1), 87–101. https://doi.org/10.1037/1082-989X.5.1.87
  • Moeyaert, M., Akhmedjanova, D., Ferron, J., Beretvas, S., & Van den Noortgate, W. (2020). Effect size estimation for combined single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 14(1–2), 1–24.
  • Moeyaert, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2014). From a single-level analysis to a multilevel analysis of single-case experimental designs. Journal of School Psychology, 52(2), 191–211. https://doi.org/10.1016/j.jsp.2013.11.003
  • Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2013a). Modeling external events in the three-level analysis of multiple-baseline across-participants designs: A simulation study. Behavior Research Methods, 45(2), 547–559. https://doi.org/10.3758/s13428-012-0274-1
  • Moeyaert, M., Ugille, M., Ferron, J. M., Beretvas, S. N., & Van den Noortgate, W. (2013b). The three-level synthesis of standardized single-subject experimental data: A Monte Carlo simulation study. Multivariate Behavioral Research, 48(5), 719–748. https://doi.org/10.1080/00273171.2013.816621
  • Parker, R. I., & Vannest, K. J. (2009). An improved effect size for single case research: Nonoverlap of all pairs (NAP). Behavior Therapy, 40(4), 357–367. https://doi.org/10.1016/j.beth.2008.10.006
  • Parker, R. I., Vannest, K. J., & Brown, L. (2009). The improvement rate difference for single-case research. Exceptional Children, 75(2), 135–150. https://doi.org/10.1177/001440290907500201
  • Parker, R. I., Vannest, K. J., Davis, J. L., & Sauber, S. B. (2011). Combining nonoverlap and trend for single-case research: Tau. Behavior Therapy, 42(2), 284–299. https://doi.org/10.1016/j.beth.2010.08.006
  • Pustejovsky, J. E. (2015). Measurement-comparable effect sizes for single-case studies of free-operant behavior. Psychological Methods, 20(3), 342–359. https://doi.org/10.1037/met0000019
  • Pustejovsky, J. E. (2019). Procedural sensitivities of effect sizes for single-case designs with directly observed behavioral outcome measures. Psychological Methods, 24(2), 217–235. https://doi.org/10.1037/met0000179
  • Pustejovsky, J. E., Hedges, L. V., & Shadish, W. R. (2014). Design-comparable effect sizes in multiple-baseline designs: A general modeling framework. Journal of Educational and Behavioral Statistics, 39(5), 368–393. https://doi.org/10.3102/1076998614547577
  • Pustejovsky, J. E., & Swan, D. M. (2017). SingleCaseES: A calculator for single-case effect size indices (R package version 0.3) [Computer software manual]. https://github.com/jepusto/SingleCaseES
  • Richardson, J. (2011). Eta squared and partial eta squared as measures of effect size in educational research. Educational Research Review, 6(2), 135–147. https://doi.org/10.1016/j.edurev.2010.12.001
  • Rohatgi, A. (2016). WebPlotDigitizer (Version 3.10) [Computer software]. http://arohatgi.info/WebPlotDigitizer
  • RStudio Team. (2015). RStudio: Integrated development for R (Version 0.98.1074) [Computer software]. RStudio, Inc. http://www.rstudio.com/
  • Schlosser, R. W., Lee, D. L., & Wendt, O. (2008). Application of the percentage of non-overlapping data in systematic reviews and meta-analyses: A systematic review of reporting characteristics. Evidence-Based Communication Assessment and Intervention, 2(3), 163–187. https://doi.org/10.1080/1748953080250412
  • Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987). The quantitative synthesis of single subject research: Methodology and validation. Remedial and Special Education, 8(2), 24–33. https://doi.org/10.1177/074193258700800206
  • Shadish, W. R., Rindskopf, D. M., & Hedges, L. V. (2008). The state of the science in the meta-analysis of single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 2(3), 188–196. https://doi.org/10.1080/17489530802581603
  • Shadish, W. R., Rindskopf, D. M., Hedges, L. V., & Sullivan, K. J. (2013). Bayesian estimates of autocorrelations in single-case designs. Behavior Research Methods, 45(3), 813–821. https://doi.org/10.3758/s13428-012-0282-1
  • Shadish, W. R., & Sullivan, K. J. (2011). Characteristics of single-case designs used to assess intervention effects in 2008. Behavior Research Methods, 43(4), 971–980. https://doi.org/10.3758/s13428-011-0111-y
  • Smith, J. D. (2012). Single-case experimental designs: A systematic review of published research and current standards. Psychological Methods, 17(4), 510–550. https://doi.org/10.1037/a0029312
  • Solomon, B. G. (2014). Violations of assumptions in school-based single-case data: Implications for the selection and interpretation of effect sizes. Behavior Modification, 38(4), 477–496. https://doi.org/10.1177/0145445513510931
  • Tarlow, K. R. (2017a). An improved rank correlation effect size statistic for single-case designs: Baseline corrected Tau. Behavior Modification, 41(4), 427–467. https://doi.org/10.1177/0145445516676750
  • Tarlow, K. R. (2017b). Baseline corrected Tau for single-case research (R code). http://ktarlow.com/stats
  • Van den Noortgate, W., & Onghena, P. (2003a). Hierarchical linear models for the quantitative integration of effect sizes in single case research. Behavior Research Methods, Instruments & Computers, 35(1), 1–10. https://doi.org/10.3758/BF03195492
  • Van den Noortgate, W., & Onghena, P. (2003b). Multilevel meta-analysis: A comparison with traditional meta-analytical procedures. Educational and Psychological Measurement, 63(5), 765–790. https://doi.org/10.1177/0013164403251027
  • Van den Noortgate, W., & Onghena, P. (2007). The aggregation of single-case results using hierarchical linear models. The Behavior Analyst Today, 8(2), 196–209. https://doi.org/10.1037/h0100613
  • Van den Noortgate, W., & Onghena, P. (2008). A multilevel meta-analysis of single-subject experimental design studies. Evidence-Based Communication Assessment and Intervention, 2(3), 142–158. https://doi.org/10.1080/17489530802505362
  • Vannest, K. J., & Ninci, J. (2015). Evaluating intervention effects in single-case research designs. Journal of Counseling and Development, 93(4), 403–411. https://doi.org/10.1002/jcad.12038
  • Vannest, K. J., Peltier, C., & Haas, A. (2018). Results reporting in single case experiments and single case meta-analysis. Research in Developmental Disabilities, 79, 10–18. https://doi.org/10.1016/j.ridd.2018.04.029
  • Vannest, K. J., & Sallese, M. R. (2021). Benchmarking effect sizes in single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 1–24. https://doi.org/10.1080/17489539.2021.1886412
  • What Works Clearinghouse. (2020). What Works Clearinghouse Standards Handbook, Version 4.1. U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. https://ies.ed.gov/ncee/wwc/handbooks
