
A systematic comparison of non-overlap metrics and visual analysis in single-case experimental design


References

  • Beretvas, S. N., & Chung, H. (2008). A review of meta-analyses of single-subject experimental designs: Methodological issues and practice. Evidence-Based Communication Assessment and Intervention, 2(3), 129–141. https://doi.org/10.1080/17489530802446302
  • Bobrovitz, C. D., & Ottenbacher, K. J. (1998). Comparison of visual inspection and statistical analysis of single-subject data in rehabilitation research. American Journal of Physical Medicine & Rehabilitation, 77(2), 94–102. https://doi.org/10.1097/00002060-199803000-00002
  • Brossart, D. F., Parker, R. I., Olson, E. A., & Mahadevan, L. (2006). The relationship between visual analysis and five statistical analyses in a simple AB single-case research design. Behavior Modification, 30(5), 531–563. https://doi.org/10.1177/0145445503261167
  • Brossart, D. F., Vannest, K. J., Davis, J. L., & Patience, M. A. (2014). Incorporating nonoverlap metrics with visual analysis for quantifying intervention effectiveness in single-case experimental designs. Neuropsychological Rehabilitation, 24(3/4), 464–491. https://doi.org/10.1080/09602011.2013.868361
  • Busk, P. L., & Serlin, R. C. (1992). Meta-analysis for single-case research. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case research design and analysis: New directions for psychology and education (pp. 187–212). Lawrence Erlbaum Associates, Inc.
  • Carlin, M. T., & Costello, M. S. (2018). Development of a distance-based effect size metric for single-case research: Ratio of distances. Behavior Therapy, 49(6), 981–994. https://doi.org/10.1016/j.beth.2018.02.005
  • Chen, M., Hyppa-Martin, J., Reichle, J. E., & Symons, F. J. (2016). Comparing single case design overlap-based effect size metrics from studies examining speech generating device interventions. American Journal on Intellectual and Developmental Disabilities, 121(3), 169–193. https://doi.org/10.1352/1944-7558-121.3.169
  • Christ, T. J., Nelson, P. M., Van Norman, E. R., Chafouleas, S. M., & Riley-Tillman, T. C. (2014). Direct behavior rating: An evaluation of time-series interpretations as consequential validity. School Psychology Quarterly, 29(2), 157–170. https://doi.org/10.1037/spq0000029
  • Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37–46. https://doi.org/10.1177/001316446002000104
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Lawrence Erlbaum Associates.
  • Cohen, R. J., & Swerdlik, M. E. (2005). Psychological testing and assessment: An introduction to tests and measurement. McGraw-Hill.
  • Danov, S. E., & Symons, F. J. (2008). A survey evaluation of the reliability of visual inspection and functional analysis graphs. Behavior Modification, 32(6), 828–839. https://doi.org/10.1177/0145445508318606
  • Dart, E. H., & Radley, K. C. (2017). The impact of ordinate scaling on the visual analysis of single-case data. Journal of School Psychology, 63, 105–118. https://doi.org/10.1016/j.jsp.2017.03.008
  • Ferron, J., Goldstein, H., Olszewski, A., & Rohrer, L. (2020). Indexing effects in single-case experimental designs by estimating the percent of goal obtained. Evidence-Based Communication Assessment and Intervention, 14(1–2), 6–27. https://doi.org/10.1080/17489539.2020.1732024
  • Fingerhut, J., Xu, X., & Moeyaert, M. (2021). Impact of within-case variability on Tau-U indices and the hierarchical linear modeling approach for multiple-baseline design data: A Monte Carlo simulation study. Evidence-Based Communication Assessment and Intervention, 15(3), 115–141. https://doi.org/10.1080/17489539.2021.1933727
  • Fisher, W. W., Kelley, M. E., & Lomas, J. E. (2003). Visual aids and structured criteria for improving visual inspection and interpretation of single‐case designs. Journal of Applied Behavior Analysis, 36(3), 387–406. https://doi.org/10.1901/jaba.2003.36-387
  • Franklin, R. D., Allison, D. B., & Gorman, B. S. (Eds.). (2014). Design and analysis of single-case research. Erlbaum.
  • Gingerich, W. J. (1984). Meta-analysis of applied time-series data. The Journal of Applied Behavioral Science, 20(1), 71–79. https://doi.org/10.1177/002188638402000113
  • Hartmann, P., Straetmans, S., & Vries, C. G. D. (2004). Asset market linkages in crisis periods. The Review of Economics and Statistics, 86(1), 313–326. https://doi.org/10.1162/003465304323023831
  • Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2012). A standardized mean difference effect size for single case designs. Research Synthesis Methods, 3(3), 224–239. https://doi.org/10.1002/jrsm.1052
  • Hedges, L. V., Pustejovsky, J. E., & Shadish, W. R. (2013). A standardized mean difference effect size for multiple baseline designs across individuals. Research Synthesis Methods, 4(4), 324–341. https://doi.org/10.1002/jrsm.1086
  • Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single-subject research to identify evidence-based practice in special education. Exceptional Children, 71(2), 165–179. https://doi.org/10.1177/001440290507100203
  • James, I. A., Smith, P. S., & Milne, D. (1996). Teaching visual analysis of time series data. Behavioural and Cognitive Psychotherapy, 24(3), 247–261. https://doi.org/10.1017/S1352465800015101
  • Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings. Oxford University Press.
  • Kent-Walsh, J., Murza, K. A., Malani, M. D., & Binger, C. (2015). Effects of communication partner instruction on the communication of individuals using AAC: A meta-analysis. Augmentative and Alternative Communication, 31(4), 271–284. https://doi.org/10.3109/07434618.2015.1052153
  • Kratochwill, T. R., Hitchcock, J. H., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34(1), 26–38. https://doi.org/10.1177/0741932512452794
  • Kratochwill, T. R., & Levin, J. R. (Eds.). (2014). Single-case intervention research: Methodological and statistical advances. American Psychological Association. https://doi.org/10.1037/14376-000
  • Ledford, J. R., & Gast, D. L. (2018). Single subject research methodology: Applications in special education and behavioral sciences. Routledge.
  • Levin, J. R., Kratochwill, T. R., & Ferron, J. M. (2019). Randomization procedures in single‐case intervention research contexts: (Some of) “the rest of the story”. Journal of the Experimental Analysis of Behavior, 112(3), 334–348. https://doi.org/10.1002/jeab.558
  • Lieberman, R. G., Yoder, P. J., Reichow, B., & Wolery, M. (2010). Visual analysis of multiple baseline across participants graphs when change is delayed. School Psychology Quarterly, 25(1), 28–44. https://doi.org/10.1037/a0018600
  • Lusted, L. B. (1971). Decision-making studies in patient management. The New England Journal of Medicine, 284(8), 416–424. https://doi.org/10.1056/NEJM197102252840805
  • Machalicek, W., & Horner, R. H. (2018). Special issue on advances in single-case research design and analysis. Developmental Neurorehabilitation, 21(4), 209–211. https://doi.org/10.1080/17518423.2018.1468600
  • Machalicek, W., O’Reilly, M. F., Beretvas, N., Sigafoos, J., Lancioni, G., Sorrells, A., Rispoli, M. J., & Rispoli, M. (2008). A review of school-based instructional interventions for students with autism spectrum disorders. Research in Autism Spectrum Disorders, 2(3), 395–416. https://doi.org/10.1016/j.rasd.2007.07.001
  • Maggin, D. M., O’Keeffe, B. V., & Johnson, A. H. (2011). A quantitative synthesis of methodology in the meta-analysis of single-subject research for students with disabilities: 1985–2009. Exceptionality: The Official Journal of the Division for Research of the Council for Exceptional Children, 19(2), 109–135. https://doi.org/10.1080/09362835.2011.565725
  • Manolov, R., Jamieson, M., Evans, J. J., & Sierra, V. (2015). Probability and visual aids for assessing intervention effectiveness in single-case designs: A field test. Behavior Modification, 39(5), 691–720. https://doi.org/10.1177/0145445515593512
  • Manolov, R., & Solanas, A. (2009). Percentage of non-overlapping corrected data. Behavior Research Methods, 41(4), 1262–1271. https://doi.org/10.3758/brm.41.4.1262
  • Mason, R. A., Ganz, J. B., Parker, R. I., Burke, M. D., & Camargo, S. P. (2012). Moderating factors of video-modeling with other as model: A meta-analysis of single-case studies. Research in Developmental Disabilities, 33(4), 1076–1086. https://doi.org/10.1016/j.ridd.2012.01.016
  • Morales, M., Domínguez, M. L., & Jurado, T. (2001). The influence of graphic techniques in the evaluation of the effectiveness of treatment in time-series design. Quality & Quantity, 35(3), 277–290. https://doi.org/10.1023/A:1010393831820
  • Muller, M. P., Tomlinson, G., Marrie, T. J., Tang, P., McGeer, A., Low, D. E., Detsky, A. S., & Gold, W. L. (2005). Can routine laboratory tests discriminate between severe acute respiratory syndrome and other causes of community-acquired pneumonia? Clinical Infectious Diseases, 40(8), 1079–1086. https://doi.org/10.1086/428577
  • National Professional Development Center on Autism Spectrum Disorders. (2009). Evidence-based practices for children and youth with ASD. Retrieved June 14, 2011, from http://autismpdc.fpg.unc.edu/sites/autismpdc.fpg.unc.edu/files/EBP_Update_Reviewer_Training_printversion.pdf
  • Nelson, P. M., Van Norman, E. R., & Christ, T. J. (2017). Visual analysis among novices: Training and trend lines as graphic aids. Contemporary School Psychology, 21(2), 93–102. https://doi.org/10.1007/s40688-016-0107-9
  • Ninci, J., Vannest, K. J., Willson, V., & Zhang, N. (2015). Interrater agreement between visual analysts of single-case data: A meta-analysis. Behavior Modification, 39(4), 510–541. https://doi.org/10.1177/0145445515581327
  • Normand, M. P., & Bailey, J. S. (2006). The effects of celeration lines on visual data analysis. Behavior Modification, 30(3), 295–314. https://doi.org/10.1177/0145445503262406
  • Ottenbacher, K. (1986). Reliability and accuracy of visually analyzing graphed data from single-subject designs. The American Journal of Occupational Therapy, 40(7), 464–469. https://doi.org/10.5014/ajot.40.7.464
  • Parker, R. I., & Hagan-Burke, S. (2007). Median-based overlap analysis for single case data. Behavior Modification, 31(6), 919–936. https://doi.org/10.1177/0145445507303452
  • Parker, R. I., & Vannest, K. (2009). An improved effect size for single-case research: Non-overlap of all pairs. Behavior Therapy, 40(4), 357–367. https://doi.org/10.1016/j.beth.2008.10.006
  • Parker, R. I., & Vannest, K. J. (2012). Bottom-up analysis of single-case research designs. Journal of Behavioral Education, 21(3), 254–265. https://doi.org/10.1007/s10864-012-9153-1
  • Parker, R. I., Vannest, K. J., & Brown, L. (2009). The improvement rate difference for single-case research. Exceptional Children, 75(2), 135–150. https://doi.org/10.1177/001440290907500201
  • Parker, R. I., Vannest, K. J., & Davis, J. L. (2011). Effect size in single-case research: A review of nine non-overlap techniques. Behavior Modification, 35(4), 303–322. https://doi.org/10.1177/0145445511399147
  • Parker, R. I., Vannest, K. J., Davis, J. L., & Sauber, S. B. (2011). Combining non-overlap and trend for single-case research: Tau-U. Behavior Therapy, 42(2), 284–299. https://doi.org/10.1016/j.beth.2010.08.006
  • Petersen-Brown, S., Karich, A. C., & Symons, F. J. (2012). Examining estimates of effect using non-overlap of all pairs in multiple baseline studies of academic intervention. Journal of Behavioral Education, 21(3), 203–216. https://doi.org/10.1007/s10864-012-9154-0
  • Pustejovsky, J. E. (2018). Using response ratios for meta-analyzing single-case designs with behavioral outcomes. Journal of School Psychology, 68, 99–112. https://doi.org/10.1016/j.jsp.2018.02.003
  • Rakap, S., Snyder, P., & Pasia, C. (2014). Comparison of non-overlap methods for identifying treatment effect in single-subject experimental research. Behavioral Disorders, 39(3), 128–145. https://doi.org/10.1177/019874291303900303
  • Ray, D. C. (2015). Single-case research design and analysis: Counseling applications. Journal of Counseling & Development, 93(4), 394–402. https://doi.org/10.1002/jcad.12037
  • Scruggs, T. E., & Mastropieri, M. A. (2013). PND at 25: Past, present, and future trends in summarizing single-subject research. Remedial and Special Education, 34(1), 9–19. https://doi.org/10.1177/0741932512440730
  • Sen, P. K. (1968). Estimates of the regression coefficient based on Kendall’s Tau. Journal of the American Statistical Association, 63(324), 1379–1389. https://doi.org/10.1080/01621459.1968.10480934
  • Silk, J. (1992). Silk Scientific [UN-SCAN-IT]. Orem, UT.
  • Spencer, V. G., Evmenova, A. S., Boon, R. T., & Hayes-Harris, L. (2014). Review of research-based interventions for students with autism spectrum disorders in content area instruction: Implications and considerations for classroom practice. Education and Training in Autism and Developmental Disabilities, 49, 331–353. https://www.jstor.org/stable/23881252
  • Swets, J. A. (1988). Measuring the accuracy of diagnostic systems. Science, 240(4857), 1285–1293. https://doi.org/10.1126/science.3287615
  • Tarlow, K. R. (2016a). An improved rank correlation effect size statistic for single-case designs: Tau. Behavior Modification, 41(4), 427–467. https://doi.org/10.1177/0145445516676750
  • Tarlow, K. R. (2016b). Tau calculator. http://www.ktarlow.com/stats/tau
  • Theil, H. (1950). A rank-invariant method of linear and polynomial regression analysis. Indagationes Mathematicae, 12(85), 173.
  • Valentine, J. C., Tanner-Smith, E. E., Pustejovsky, J. E., & Lau, T. S. (2016). Between case standardized mean difference effect sizes for single-case designs: A primer and tutorial using the scdhlm web application. The Campbell Collaboration. https://doi.org/10.4073/cmdp.2016.1
  • Vannest, K. J., & Ninci, J. (2015). Evaluating intervention effects in single‐case research designs. Journal of Counseling & Development, 93(4), 403–411. https://doi.org/10.1002/jcad.12038
  • Vannest, K. J., Parker, R. I., Gonen, O., & Adiguzel, T. (2016). Single case research: Web based calculators for SCR analysis (Version 2.0) [Web-based application]. Texas A&M University. http://www.singlecaseresearch.org
  • Vannest, K. J., & Sallese, M. R. (2021). Benchmarking effect sizes in single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 15(3), 1–24. https://doi.org/10.1080/17489539.2021.1886412
  • Wehman, P., Schall, C., Carr, S., Targett, P., West, M., & Cifu, G. (2014). Transition from school to adulthood for youth with autism spectrum disorder: What we know and what we need to know. Journal of Disability Policy Studies, 25(1), 30–40. https://doi.org/10.1177/1044207313518071
  • White, O. R., & Haring, N. G. (1980). Exceptional teaching: A multimedia training package. Merrill Publishing Company.
  • Wolery, M., Busick, M., Reichow, B., & Barton, E. E. (2010). Comparison of overlap methods for quantitatively synthesizing single-subject data. The Journal of Special Education, 44(1), 18–28. https://doi.org/10.1177/0022466908328009
  • Wolfe, K., Dickenson, T. S., Miller, B., & McGrath, K. V. (2019). Comparing visual and statistical analysis of multiple baseline design graphs. Behavior Modification, 43(3), 361–388. https://doi.org/10.1177/0145445518768723
  • Zweig, M. H., & Campbell, G. (1993). Receiver-operating characteristic (ROC) plots: A fundamental evaluation tool in clinical medicine. Clinical Chemistry, 39(4), 561–577. https://doi.org/10.1093/clinchem/39.4.561
