Editorial

Special Issue on Advances in Single-Case Research Design and Analysis

We are pleased to introduce this special issue focused on the application of single-case research designs to address the needs of individuals with neurodevelopmental disabilities and behavioural disorders. Single-case research designs incorporate experimental methodology originating from experimental and applied behaviour analysis,Citation1–Citation3 and have played a pivotal role over the last several decades in the development of evidence-based practices for individuals with and at risk for neurodevelopmental disability and behavioural disorders.Citation4–Citation6 Extensive meta-analyses and systematic literature reviews of single-case research have synthesized the effectiveness of intervention practices for children and youth with intellectual and developmental disabilities,Citation7–Citation9 and children and youth with behavioural or academic concerns,Citation10,Citation11 as well as across a wide variety of other disabilities and at-risk populations, intervention types, adaptive behaviour domains, and settings. Importantly, single-case research designs have provided researchers with a rigorous experimental alternative to nonexperimental AB case series.

Single-case research designs are experimental designs involving one or more participants, or less commonly groups of participants, which are characterized by (a) repeated measurement of observable and measurable dependent variable(s) over time, (b) inclusion of a baseline assessment phase to document the presenting issue prior to intervening, (c) active manipulation of the independent variable, (d) opportunity to assess the effects of independent variable manipulation at least three times, each at a different point in time (i.e., replication), and (e) analysis of intervention effects at the level of the individual participant using visual analysis of line graphs (i.e., analysis of level, variability, trend, overlap, immediacy of effect, consistency of effects, and consistency of response patterns across similar phases).
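As a concrete illustration of the quantities that visual analysts inspect, the sketch below computes level, trend, overlap, and immediacy for a hypothetical AB data set; the data and the particular overlap index are illustrative assumptions, not taken from any study in this issue.

```python
# Illustrative only: hypothetical AB data, not drawn from any study in this issue.
import numpy as np

baseline = np.array([7, 8, 6, 9, 8])         # problem behaviour per session (phase A)
intervention = np.array([5, 4, 3, 2, 2, 1])  # phase B

# Level: the mean (or median) of each phase.
level_a, level_b = baseline.mean(), intervention.mean()

# Trend: slope of an ordinary least-squares line fit within each phase.
trend_a = np.polyfit(np.arange(len(baseline)), baseline, 1)[0]
trend_b = np.polyfit(np.arange(len(intervention)), intervention, 1)[0]

# Overlap: here, the percentage of intervention points below the lowest
# baseline point (one simple nonoverlap index for a decreasing target).
nonoverlap = np.mean(intervention < baseline.min()) * 100

# Immediacy: change from the last baseline point to the first intervention point.
immediacy = intervention[0] - baseline[-1]

print(f"level A = {level_a:.1f}, B = {level_b:.1f}")
print(f"trend A = {trend_a:.2f}, B = {trend_b:.2f}")
print(f"% of B points below lowest A point: {nonoverlap:.0f}%")
print(f"immediacy of effect: {immediacy}")
```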

Single-case research designs differ from group experimental designs in several notable ways: (a) the unit of analysis is the individual participant rather than the difference between groups; (b) although randomization can be used in any single-case research design, it has not been applied evenly across researchers, and single-case research is traditionally a response-guided approach; (c) visual analysis of line graphs, rather than parametric statistics, is used to determine the strength of the functional relation between the independent variable and the dependent variable and, ultimately, to reject the researcher’s null hypothesis; and (d) external validity, and specifically generalizability, is built through systematic and direct replication of intervention methods across participants, studies, and researchers.Citation12

Several long-standing criticisms have arisen from these methodological and conceptual differences, including (a) the low interrater reliability of visual analysis in single-case research, particularly when conducted by novice raters,Citation13,Citation14 (b) the autocorrelation of single-case data,Citation15 and the related issues of lack of randomization and the use of a response-guided approach,Citation16 (c) the superiority of group experimental designs in minimizing threats to internal validity,Citation17 and (d) publication bias toward large treatment effects.Citation18,Citation19 Additional, unfounded criticism has come from confusion about the differences between single-case research designs and nonexperimental AB case series.

Researchers have sought to counter and directly address these criticisms by developing standardized protocols for assessing the credibility of single-case research design, evidence, and data analysis,Citation12,Citation20,Citation21 reporting guidelines for publishing single-case research findings,Citation22 and standardized visual analysis practices for determining the strength of the functional relation,Citation23,Citation24 and by validating training packages that can be used to train students and scholars in visual analysis.Citation14,Citation25,Citation26 For instance, the United States Department of Education Institute of Education Sciences (IES) White Paper defines the What Works Clearinghouse Pilot Standards for single-case design and analysis.Citation20 A growing body of literature has contributed to the development of new methods for applying randomization logic to the analysis of single-case research designs and resultsCitation27–Citation29 and for calculating effect sizes and conducting meta-analyses,Citation30 and has also opened important discourse within the applied behaviour analysis and special education fields on issues such as the role of single-case design in building evidence, publication bias towards clinically significant results, and the role of negative or null results in single-case research.Citation19,Citation31 However, relatively few publications have demonstrated the use of advanced design and analysis in the context of educational and clinical intervention research.

This special issue called for submissions describing the application of advanced methodological and statistical procedures to the design and analysis of single-subject evaluations of clinical and educational interventions for individuals with neurodevelopmental disorders. We were pleased to accept six studies that offer researchers positive exemplars of advances in the use of sophisticated intervention designs, including masked visual analysis and randomization designs, multilevel modeling, and dynamic multilevel modeling, that address some of the critical issues facing our respective fields if we are to improve already rigorous single-case research methodology.

Lloyd, Finley, and Weaver report the results of implementing a trial-based functional analysis during reading instruction to identify the consequence maintaining stereotypy for a 13-year-old male with autism and attention-deficit hyperactivity disorder, together with a further analysis of experimental functional analysis data using permutation and randomization tests, as well as masked visual analysis, in an effort to minimize Type I error. The trial-based functional analysis suggested an automatic function for stereotypy, and the results of the nonparametric statistical analyses were consistent with visual analysis.
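For readers unfamiliar with this logic, the sketch below shows a minimal start-point randomization test on a hypothetical AB series, assuming the intervention onset was randomly selected from a set of eligible sessions; the data and test statistic are illustrative assumptions and do not reproduce Lloyd, Finley, and Weaver's analyses.

```python
# Illustrative randomization test for a hypothetical AB series in which the
# intervention start point was randomly chosen from a set of eligible points.
import numpy as np

y = np.array([8, 7, 9, 8, 8, 4, 3, 3, 2, 2, 1, 2])  # hypothetical session data
actual_start = 5                # intervention began at session index 5 (0-based)
eligible_starts = range(3, 10)  # start points that could have been assigned

def mean_diff(series, start):
    """Test statistic: baseline mean minus intervention mean (a decrease is expected)."""
    return series[:start].mean() - series[start:].mean()

observed = mean_diff(y, actual_start)

# Reference distribution: the statistic under every eligible start point.
null_stats = np.array([mean_diff(y, s) for s in eligible_starts])

# One-tailed p-value: proportion of eligible assignments with an effect
# at least as large as the one observed.
p_value = np.mean(null_stats >= observed)
print(f"observed difference = {observed:.2f}, randomization p = {p_value:.3f}")
```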

Similarly, Hwang and Levin utilized a novel multiply randomized, replicated crossover-design adaptation of a single-case AB designCitation32,Citation33 to examine the comparative effects of two research-based pictorial mnemonic strategies (one targeting recall of dates, the other vocabulary) on the acquisition of these academic targets by eight middle school children receiving special education. The results of this study demonstrated that the mnemonic strategy approach produced the anticipated improvements; however, the contributions of this article go well beyond the evaluation of an evidence-based strategy, offering recommendations and future research directions for single-case researchers in the use of advanced design and analysis.
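The design logic can be illustrated schematically: the sketch below randomly assigns, for hypothetical participants, both the order of two strategies and the intervention start point. It is a loose illustration of randomized assignment in replicated AB-type designs, not Hwang and Levin's exact procedure; all names and values are assumptions.

```python
# Illustrative randomization scheme loosely modeled on a replicated crossover
# AB logic: each hypothetical participant receives both strategies, with the
# order and the intervention start point randomly assigned.
import random

random.seed(1)
participants = [f"P{i}" for i in range(1, 9)]    # eight hypothetical students
conditions = ("dates mnemonic", "vocabulary mnemonic")
eligible_starts = list(range(4, 9))              # eligible intervention start sessions

for p in participants:
    order = random.sample(conditions, k=2)       # randomize which strategy comes first
    start = random.choice(eligible_starts)       # randomize when intervention begins
    print(f"{p}: first = {order[0]}, second = {order[1]}, start point = {start}")
```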

Manolov and Solanas offer a technical paper that provides guidelines for data collection and the use and interpretation of standardized and raw average difference indices in single-case research and describes two distinct data analysis options for single-case research designs. The authors used modified mean phase difference,Citation34,Citation35 slope and level change,Citation36 and d-statisticCitation37,Citation38 alongside visual analysis to conduct a meta-analysis of three single-case studies examining the effects of behavioural intervention on eye contact.Citation39Citation41
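To make the distinction between raw and standardized average-difference indices concrete, the following sketch computes both for a hypothetical eye-contact data set; it is a simplified illustration and not an implementation of the mean phase difference, slope and level change, or d-statistic procedures used by the authors.

```python
# Illustrative raw vs. standardized average-difference indices for a
# hypothetical AB data set (not the MPD, SLC, or d-statistic procedures
# described in the article).
import numpy as np

baseline = np.array([2, 1, 2, 1, 2], dtype=float)        # eye contacts per session
intervention = np.array([4, 5, 6, 6, 7, 8], dtype=float)

# Raw average difference: expressed in the original units of measurement.
raw_diff = intervention.mean() - baseline.mean()

# A simple standardized difference: the raw difference scaled by the baseline
# standard deviation, which allows aggregation across studies with different metrics.
standardized_diff = raw_diff / baseline.std(ddof=1)

print(f"raw difference = {raw_diff:.2f} eye contacts per session")
print(f"standardized difference = {standardized_diff:.2f} baseline SDs")
```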

Chiu and Roberts present a paper showcasing the use of dynamic multilevel analysis (DMA) procedures in a re-analysis of Carr, Moore, and Anderson’sCitation42 meta-analysis of single-case intervention research for individuals with autism. DMA, which incorporates methods including multilevel and time-series analysis, addresses some of the limitations of nonoverlap indices and other single-case metrics in conducting meta-analyses. DMA can be applied to small datasets, accounts for autocorrelation, can be used to test post-hoc hypotheses about time and participant variables when detailed information about these variables is reported in publications, can assess whether data should be separated into two or more meta-analyses, and can be applied flexibly using different analytic components (e.g., breakpoint estimation and other time-series components without multilevel analysis). Like Carr and colleagues’ review, the re-analysis yielded positive results for the use of self-management interventions, but it also demonstrated additional recency effects and differences across outcomes and interventions. Specifically, interventions were more effective for task outcomes, academic skills, and daily living skills than for other outcomes, and interventions with longer sessions were more effective than interventions with shorter sessions.
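The multilevel component of such analyses can be sketched, under simplifying assumptions, as a two-level model of sessions nested within cases; the simulated data and model below illustrate only the basic random-intercept logic and none of the dynamic, time-series features of DMA.

```python
# Minimal two-level (sessions nested in cases) model of a phase effect on
# simulated data; this is an illustration of the multilevel component only,
# not the dynamic multilevel analysis (DMA) procedure described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for case in range(6):                      # six hypothetical participants
    case_effect = rng.normal(0, 0.5)       # random variation between cases
    for session in range(12):
        phase = int(session >= 6)          # 0 = baseline, 1 = intervention
        y = 5 - 2.5 * phase + case_effect + rng.normal(0, 0.8)
        rows.append({"case": case, "session": session, "phase": phase, "y": y})
data = pd.DataFrame(rows)

# Random intercept for each case; the fixed effect of phase estimates the
# average baseline-to-intervention change across participants.
model = smf.mixedlm("y ~ phase", data, groups=data["case"])
result = model.fit()
print(result.summary())
```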

Zelinsky and Shadish present a meta-analysis of choice-making interventions that uses the between-case d-statistic to evaluate the aggregate effectiveness of this intervention for individuals with challenging behaviour. This meta-analysis sought to address the limitations of the nonoverlap indices used in an earlier meta-analysis of choice makingCitation43 and to update this body of literature. The results confirmed the effectiveness of choice making as a behaviour-reduction intervention for individuals who engage in challenging behaviour, with a random-effects average of d = 1.02 (standard error = 0.168).
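For orientation, the sketch below shows how a random-effects average and its standard error are typically obtained from a set of between-case effect sizes using a DerSimonian-Laird estimator; the d values and standard errors are hypothetical and are not the studies synthesized by Zelinsky and Shadish.

```python
# Illustrative DerSimonian-Laird random-effects average of between-case d
# values; the per-study estimates below are hypothetical.
import numpy as np

d = np.array([0.8, 1.3, 0.6, 1.5, 1.0])       # hypothetical between-case d values
se = np.array([0.30, 0.25, 0.40, 0.35, 0.28]) # their standard errors

w_fixed = 1 / se**2                            # fixed-effect (inverse-variance) weights
d_fixed = np.sum(w_fixed * d) / np.sum(w_fixed)

# Between-study heterogeneity (tau^2) via the DerSimonian-Laird estimator.
q = np.sum(w_fixed * (d - d_fixed) ** 2)
df = len(d) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_random = 1 / (se**2 + tau2)                  # random-effects weights
d_random = np.sum(w_random * d) / np.sum(w_random)
se_random = np.sqrt(1 / np.sum(w_random))
ci = (d_random - 1.96 * se_random, d_random + 1.96 * se_random)

print(f"random-effects average d = {d_random:.2f} (SE = {se_random:.3f})")
print(f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```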

The studies showcased in this special issue on advances in single-case research design and statistical analysis inform and improve the efforts of single-case researchers to conduct rigorous experimental evaluations, and we are optimistic that these findings will also be of interest to the broader neurorehabilitation field across the aims and scope of Neurodevelopmental Rehabilitation, particularly for researchers examining interventions for low-incidence populations or in conjunction with clinical practice, where group experimental design is untenable.Citation44

Additional information

Funding

This work was supported by United States Department of Education, Institute of Education Sciences [grant no. R324B160034].

References

  • Watson JB. Behaviorism. New York, NY: Norton; 1925.
  • Barlow DH, Nock MK, Hersen M. Single case research designs: strategies for studying behaviour change. New York, NY: Allyn and Bacon; 2008.
  • Kazdin AE. Single-case research designs: methods for clinical and applied settings. New York, NY: Oxford University Press; 2010.
  • Bailey JS, Burch MR. Research methods in applied behaviour analysis. 2nd ed. Abingdon, United Kingdom: Routledge; 2017.
  • Kratochwill TR, Levin JR. Single-case intervention research: methodological and statistical advances. Washington, DC: American Psychological Association; 2014.
  • Ledford JR, Gast DL. Single case research methodology. 3rd ed. New York, NY: Routledge; 2018.
  • Gilson CB, Carter EW, Biggs EE. Systematic review of instructional methods to teach employment skills to secondary students with intellectual and developmental disabilities. Res Practice Persons Severe Dis. 2017;42(2):89–107. doi:10.1177/1540796917698831.
  • Wong C, Odom SL, Hume KA, Cox AW, Fettig A, Kucharczyk S, Brock ME, Plavnick PB, Fleury VP, Schultz TR. Evidence-based practices for children, youth, and young adults with autism spectrum disorder: A comprehensive review. J Autism Dev Disord. 2015;45(7):1951–66. doi:10.1007/s10803-014-2351-z.
  • Snyder SM, Ayres K, Sartini EC, Knight VF, Mims PJ. Single case design elements in text comprehension research for students with developmental disabilities. Edu and Training in Autism and Dev Dis. 2017;2(4):405–21.
  • Codding RS, Burns MK, Lukito G. Meta-analysis of mathematic basic-fact fluency interventions: A component analysis. Learn Disabil Res Pract. 2011;26(1):36–47. doi:10.1111/j.1540-5826.2010.00323.x.
  • Morano S, Ruiz S, Hwang JH, Wertalk JL, Moeller J, Karal MA, Malloy A. Meta-analysis of single-case treatment effects on self-injurious behaviour for individuals with autism and intellectual disabilities. Autism Dev Lang Impairments. 2017;2:1–26. doi:10.1177/2396941516688399.
  • Horner RH, Carr EG, Halle J, McGee G, Odom S, Wolery M. The use of single-subject research to identify evidence-based practice in special education. Except Children. 2005;71:165–79. doi:10.1177/001440290507100203.
  • Ninci J, Vannest KJ, Wilson V, Zhang N. Interrater agreement between visual analysts of single-case data: A meta-analysis. Behav Modif. 2015;39(4):510–41. doi:10.1177/0145445515581327.
  • Wolfe K, Seaman MA, Drasgow E. Interrater agreement on the visual analysis of individual tiers and functional relations in multiple baseline designs. Behav Modif. 2016;40(6):852–73. doi:10.1177/0145445516644699.
  • Shadish WR, Kyse EN, Rindskopf DM. Analyzing data from single-case designs using multilevel models: new applications and some agenda items for future research. Psych Methods. 2013;18:385–405. doi:10.1037/a0032964.
  • Joo S, Ferron JM, Moeyaert M, Beretvas N, van den Noortgate W. Approaches for specifying the level-1 error structure when synthesizing single-case data. J Exp Edu. 2017. doi:10.1080/00220973.2017.1409181.
  • Shadish WR, Cook TD, Campbell DT. Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin; 2002.
  • Gage NA, Cook BG, Reichow B. Publication bias in special education meta-analyses. Except Children. 2017;83(4):428–45. doi:10.1177/0014402917691016.
  • Shadish WR, Zelinsky NAM, Vevea JL, Kratochwill TR. A survey of publication practices of single-case design researchers when treatments have small or large effects. J Appl Behav Anal. 2016;49(3):656–73. doi:10.1002/jaba.308.
  • Kratochwill TR, Hitchcock J, Horner RH, Levin JR, Odom SL, Shadish WR. Single-case designs technical documentation. What Works Clearinghouse; 2010. http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf.
  • Tate RL, Perdices M, Rosenkoetter U, McDonald S, Togher L, Shadish W, Horner R, Kratochwill T, Barlow DH, Kazdin A, et al. The single-case reporting guideline in behavioural interventions (SCRIBE) 2016 statement. Arch Scientific Psych. 2016;4:1–9. doi:10.1037/arc0000026.
  • Tate R, Perdices M, McDonald S, Togher L, Rosenkoetter U. The design, conduct and report of single-case research: resources to improve the quality of the neurorehabilitation literature. Neuropsychol Rehabil. 2014;24(3–4). doi:10.1080/09602011.2013.875043.
  • Ledford JR, Lane JD, Severini KE. Systematic use of visual analysis for assessing outcomes in single case design studies. Brain Impairment. 2017;19(1):4–17. doi:10.1017/BrImp.2017.16.
  • Wolfe K, Seaman MA, Drasgow E, Sherlock P. An evaluation of the agreement between the conservative dual-criterion method and expert visual analysis. J Appl Behav Anal. 2018;51(2):345–51. doi:10.1002/jaba.453.
  • Wolfe K, Slocum TA. A comparison of two approaches to training visual analysis of AB graphs. J Appl Behav Anal. 2015;48(2):472–77. doi:10.1002/jaba.212.
  • Young ND, Daly EJ. An evaluation of prompting and reinforcement for training visual analysis skills. J Behav Educ. 2016;25:95–119. doi:10.1007/s10864-015-9234-z.
  • Kratochwill TR, Levin JR. Enhancing the scientific credibility of single-case intervention research: randomization to the rescue. Psych Methods. 2010;15(2):124–44. doi:10.1037/a0017736.
  • Levin JR, Evmenova AS, Gafurov BS. The single-case data-analysis ExPRT. In: Kratochwill TR, Levin JR, eds. Single-case intervention research: methodological and data-analysis advances. Washington, DC: American Psychological Association; 2014.
  • Levin JR, Ferron JM, Gafurov BS. Additional comparisons of randomization-test procedures for single-case multiple-baseline designs: alternative effect types. J Sch Psychol. 2017;63:13–34. doi:10.1016/j.jsp.2017.02.003.
  • Shadish WR, Hedges LV, Pustejovsky JE. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: A primer and applications. J Sch Psychol. 2014;52:123–47. doi:10.1016/j.jsp.2013.11.005.
  • Kratochwill TR, Levin JR, Horner RH. Negative results: conceptual and methodological dimensions in single-case intervention research. Remedial Spec Edu. 2017;39(2):67–76. doi:10.1177/0741932517741721.
  • Ferron JM, Levin JR. Single-case permutation and randomization statistical tests: present status, promising new developments. In: Kratochwill TR, Levin JR, eds. Single-case intervention research: methodological and data-analysis advances. Washington, DC: American Psychological Association; 2014.
  • Levin JR, Ferron JM, Gafurov BS. Improved randomization tests for a class of single-case intervention designs. J Mod App Stat Methods. 2014;13(2):2–52. doi:10.22237/jmasm/1414814460.
  • Manolov R, Solanas A. A comparison of mean phase difference and generalized least squares for analyzing single-case data. J Sch Psychol. 2013;51(2):201–15. doi:10.1016/j.jsp.2012.12.005.
  • Manolov R, Rochat L. Further developments in summarizing and meta-analyzing single-case data: an illustration with behavioral interventions in acquired brain injury. Neuropsychol Rehabil. 2015;25(5):637–62. doi:10.1080/09602011.2015.1064452.
  • Solanas A, Manolov R, Onghena P. Estimating slope and level change in N = 1 designs. Behav Modif. 2010;34(3):195–218. doi:10.1177/0145445510363306.
  • Hedges LV, Pustejovsky JE, Shadish WR. A standardized mean difference effect size for single case designs. Res Synth Methods. 2012;3(3):224–39. doi:10.1002/jrsm.1052.
  • Hedges LV, Pustejovsky JE, Shadish WR. A standardized mean difference effect size for multiple baseline designs across individuals. Res Synth Methods. 2013;4(4):324–41. doi:10.1002/jrsm.1086.
  • Foxx RM, Azrin NH. The elimination of autistic self-stimulatory behaviour by overcorrection. J Appl Behav Anal. 1973;6(1):1–14. doi:10.1901/jaba.1973.6-1.
  • Foxx RM. Attention training: the use of overcorrection avoidance to increase the eye contact of autistic and retarded children. J Appl Behav Anal. 1977;10(3):489–99. doi:10.1901/jaba.1977.10-489.
  • Ninci J, Lang R, Davenport K, Lee A, Garner J, Moore M, Boutot A, Rispoli M, Lancioni G. An analysis of the generalization and maintenance of eye contact taught during play. Dev Neurorehabil. 2013;16(5):301–07. doi:10.3109/17518423.2012.730557.
  • Carr ME, Moore DW, Anderson A. Self-management interventions on students with autism: A meta-analysis of single subject research. Except Children. 2014;81(1):28–44. doi:10.1177/0014402914532235.
  • Shogren KA, Faggella-Luby M, Bae SJ, Wehmeyer ML. The effect of choice-making as an intervention for problem behaviour: A meta-analysis. J Pos Behav Interv. 2004;6(4):228–37. doi:10.1177/10983007040060040401.
  • Tate RL, McDonald S, Perdices M, Togher L, Schultz R, Savage S. Rating the methodological quality of single-subject designs and n-of-1 trials: introducing the Single-Case Experimental Design (SCED) Scale. Neuropsychol Rehabil. 2008;18(4):385–401. doi:10.1080/09602010802009201.
