Research Article

How to Evaluate a Tailor-made Social Work Intervention? Some Practice-Based Solutions with Single-Case Designs


ABSTRACT

Purpose

With the increased attention to the principles of evidence-based practice (EBP), social workers are challenged to adapt their daily interventions accordingly when treating clients. They usually work with individual clients, all with their own specificities. Single-Case Experimental Designs (SCEDs) can be used to inform a social worker about the effectiveness of an intervention at the individual client level. In everyday social work practice, however, it is difficult to meet the methodological requirements of SCEDs for finding causal explanations. A concern is that, in most situations, repeated measurements prior to the intervention are required. This study aims to provide researchers with alternatives to repeated measurement when using the logic of SCEDs to apply EBP in their everyday practice.

Methods

In this study, we reviewed single-case designs published between January 1 and December 31, 2019, focusing on the types of SCEDs used in the social domain and on how baseline conditions were handled.

Results

SCEDs and quasi-experimental alternatives are rarely published for situations in which baseline data are not available. Four underused quasi-experimental strategies that can be employed when repeated measurement during baseline is not possible are: retrospective baselines, theoretical inference, multiraters, and triangulation with qualitative data.

Discussion and Conclusion

The suggestions to work with single-case designs with quasi-experimental elements are meant to enable social workers to evaluate their interventions in a way that goes beyond mere narrative evaluations of how an intervention was experienced.

With the increased attention to the working principles of evidence-based practice (EBP), social workers are challenged to tailor their daily interventions to the needs of their clients (Mullen et al., Citation2008; Sackett et al., Citation2000). EBP implies that interventions are based on the best available evidence, clinical expertise and client preferences (Sackett et al., Citation2000). The EBP decision-making process is generally described in five steps: (1) formulating an answerable practice question; (2) searching for the best research evidence; (3) critically appraising the research evidence; (4) selecting the best intervention after integrating the research evidence with clinical expertise and client characteristics, preferences and values; and (5) evaluating practice decisions (Parrish & Rubin, Citation2011; Plath, Citation2014; Sackett et al., Citation2000).

Social workers must deal with individuals in need of help who each have their own specific needs, which means that tailor-made interventions at an individual level are necessary. A critique of applying EBP in social work has been that social workers have little time to search and read the literature in order to select interventions, and limited autonomy to change their interventions within an organizational context (Nutley et al., Citation2009). Social workers are primarily concerned with the wellbeing of their clients and ways to enhance that in practice. Therefore, it is important that organizations take responsibility for facilitating the EBP process for individual practitioners (Plath, Citation2013; Van der Zwet et al., Citation2020). A Single-Case Experimental Design (SCED) is an appropriate method to serve both goals (Kazdin, Citation2011). The design can be used, following the five steps of EBP, to investigate whether the interventions carried out in practice lead to behavioral change in individual clients. At the same time, SCEDs serve social workers in finding out whether their interventions played out as they hoped for their clients.

The principles of performing SCEDs are complementary to EBP in social work, because both deal with the local and individual level. In social work, there might be a request for help or a behavioral problem that needs attention. The social worker then must pinpoint the answerable practice question (step 1). Next, the social worker and the client come up with an action plan (or intervention) that might help in this situation, based on the integration of available research evidence with clinical expertise and client characteristics, preferences and values (steps 2–4). In an SCED, the need, problem or behavior is compared between phases within the subject(s), namely phases without an intervention (A-phases) and/or phases with an intervention (B- and C-phases). The simplest design is an AB-design. For example, a social worker treats a person with substance abuse. After considering clinical expertise, treatment options and the client's preferences, the social worker, together with the client, decides to use motivational interviewing to address the substance abuse. An AB-design is chosen to monitor the target behavior (i.e., substance abuse) before and after the intervention. Whether the substance abuse decreased over time is put to the test. In this way, the SCED methodology constitutes part of the evaluation of the intervention (step 5).
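To make the comparison between phases concrete, the sketch below shows one way such AB-phase data could be summarized. It is a minimal illustration with hypothetical weekly scores and a simple nonoverlap summary of our own choosing; it is not data or code from any of the studies discussed here.

```python
# Minimal sketch of summarizing an AB-design (hypothetical data, not from any study).
# Target behavior: weekly number of substance-use days (lower is better).

baseline = [6, 5, 6, 7, 6]         # A-phase: repeated measurements before the intervention
intervention = [5, 4, 3, 3, 2, 2]  # B-phase: measurements during the intervention

def phase_mean(scores):
    return sum(scores) / len(scores)

def nonoverlap(a_phase, b_phase):
    """Share of (A, B) pairs in which the B-phase value is an improvement
    (here: a lower score). A value of 1.0 means the phases do not overlap at all."""
    pairs = [(a, b) for a in a_phase for b in b_phase]
    improved = sum(1 for a, b in pairs if b < a)
    return improved / len(pairs)

print(f"Baseline mean:       {phase_mean(baseline):.2f}")
print(f"Intervention mean:   {phase_mean(intervention):.2f}")
print(f"Nonoverlap (A vs B): {nonoverlap(baseline, intervention):.2f}")
```

In practice, such numerical summaries would complement, not replace, the visual inspection of phase data that is customary in SCEDs.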

SCEDs can thus be used to evaluate the effectiveness of interventions in a specific client or a small group of clients. The outcome serves as an indicator for the social worker and the client to determine whether the intervention has achieved the intended result. Afterward, a decision can be made to continue, change or withdraw the intervention. SCEDs can be used when standardized interventions are performed to assess whether the specific individual is responding as expected. But in some cases, following step 4 of the EBP model, the standardized intervention needs to be adapted to the specificities of the client. In such situations, SCEDs help to provide some first evidence for the new tailor-made intervention.

The What Works Clearinghouse (WWC) standards are recommended to adhere to when using SCEDs (Kratochwill et al., Citation2013). These standards describe the conditions that must be met for drawing valid conclusions from SCEDs: systematically manipulating the intervention conditions, including multiple assessors to obtain interobserver agreement, including a minimum number of replications, and including a minimum number of data points within phases.

Systematic manipulation means that the researcher should be able to change the conditions in the SCED in a systematic and planned way. For example, if the intervention is a type of therapy, when the therapy will be administered, for how long and how often, must be part of the study protocol. Tate et al. (Citation2016) emphasize that SCEDs should be reported with a detailed description of the intervention(s) in the design. Specifically, the intervention intensity should be described with respect to the appearance of the intervention sessions, the content of the sessions, the frequency of the intervention sessions and the total duration of the intervention. The researcher must ensure that the therapy is delivered consistently to eliminate any extraneous variables that could affect the results. To ensure that the intervention is delivered consistently and as intended, researchers must monitor fidelity. This involves assessing the degree to which the intervention is implemented according to the established plan.

Replication is central to the third standard of the WWC. A single finding of an effect is not very convincing, because it can also be caused by chance or natural growth. Sometimes it is possible to replicate findings within one and the same individual, for example, when conditions can be alternated within the individual case during a study. Suppose a student with concentration problems may improve his performance on exams by wearing noise-canceling headphones. To evaluate the effect of this intervention, one can repeatedly test the performance with and without the headphones. The headphone intervention is reversible, meaning that the student's performance will likely return to its previous level when the headphones are not used. However, the majority of interventions by social workers are irreversible because they target a learning effect that lasts within the subject after withdrawal of the intervention. For example, when a social worker applies a series of motivational interviewing sessions to reduce substance abuse, it is expected that after the intervention the client will continue to resist the abuse. This means replication in similar cases or behaviors is required to comply with the WWC guidelines.

With respect to the minimum number of data points, continuous measurement of a target behavior across the different phases of the design is important to comply with the fourth standard of the WWC. Consider a social worker who advises a refugee in the Netherlands to go to a language café at the local library to increase his Dutch vocabulary by talking to volunteers. The social worker assesses the progress weekly and notes after a few weeks that his vocabulary has improved. While the encounters in the library may be responsible for the progress, it is also possible that the refugee picks up words in everyday life. To control for alternative explanations, vocabulary should also be monitored several times before the refugee starts attending the library. Continuous and multiple measurements during baseline and treatment phases are thus crucial for drawing causal conclusions from an SCED.

However, getting baseline information from clients is often impossible. In many social work situations, including a baseline phase can be unethical or impractical (Kazdin, Citation2011). Postponing interventions to obtain baseline data may endanger or negatively affect clients. Withholding interventions longer than necessary can negatively affect the client's trust and expectations of the social worker (Ledford & Gast, Citation2014). The working alliance between a social worker and a client is an important factor in the effectiveness of any intervention (Wampold, Citation2015). Besides, the social worker would be bound not to influence the behavior and to maintain the status quo when confronted with a client's need. Research designs with high validity on paper reflect a different reality from what social workers encounter in daily practice. This reinforces the need for alternative research designs that are easy to perform (Rubin et al., Citation2016).

While the methodology of performing SCEDs is a promising and feasible format for working according to EBP in the domain of social work, there are practical limitations to its use in practice. Although different manuals on SCEDs acknowledge that there are practical problems in implementing SCEDs in applied settings (Barlow et al., Citation2009; Engel & Schutt, Citation2012; Satake et al., Citation2008), they provide little information about what to do if baseline measurements are not feasible.

The present study aims to discuss research designs that allow social workers to assess the effectiveness of their interventions in individual clients when baselines are lacking. We first review what types of single-case designs are used in social work and related domains. We summarize how control conditions were handled in the reviewed designs. Subsequently, we propose four possible strategies to include a control condition when baselines with repeated measurement are not available to the social worker.

Materials and method

To get an impression of what types of single-case designs might be applied by social workers, we reviewed Single-Case (Quasi-)Experimental Designs (SC(Q)EDs) in the behavioral sciences and the social domain.

Search strategy

This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines (Moher et al., Citation2009). A systematic search was made for all published studies in the social domain using an SC(Q)ED between January 1 and December 31, 2019. The search was limited to the most recent year that was fully available to us, to give a comprehensive overview of the state of the art. Each design with at least a control and experimental condition was included. Studies that were included generally met the WWC-guidelines for SCEDs with or without reservations (Kratochwill et al., Citation2013). While many variations on SCEDs are possible, the basic designs comprise Reversal/ABAB-designs (RDs), Multiple Baseline/Probe Designs (MBDs) and Alternate Treatment Designs (ATDs). In RDs, the effect of an intervention is examined by alternating the control condition with the intervention condition, with both conditions occurring at least twice in one individual case. In MBDs, the effect is examined by introducing the intervention in a small group of comparable individuals, settings or behaviors. In ATDs, the control condition and two or more experimental conditions are alternated over time within the same individual case. This is used when the effectiveness of different interventions is compared (Kazdin, Citation2011). We also included quasi-experimental studies that failed to meet the WWC standard with respect to sufficient replications, such as non-replicated AB-designs, without or with a return to baseline (ABA) or with a follow-up phase (ABB’).

Eligibility criteria

Using the database systems of EBSCO and Web of Science, a computer search of the literature was conducted, using any combination of the words “Single Case Design.” To exclude designs that were not single-case experimental designs and to include designs that were named differently, the search was refined with: “Single Case,” “N = 1,” “N-of-1,” “Multiple Baseline,” “Reversal Design,” “Alternate Treatment Design,” “Single Subject,” “Single-Subject.”

Selection process

Figure 1 contains a flowchart of the search procedures. Appropriate articles were obtained by searching for each of the search terms listed in both EBSCO and Web of Science. We excluded studies that did not follow an SC(Q)ED procedure. Of the 1,114 initially identified studies, 826 were “single-center case series” and were excluded. An additional 68 studies failed to qualify as SC(Q)EDs after a first screening. After reading the abstracts of the remaining 220 studies, another 147 studies were excluded because they were not SC(Q)EDs (N = 73), not related to the social domain (N = 59), not available in full text (N = 13), or written only in French (N = 1) or Chinese (N = 1).

Figure 1. Flowchart of search procedure with reasons for exclusion of studies and the total number of studies included.


Data abstraction

We classified the research designs and assessed (1) the type of SCED; (2) the target behavior of the intervention; (3) the intervention; (4) the minimum number of control condition data; (5) whether the designs contained additional elements to strengthen the design explicitly in the method-section or implicitly in the discussion of the study, such as qualitative or systemic elements; and (6) the number of cases in the design(s). When an article reported more than one SCED, all SCEDs were retrieved. The elements were listed in Table A1 (Appendix A).

Results

In the 73 included studies, five different types of SC(Q)EDs were used: AB/ABA/ABB’ designs (11), Alternate Treatment Designs (ATDs, 8), Reversal/ABAB-Designs (RDs, 11), Multiple Probe Designs (MPDs, 4) and Multiple Baseline Designs (MBDs, 46). Seven studies contained both an MBD and an ATD/RD. In AB-designs, a period without intervention is followed by a period with intervention within a participant. During both phases, some target behavior is assessed and compared between phases. In this design, it is not possible to replicate effects. The design is sometimes extended to an AB design with a follow-up phase (in which the effect of B should persist) or an ABA design (in which a return to A would be expected). Such designs are considered non-experimental because at least three replications of an effect are needed to meet the standards (Kratochwill et al., Citation2013). In RDs and ATDs, at least three replications of an effect are tested in the individual participant. In RDs, the treatment is withdrawn and started several times within the design (e.g., an ABAB-design), while in ATDs, two or more treatment conditions interchange. In multiple baseline designs across participants, a small number of participants are monitored during at least one baseline phase (A) and at least one intervention phase (B). MBDs have the advantage over RDs and ATDs that showing replications of an effect of the intervention within a single participant is not necessary; the replication of the effect is shown in comparable cases instead. Sometimes MBDs can be used within a single participant, in the case of multiple baselines across behaviors or settings. In MBDs, measurements are taken continuously during baseline; in MPDs, the continuous measurement is replaced with probes.

The relatively low number of ATDs and RDs (19 of the 73 designs, 26%) points to the practical difficulty of replicating effects within the participant in the social domain. A motivational interview cannot be undone, the effect of a training cannot be reset, a new way of financial coping cannot be unlearned immediately. Hence, MBDs are usually used with a small number of comparable participants with one baseline and one intervention phase, while interventions are not reversed. In 11 of the 19 studies with a single participant, there was no replication of an effect, but only an AB or ABA design. These designs qualify as quasi-experimental or non-experimental designs. Of the published SC(Q)EDs in the social domain with an individual participant, less than half of the designs met the WWC standard of replications.

Meeting the WWC standards requires including a minimum of five measurements in each phase of the SCED, or at least three to meet the standards with reservations (Kratochwill et al., Citation2013). Three of the 73 studies reported a baseline period in which at least one of the study participants had fewer than three measurements. Two of the three studies explicitly stated that the study was a mixed-methods study and that sources outside the SC(Q)ED were used to obtain better internal validity (Maree, Citation2019; Ramos & Ramos, Citation2019). When specific clients within studies had a short baseline period, this was reported as unintended and as a limitation of the study. For example, in a study on the effectiveness of an app to improve memory in elderly people with dementia, one participant was so motivated to use the app that it was considered unethical to continue the baseline measurement (McGoldrick et al., Citation2019). In another study, play therapy was introduced to children earlier because of teachers’ growing concern about the problematic behaviors (Dillman Taylor et al., Citation2019). And in a study among smokers, one participant stopped smoking before the intervention started; he was so motivated after enrolling in the program that he could not wait for the baseline phase to end (Wen et al., Citation2019). In these cases, the baseline principle was violated, but from a treatment perspective there was a positive effect. In those studies, the short baseline phases are reported as a methodological problem that endangers the validity of the outcomes. As a result, clients were removed from the study and/or conclusions were drawn with modesty. The examples show how clinically relevant outcomes are sometimes excluded because of research standards.

From this literature review, we may conclude that SC(Q)EDs are frequently applied research designs in the social domain, mainly used when repeated measurements at baseline can be successfully collected or when reversal of the intervention is possible. In situations where reversal of the intervention and continuous measurement at baseline are not possible, SC(Q)EDs are hardly present in our sample of studies. Two of the 73 studies explicitly replaced continuous measurement with other sources of baseline data. This means that currently published SC(Q)EDs rarely consider situations where baseline data cannot be collected. The methodology of performing SC(Q)EDs to evaluate interventions in individual clients is promising for social workers who are trying to embed EBP in their everyday work, but they have to deal with situations where it is not possible to adhere to the WWC standards. In this paper, we discuss four possible strategies social workers may consider when baseline data are impossible or impractical to collect: retrospective baseline data, theoretical inference, social networks as multiraters of behavior, and triangulation of results by mixing qualitative data into the design.

Retrospective baseline data

If baseline data cannot be collected prospectively before starting to supervise a client, one may consider reconstructing the baseline from existing records or retrospective interviews. Retrospective baselines can be collected from previous records, such as school records, medical charts, or other documentation. For example, a school has records of its students that contain information about, e.g., past behavior, absenteeism, deviant behavior, alcohol and drug abuse, and violent behavior. One is of course limited to the available information (Engel & Schutt, Citation2012).

Strikingly, none of the selected studies in the review reconstructed baseline data from records, memory or otherwise. Memory data were used in a study among persons with a smoking addiction, but not to construct baseline data (Wen et al., Citation2019). At each intervention visit, the counselor first held a timeline follow-back interview to assess the participant’s smoking frequency since the last visit. Follow-back interviews could also be a viable technique if one does not have the opportunity to collect baseline data before starting an intervention.

Data based on reports are not always completely reliable, and memory can also be sensitive to measurement errors, which can make reconstructions unreliable (Thigpen, Citation2019). However, extreme accuracy in a reconstruction is not always necessary. The primary functions of a baseline measurement are to assess the severity of the behavior to be treated and to predict future levels without social work intervention (Engel & Schutt, Citation2012). The assessment function is established with at least one baseline measurement, since information about the severity of the problem is available at the start of the study. For the predictive function, it is crucial to know in advance whether the social problem is expected to remain the same without intervention. Follow-back interviews can expand on this information by asking about variations in severity to get an idea of the variability of the problem behavior. Baseline reconstruction should focus on examining how the social problem has evolved in recent history: whether it has been stable or has recently changed. People and records may not be completely accurate, but people should be able to remember whether their behavior has improved or worsened. When past behavior is obtained in this way, the social worker has a rough idea of expected future behavior. If the problem behavior was stable before the intervention phase, a positive change during the intervention phase would support the effectiveness of the intervention. If the problem behavior worsened during the baseline, stability or improvement during the intervention phase would support the intervention. If the behavior was already improving, the necessity of an intervention is questionable and it is hard to attribute any further improvement to the intervention. Figure 2 illustrates the expected data patterns in three different scenarios and how empirical data can be compared to extrapolated data patterns based on retrospective information about the improvement, decline, or stability of the behavior before the intervention started.

Figure 2. Examples of retrospectively acquired baselines showing worsening, stable, and improving behavior before the intervention, which inform the predicted behavior during the intervention. The light grey line during the baseline denotes the retrospectively constructed baseline. The actual findings in the intervention phase (in black) can be compared to the predicted grey linear extension of the baseline.

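As a concrete illustration of this comparison, the sketch below extends a retrospectively reconstructed baseline trend into the intervention phase and contrasts it with the observed scores. It is a minimal sketch with hypothetical values and a simple linear extrapolation; it is not code or data from the reviewed studies, and other ways of projecting the baseline are equally conceivable.

```python
# Minimal sketch of the comparison illustrated in Figure 2 (hypothetical values).
# A retrospectively reconstructed baseline trend is extended into the intervention
# phase and compared with the scores actually observed during the intervention.

# Retrospective baseline: problem severity (0-10) reconstructed for the four weeks
# before the intervention started (roughly stable in this example).
retro_weeks = [-4, -3, -2, -1]
retro_scores = [7, 8, 7, 8]

# Scores observed during the intervention phase (weeks 1-5).
obs_weeks = [1, 2, 3, 4, 5]
obs_scores = [7, 6, 5, 5, 4]

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for a simple linear trend."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

slope, intercept = linear_fit(retro_weeks, retro_scores)

# The "grey line" of Figure 2: the baseline trend projected into the intervention weeks.
projected = [intercept + slope * w for w in obs_weeks]

for week, observed, predicted in zip(obs_weeks, obs_scores, projected):
    print(f"week {week}: observed {observed}, projected without intervention "
          f"{predicted:.1f}, difference {observed - predicted:+.1f}")
```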

Theoretical inference

If a social worker can only obtain information about the current state of the severity of the problem behaviors and has no possibility to collect retrospective data, one can still have a theoretical idea of the expected future behavior of the client without intervention. In some cases, empirically assessing the pattern in the baseline behavior is superfluous, for example, when training memory skills in degenerative diseases such as dementia.

According to Horner and Baer (Citation1978), when SCEDs are employed to find functional relations between an intervention and an outcome, there are three situations where the baseline could or should be completely replaced by probes: (1) when the baseline is reactive, (2) impractical, or (3) unnecessary (strong a priori assumptions of baseline stability can be made).

Following this line of thought, when a client engages in a certain type of behavior, the prediction of future behavior does not necessarily need to be empirically based on past or current behavior if there is a clear theoretical idea about the future prospect of the client’s behavior. If no improvement is to be expected based on theoretical evidence about the type of behavior, one might as well evaluate the intervention after a single measurement of current behavior. People who must deal with long-term financial problems, addictions or anxiety, people who stutter, and youth with structural behavioral problems might be examples of cases in which this could apply, especially when previous forms of treatment have had no effect. In such a case, a single measurement at the start of the intervention is sufficient to determine whether there is a functional relationship between the intervention and a change in behavior.

Multiraters

When baseline measurements are not feasible, in some social work situations they can be replaced by the judgment of multiple raters (Spreen et al., Citation2010). Rather than prospectively monitoring social problems or behaviors several times during a baseline phase, they can be (retrospectively) assessed by multiple raters from the social network of the client. Six studies in the literature review recognized the importance of including network members in the design, but network members were only used on top of a repeated-measurement baseline (Coogle et al., Citation2019; D’agostino et al., Citation2020; Lim et al., Citation2019; McGoldrick et al., Citation2019; Raulston et al., Citation2019; Tsao, Citation2019). When multiraters are used, multiple informants should assess the same behavior in all phases of a design. In Figure 3, this is illustrated for the case of five multiraters who assess the situation pre- and post-intervention. Repeating measurement in this way has the advantage of getting an idea of the possible variability of the behavior of the client and brings several “expert opinions” about the behavior of the client into the design.

Figure 3. An example of a single pre-assessment among five multiraters before the intervention, followed by post-assessments on two occasions during/after the intervention.


Engel and Schutt (Citation2012) name, as one of the possible variations of MBDs, testing the effect across different settings: that is, introducing an intervention in phases in different contexts, such as at school, at home and while playing with friends. The methodological problem could be that introducing the intervention in one setting also influences the behavior in another setting. With multiple raters, however, this is seen as an essential feature of the design: the effect of the intervention is not replicated across settings but is to be confirmed from multiple perspectives. Especially when social workers have a systemic approach, this “MBD across perspectives” technique might be an obvious choice.
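To make the multirater logic of Figure 3 concrete, the sketch below summarizes hypothetical ratings from five network members across one pre-intervention and two post-intervention occasions. The ratings and the choice of summary are illustrative assumptions, not a procedure prescribed by the reviewed studies.

```python
# Minimal sketch of the multirater logic illustrated in Figure 3 (hypothetical ratings).
# Five members of the client's network rate the same problem behavior (0-10, higher
# is more severe) once before the intervention and on two occasions during/after it.

ratings = {
    "pre-intervention": [8, 7, 9, 8, 7],
    "post-assessment 1": [6, 6, 7, 5, 6],
    "post-assessment 2": [4, 5, 5, 4, 4],
}

for occasion, scores in ratings.items():
    mean = sum(scores) / len(scores)
    # The spread across raters stands in for the variability that repeated
    # measurement over time would otherwise provide.
    print(f"{occasion}: mean {mean:.1f}, range {min(scores)}-{max(scores)} "
          f"across {len(scores)} raters")
```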

Triangulation with qualitative data to control for rival hypotheses

The problem with SC(Q)EDs having few baseline measurements is that an improvement in the problem behavior might have been caused by something other than the intervention. It might have been natural growth, an external event or coincidence that caused the change. Without baselines, it is more difficult to exclude alternative explanations of the behavioral change. An alternative can be to collect qualitative material in addition to quantitative material. The combination of both types of material can serve several goals, including using results from one method to support conclusions of the complementary method (Creswell & Plano Clark, Citation2017; Curry et al., Citation2009). Quantitative research is typically deductive, occupied with testing a certain theoretical hypothesis. Qualitative research is typically an inductive approach in which theory is formulated from the lived experience of participants in a study. Quantitative analyses render a statistical association, while qualitative analyses suggest the theoretical mechanisms behind it. By collecting data on the lived experience of the participants in the interventions, control for alternative explanations is included in the design. Member checks are also a commonly used strategy to validate research results (Thomas, Citation2017).

Although most SC(Q)EDs in the literature review did not include qualitative data, some (N = 4) did explicitly frame the design as mixed methods (Campbell et al., Citation2019; Maree, Citation2019; Ramos & Ramos, Citation2019; Spence et al., Citation2019). Of those four studies, two should be considered as “not meeting standards” under the current WWC criteria, due to violations of the number of replications and the number of measurements during the phases. Still, by using triangulation in the design, different explanations of a perceived difference between baseline and intervention were dealt with. The qualitative data give more insight into the mechanisms of change. Campbell et al. (Citation2019) illustrate this: they transcribed interviews of a practitioner and her patient to obtain information about how the client experienced the interventions, using interpretative phenomenological analysis (Smith & Osborn, Citation2015), and state: “Patients’ experiences and the corresponding disability or ill-being resultant of chronic pain are important in further delineating the efficacy of chronic pain management treatment methods. […] Situating and understanding a participant’s experiences and meaning making can help to interpret findings from quantitative outcomes” (Campbell et al., Citation2019, p. 6).

Jackson and Hanline (Citation2020) interviewed children and their mothers about the importance of the interventions in their SCED. Mothers were asked to watch video clips of their children reading texts and to reflect on their understanding of their children’s behaviors. Although this study was not explicitly called a mixed-methods study, the qualitative findings helped confirm or refute the hypotheses in the study and rule out alternative hypotheses by asking participants and an important member of their social network what they thought had happened. Lester and Murrell (Citation2019) emphasized the importance of an exit interview in the discussion of their study on the effects of mindfulness interventions for college students with ADHD, stating: “future studies may benefit from incorporating a formal ‘exit interview’ to better understand which components of the intervention were most favored and/or useful for the student. Although some of this data was gathered informally, it was not collected systematically from all participants” (Lester & Murrell, Citation2019, p. 216). Several other studies in the review included qualitative data to obtain an indication of the social validity of the data, but the qualitative data were not used to confirm the conclusions of the quantitative part about the functional relationship between the intervention and an outcome. These studies included qualitative data but did not describe it in the method section of the study (Keyes et al., Citation2019), collected qualitative data only at the request of the practitioner (Peyroux & Franck, Citation2019), or presented the analysis of the qualitative data elsewhere, separated from the findings of the quantitative SC(Q)ED (Meinzen Derr et al., Citation2019).

Mixed methods in an SC(Q)ED do not necessarily have to be limited to a combination with qualitative methods. For example, SC(Q)EDs could be part of a larger Randomized Controlled Trial (RCT), or several SC(Q)EDs can be aggregated in a group design after collecting data in numerous cases. With respect to the use of tailored interventions, however, the use of qualitative data to strengthen the validity of the outcomes seems the most feasible option for including mixed methods in an SC(Q)ED.

Discussion

To incorporate evidence-based practice in the field of social work, social workers need methods that they can “easily” apply in their everyday practice. In evaluating treatments, action plans or interventions in single clients, the methodology of performing SCEDs provides social workers with a guideline for working according to EBP with their clients.

However, a central requirement of properly employing SCEDs is that the severity of the social problem or behavior must be measured repeatedly during a baseline period. Another requirement of working in line with the methodology of SCEDs is that effects must be replicated at least three times. Both requirements are often problematic in social work practice, implying that the empirical context in which social workers operate is not covered by the WWC standards for evaluating effects. Not taking baseline data into account weakens the causal evidence. Hser et al. (Citation2001) note that in intervention research there are few studies using SCEDs, because it is hard to reach the minimum required number of measurements in each phase, or because multiple participants must be included in an MBD.

In this study, we provided four possible strategies that social workers may follow, for situations where baseline measurements are not possible or desirable, when they want to work according to EBP and evaluate their interventions using the systematic approach of SCEDs. From the literature review, we signal that such strategies are currently almost absent from published SC(Q)EDs in the behavioral sciences and the social domain. Retrospective measurement, using multiraters, theoretical inference and the addition of qualitative data can serve to increase the quality of the evaluation of tailor-made interventions or interventions for specific clients. These can be valuable additions for evaluating tailor-made interventions in the everyday practice of social work. When social workers involve the network of their clients in their interventions, they can give network members a role as raters of current or past behavior without much effort. Also, qualitative data, such as exit interviews and reports written during the process, are for a large part already obtained through the regular working process.

The literature search was limited to one year (2019). We sought to provide an overview of the current use of SC(Q)EDs, with examples closely related to social work practice, and included all SC(Q)EDs from the social and behavioral sciences. Not all of the examples from the review are directly transferable to social work practice. For example, the specific demands of a study about communication technology for children may not relate to everyday social work (Meinzen Derr et al., Citation2019). On the other hand, we excluded designs that were not available in English (studies in French and Chinese were excluded), that were otherwise inaccessible to us, or that were published before or after 2019. Also, we might have failed to identify SC(Q)EDs that were part of a larger setup. Yet, the methodological choices made within the designs in the review can be of value for the social worker, and the review served as an indication. It would be valuable to revisit the methodological aspects of SC(Q)EDs every now and then and to monitor new methodological ideas that address the challenges that arise in applied settings.

The WWC standards related to SCEDs acknowledge the value of SCEDs in contributing to EBP, but they focus mainly on advancing scientific knowledge (Kratochwill et al., Citation2013). EBP also entails the evaluation of practice decisions. If SCEDs are used to evaluate practice decisions that have already been shaped by the best available research evidence, the demands for causal inference might be too strict for evaluating daily practice. Clinical decisions to initiate or modify treatment might differ from research decisions when experimental control is involved.

In providing new strategies for SC(Q)EDs, as practice demands, and in reviewing the methodological aspects of published SC(Q)EDs, we focused on two of the four WWC standards for SCEDs (Kratochwill et al., Citation2013), namely those concerning the number of replications and the number of data points in the design phases. These standards give rise to challenges in the assessment process. However, it is also important to consider how the guidelines regarding control over the experimental conditions and inter-assessor agreement shape SC(Q)EDs in applied social work. For example, the review showed that the MBD (across participants) is the most used design. In this design, the possibility to tailor the intervention to the individual is limited: the intervention must remain the same for each participant. This raises the question of how much individual flexibility an intervention can have under the research standards of experimental control and treatment fidelity.

Part of EBP is monitoring the choices that are made, based on the preferences of the client, the specific situation and the available evidence (Mullen et al., Citation2008). This study has discussed quasi-experimental elements of single-case designs as a method to enable social workers to evaluate their interventions in an EBP way that goes beyond mere narrative evaluations of how an intervention was experienced. Evaluation should primarily help to inform the practitioner, the client and possibly the social network of the client about changes caused by an intervention. It may not always be possible to exclude alternative explanations, but especially for innovative ideas for tailor-made interventions, research strategies that give a quick but solid impression of the effectiveness of an intervention are preferable to implementing those ideas without any form of systematic evaluation. If only fully experimental designs were accepted, social workers who work in an EBP way might have to wait to implement these innovative ideas, which does not reflect the dynamic situation in the field. All the introduced strategies are to be employed with caution, since threats to the internal validity of the designs remain, but in practical settings the emphasis always has to be on service over research.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

References

  • Alfonsson, S., Parling, T., & Englund, J. (2019). Tailored text message prompts to increase therapy homework adherence: A single-case randomised controlled study. Behaviour Change, 36(3), 180–191. https://doi.org/10.1017/bec.2019.10
  • Alsaad, M., McCabe, P., & Purcell, A. (2019). The application of the maximal opposition therapy approach to an Arabic-speaking child. Journal of Communication Disorders, 81, 105–113. https://doi.org/10.1016/j.jcomdis.2019.105913
  • Anderson, H. K., Hayes, S. L., & Smith, J. P. (2019). Animal assisted therapy in pediatric speech-language therapy with a preschool child with severe language delay: A single-subject design. Internet Journal of Allied Health Sciences & Practice, 17(3), 1–7. https://doi.org/10.46743/1540-580X/2019.1770
  • Archambault, C., Mercer, S. H., Cheng, M. P., & Saqui, S. (2019). Lire en francais: Cross-linguistic effects of reading fluency interventions in French immersion programs. Canadian Journal of School Psychology, 34(2), 113–132. https://doi.org/10.1177/0829573518757790
  • Barlow, D. H., Nock, M. K., & Hersen, M. (2009). Single case experimental designs: Strategies for studying behaviours for change. Pearson.
  • Barton, E. E., Gossett, S., Waters, M. C., Murray, R., & Francis, R. (2019). Increasing play complexity in a young child with autism. Focus on Autism and Other Developmental Disabilities, 34(2), 81–90. https://doi.org/10.1177/1088357618800493
  • Barton, E. E., Rigor, M. N., Pokorski, E. A., Velez, M., & Domingo, M. (2019). Using text messaging to deliver performance feedback to preservice early childhood teachers. Topics in Early Childhood Special Education, 39(2), 88–102. https://doi.org/10.1177/0271121418800016
  • Begeny, J. C. (2019). Evaluating contextually adapted reading interventions with third-grade, Costa Rican students experiencing significant reading difficulties. School Psychology International, 40(1), 3–24. https://doi.org/10.1177/0143034318796875
  • Birri, N. L. (2019). A personal narrative intervention for adults with autism and intellectual disability: A single subject multiple baseline design [ Doctoral dissertation, University of Cincinnati]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1539079809808646
  • Bowers, H. M., & Wroe, A. L. (2019). Changing beliefs about emotions in IBS: A single case design. Behavioural and Cognitive Psychotherapy, 47(3), 303–317. https://doi.org/10.1017/S1352465818000589
  • Brodhead, M. T., Kim, S. Y., Rispoli, M. J., Sipila, E. S., & Bak, M. Y. S. (2019). A pilot evaluation of a treatment package to teach social conversation via video-chat. Journal of Autism and Developmental Disorders, 49(8), 3316–3327. https://doi.org/10.1007/s10803-019-04055-4
  • Caballero-Suarez, N. P., Iglesias, M. C., Rodriguez Estrada, E., Reyes Teran, G., & Rosas, A. R. (2019). Effects of cognitive-behavioural therapy on anxiety, depression and condom use in people with HIV in Mexico City: A pilot study. Psychology Health & Medicine, 24(1), 115–125. https://doi.org/10.1080/13548506.2018.1503694
  • Calcagni, N., & Gana, K. (2019). The effectiveness of an “exergame” on perceived stress and self-efficacy in a parkinson’s disease patient: Results of a single case ABAB protocol. Pratiques Psychologiques, 25(2), 205–218. https://doi.org/10.1016/j.prps.2018.11.004
  • Calet, N., Pérez-Morenilla, M. C., & De Los Santos-Roig, M. (2019). Overcoming reading comprehension difficulties through a prosodic reading intervention: A single-case study. Child Language Teaching and Therapy, 35(1), 75–88. https://doi.org/10.1177/0265659019826252
  • Campbell, E., Burger, B., & Ala-Ruona, E. (2019). A single-case, mixed methods study exploring the role of music listening in vibroacoustic treatment. Voices: A World Forum for Music Therapy, 19(2), 27–35. https://doi.org/10.15845/voices.v19i2.2556
  • Caneiro, J. P., Smith, A., Linton, S. J., Moseley, G. L., & O’sullivan, P. (2019). How does change unfold? an evaluation of the process of change in four people with chronic low back pain and high pain-related fear managed with cognitive functional therapy: A replicated single-case experimental design study. Behaviour Research and Therapy, 117, 28–39. https://doi.org/10.1016/j.brat.2019.02.007
  • Caron, E. B., & Dozier, M. (2019). Effects of fidelity-focused consultation on clinicians’ implementation: An exploratory multiple baseline design. Administration and Policy in Mental Health and Mental Health Services Research, 46(4), 445–457. https://doi.org/10.1007/s10488-019-00924-3
  • Clough, A. J., Hilmer, S. N., Naismith, S. L., & Gnjidic, D. (2019). The feasibility of using N-of-1 trials to investigate deprescribing in older adults with dementia: A pilot study. Healthcare (Basel, Switzerland), 7(4), 161. https://doi.org/10.3390/healthcare7040161
  • Collier-Meek, M. A., Sanetti, L. M. H., Levin, J. R., Kratochwill, T. R., & Boyle, A. M. (2019). Evaluating implementation supports delivered within problem-solving consultation. Journal of School Psychology, 72, 91–111. https://doi.org/10.1016/j.jsp.2018.12.002
  • Coogle, C. G., Larson, A. L., Ottley, J. R., Root, A. K., & Bougher-Muckian, H. (2019). Performance-based feedback to enhance early interventionist’s practice and caregiver and child outcomes. Topics in Early Childhood Special Education, 39(1), 32–44. https://doi.org/10.1177/0271121419831414
  • Coogle, C. G., Nagro, S., Regan, K., O’brien, K. M., & Ottley, J. R. (2019). The impact of real-time feedback and video analysis on early childhood teachers’ practice. Topics in Early Childhood Special Education, 41(4). https://doi.org/10.1177/0271121419857142
  • Costello, M. S., Sheibanee, B. D., Ricketts, A., Hirsh, J. L., & Deochand, N. (2019). Exploration of social reinforcement for gambling in single case designs. Analysis of Gambling Behavior, 12(1), 1–21.
  • Creswell, J. W., & Plano Clark, V. L. (2017). Designing and conducting mixed methods research (3rd ed.). SAGE Publications.
  • Curry, L. A., Nembhard, I. M., & Bradley, E. H. (2009). Qualitative and mixed methods provide unique contributions to outcomes research. Circulation, 119(10), 1442–1452. https://doi.org/10.1161/CIRCULATIONAHA.107.742775
  • D’agostino, S., Douglas, S. N., & Horton, E. (2020). Inclusive preschool practitioners’ implementation of naturalistic developmental behavioral intervention using telehealth training. Journal of Autism and Developmental Disorders, 50(3), 864–880. https://doi.org/10.1007/s10803-019-04319-z
  • Davidson, S. J., & O’connor, R. E. (2019). An intervention using morphology to derive word meanings for English language learners. Journal of Applied Behavior Analysis, 52(2), 394–407. https://doi.org/10.1002/jaba.539
  • Dillman Taylor, D., Meany-Walen, K., Nelson, K. M., & Gungor, A. (2019). Investigating group Adlerian play therapy for children with disruptive behaviors: A single-case research design. International Journal of Play Therapy, 28(3), 168–182. https://doi.org/10.1037/pla0000094
  • Engel, R. J., & Schutt, R. K. (2012). The practice of research in social work (3rd ed.). SAGE.
  • Fallon, L. M., Marcotte, A. M., & Ferron, J. M. (2020). Measuring academic output during the good behavior game: A single case design study. Journal of Positive Behavior Interventions, 22(4), 246–258. https://doi.org/10.1177/1098300719872778
  • Gertler, P., & Tate, R. L. (2019). Behavioural activation therapy to improve participation in adults with depression following brain injury: A single-case experimental design study. Neuropsychological Rehabilitation, 31(3), 369–391. https://doi.org/10.1080/09602011.2019.1696212
  • Gullon-Rivera, A. L., Millar, R., & Flemmings, S. (2019). Training parents to create and implement social stories (TM): Promoting social competence in children without disabilities. Family Relations, 68(4), 450–468. https://doi.org/10.1111/fare.12374
  • Hicks, S. (2019). Using the evidence base for social anxiety to understand and treat paranoia: A single case experimental design. The Cognitive Behaviour Therapist, 12, E23. https://doi.org/10.1017/S1754470X19000126
  • Horner, R. D., & Baer, D. M. (1978). Multiple-probe technique: A variation on the multiple baseline. Journal of Applied Behavior Analysis, 11(1), 189–196. Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/16795582
  • Hser, Y., Shen, H., Chou, C., Messer, S. C., & Anglin, M. D. (2001). Analytic approaches for assessing long-term treatment effects. Evaluation Review, 25(2), 233–262. https://doi.org/10.1177/0193841X0102500206
  • Hua, Y., Hinzman, M., Yuan, C., & Langel, K. B. (2020). Comparing the effects of two reading interventions using a randomized alternating treatment design. Exceptional Children, 86(4), 355–373. https://doi.org/10.1177/0014402919881357
  • Hwang, Y., & Levin, J. R. (2019). Application of a single-case intervention procedure to assess the replicability of a two-component instructional strategy. Contemporary Educational Psychology, 56, 161–170. https://doi.org/10.1016/j.cedpsych.2018.10.006
  • Hyppa-Martin, J. K., Stromberg, A. M., Chen, M., & Mizuko, M. I. (2020). Comparing embedded and non-embedded visual scene displays for one adult diagnosed with autism spectrum disorder: A clinical application of single case design. Child Language Teaching & Therapy, 36(1), 3–18. https://doi.org/10.1177/0265659019884111
  • Jackson, E. M., & Hanline, M. F. (2020). Using a concept map with RECALL to increase the comprehension of science texts for children with autism. Focus on Autism and Other Developmental Disabilities, 35(2), 90–100. https://doi.org/10.1177/1088357619889933
  • Joyner, D., Wengreen, H., Aguilar, S., & Madden, G. (2019). Effects of the FIT game on physical activity in sixth graders: A pilot reversal design intervention study. JMIR Serious Games, 7(2), e13051. https://doi.org/10.2196/13051
  • Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings. Oxford University Press.
  • Kellett, S., & Lees, S. (2019). Quasi-experimental N = 1 evaluation of the effectiveness of cognitive analytic therapy for dependent personality disorder. Journal of Psychotherapy Integration, 30(3), 458–475. https://doi.org/10.1037/int0000170
  • Keyes, A., Deale, A., Foster, C., & Veale, D. (2019). Time intensive cognitive behavioural therapy for a specific phobia of vomiting: A single case experimental design. Journal of Behavior Therapy and Experimental Psychiatry, 66, 101–123. https://doi.org/10.1016/j.jbtep.2019.101523
  • Khodabakhshi-Koolaee, A., Akhalaghi-Yazdi, R., & Sayah, M. H. (2019). Investigating gestalt-based play therapy on anxiety and loneliness in female labour children with sexual abuse: A single case research design (SCRD). Journal of Client Centered Nursing Care, 5(3), 147–156. https://doi.org/10.32598/JCCNC.5.3.147
  • Killmeyer, S., Kaczmarek, L., Kostewicz, D., & Yelich, A. (2019). Contingent imitation and young children at-risk for autism spectrum disorder. Journal of Early Intervention, 41(2), 141–158. https://doi.org/10.1177/1053815118819230
  • Knight, R., Davies, R., Salkovskis, P. M., & Gregory, J. D. (2019). CBT with an adolescent with hoarding disorder-a single-case experimental design. International Journal of Cognitive Therapy, 12(2), 146–156. https://doi.org/10.1007/s41811-018-0033-x
  • Kratochwill, T. R., Hitchcock, J. H., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34(1), 26–38. https://doi.org/10.1177/0741932512452794
  • Lapp, M., Corveleyn, X., & Quaderi, A. (2019). Schema therapy in depression of elderly in homecare: Three AB single case experimental design in multiple baseline. Pratiques Psychologiques, 25(2), 183–204. https://doi.org/10.1016/j.prps.2018.12.002
  • Lebrun, C., Gely-Nargeot, M., Geny, S., Rossignol, A., & Bayard, C. (2019). Efficacy of cognitive behavioral therapy for insomnia comorbid to parkinson’s disease: A focus on psychological and daytime functioning with a single-case design with multiple baselines. Journal of Clinical Psychology, 76(3), 356–376. https://doi.org/10.1002/jclp.22883
  • Ledford, J. R., & Gast, D. L. (2014). Single case research methodology (2nd ed.). Routledge. https://doi.org/10.4324/9780203521892
  • Lester, E. G., & Murrell, A. R. (2019). Mindfulness interventions for college students with ADHD: A multiple single case research design. Journal of College Student Psychotherapy, 33(3), 199–220. https://doi.org/10.1080/87568225.2018.1450107
  • Lim, J., McCabe, P., & Purcell, A. (2019). ‘Another tool in my toolbox’: Training school teaching assistants to use dynamic temporal and tactile cueing with children with childhood apraxia of speech. Child Language Teaching & Therapy, 35(3), 241–256. https://doi.org/10.1177/0265659019874858
  • Lopez, M. E., Thorp, S. R., Dekker, M., Noorollah, A., Zerbi, G., Payne, L. A., & Stoddard, J. A. (2019). The unified protocol for anxiety and depression with comorbid borderline personality disorder: A single case design clinical series. The Cognitive Behaviour Therapist, 12, E37. https://doi.org/10.1017/S1754470X19000254
  • Love, H. R., Horn, E., & An, Z. (2019). Teaching observational data collection to early childhood preservice educators. Teacher Education and Special Education, 42(4), 297–319. https://doi.org/10.1177/0888406419836147
  • Lundeborg Hammarström, I., Svensson, R., & Myrberg, K. (2019). A shift of treatment approach in speech language pathology services for children with speech sound disorders - a single case study of an intense intervention based on non-linear phonology and motor-learning principles. Clinical Linguistics & Phonetics, 33(6), 518–531. https://doi.org/10.1080/02699206.2018.1552990
  • Maas, E., Gildersleeve-Neumann, C., Jakielski, K., Kovacs, N., Stoeckel, R., Vradelis, H., & Welsh, M. (2019). Bang for your buck: A single-case experimental design study of practice amount and distribution in treatment for childhood apraxia of speech. Journal of Speech, Language, and Hearing Research: JSLHR, 62(9), 3160–3182. https://doi.org/10.1044/2019_JSLHR-S-18-0212
  • Maree, J. G. (2019). Career construction counselling aimed at enhancing the narratability and career resilience of a young girl with a poor sense of self-worth. Early Child Development and Care, 190(16), 2646–2662. https://doi.org/10.1080/03004430.2019.1622536
  • Martens, K., Barry, T. J., Takano, K., Onghena, P., & Raes, F. (2019). Efficacy of online memory specificity training in adults with a history of depression, using a multiple baseline across participants design. Internet Interventions, 18, 100259. https://doi.org/10.1016/j.invent.2019.100259
  • Martin, R. A., Taylor, W. J., Surgenor, L. J., Graham, F. P., Levack, W. M. M., & Blampied, N. M. (2020). Evaluating the effectiveness of therapeutic horse riding for children and young people experiencing disability: A single-case experimental design study. Disability and Rehabilitation, 42(26), 3734–3743. https://doi.org/10.1080/09638288.2019.1610083
  • McGoldrick, C., Crawford, S., & Evans, J. J. (2019). MindMate: A single case experimental design study of a reminder system for people with dementia. Neuropsychological Rehabilitation, 31(1), 18–38. https://doi.org/10.1080/09602011.2019.1653936
  • Meinzen Derr, J., Sheldon, R. M., Henry, S., Grether, S. M., Smith, L. E., Mays, L., Riddle, I., Altaye, M., & Wiley, S. (2019). Enhancing language in children who are deaf/hard-of-hearing using augmentative and alternative communication technology strategies. International Journal of Pediatric Otorhinolaryngology, 125, 23–31. https://doi.org/10.1016/j.ijporl.2019.06.015
  • Miles, S., Brown, G., Corfe, A., Hallett, C., Wingrove, J., Wheatley, J., & Veale, D. (2019). Time-intensive behavioural activation for depression: A multiple baseline study. Journal of Behavior Therapy and Experimental Psychiatry, 63, 36–47. https://doi.org/10.1016/j.jbtep.2018.12.008
  • Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). PRISMA group: Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. International Journal of Surgery, 8(5), 336–341. https://doi.org/10.1016/j.ijsu.2010.02.007
  • Mullen, E. J., Bledsoe, S. E., & Bellamy, J. L. (2008). Implementing evidence-based social work practice. Research on Social Work Practice, 18(4), 325–338. https://doi.org/10.1177/1049731506297827
  • Nordgren, P. M. (2019). Precursors of language development in ASC: A longitudinal single-subject study of gestures in relation to phonetic prosody. Journal of Intellectual Disabilities, 23(1), 19–38. https://doi.org/10.1177/1744629517710999
  • Nutley, S., Walter, I., & Davies, H. T. O. (2009). Promoting evidence-based practice: Models and mechanisms from cross-sector review. Research on Social Work Practice, 19(5), 552–559. https://doi.org/10.1177/1049731509335496
  • O’Flaherty, C., Barton, E. E., Winchester, C., & Domingo, M. (2019). Coaching teachers to promote social interactions with toddlers. Journal of Positive Behavior Interventions, 21(4), 199–212. https://doi.org/10.1177/1098300719851794
  • Parrish, D. E., & Rubin, A. (2011). An effective model for continuing education training in evidence-based practice. Research on Social Work Practice, 21(1), 77–87. https://doi.org/10.1177/1049731509359187
  • Pechey, R., Jenkins, H., Cartwright, E., & Marteau, T. M. (2019). Altering the availability of healthier vs. less healthy items in UK hospital vending machines: A multiple treatment reversal design. The International Journal of Behavioral Nutrition and Physical Activity, 16(1), 114. https://doi.org/10.1186/s12966-019-0883-5
  • Pérez-Sáez, E., Pérez-Redondo, E., & González-Ingelmo, E. (2020). Effects of dog-assisted therapy on social behaviors and emotional expressions: A single-case experimental design in 3 people with dementia. Journal of Geriatric Psychiatry and Neurology, 33(2), 109–119. https://doi.org/10.1177/0891988719868306
  • Peyroux, E., & Franck, N. (2019). Is social cognitive training efficient in autism? A pilot single-case study using the RC2S+ program. Neurocase, 25(6), 217–224. https://doi.org/10.1080/13554794.2019.1666877
  • Plath, D. (2013). Organizational processes supporting evidence-based practice. Administration in Social Work, 37(2), 171–188. https://doi.org/10.1080/03643107.2012.672946
  • Plath, D. (2014). Implementing evidence-based practice: An organisational perspective. British Journal of Social Work, 44(4), 905–923. https://doi.org/10.1093/bjsw/bcs169
  • Poort, H., Onghena, P., Abrahams, H. J. G., Jim, H. S. L., Jacobsen, P. B., Blijlevens, N. M. A., & Knoop, H. (2019). Cognitive behavioral therapy for treatment-related fatigue in chronic myeloid leukemia patients on tyrosine kinase inhibitors: A mixed-method study. Journal of Clinical Psychology in Medical Settings, 26(4), 440–448. https://doi.org/10.1007/s10880-019-09607-5
  • Qiu, J., Barton, E. E., & Choi, G. (2019). Using system of least prompts to teach play to young children with disabilities. The Journal of Special Education, 52(4), 242–251. https://doi.org/10.1177/0022466918796479
  • Ramos, S., & Ramos, J. A. (2019). Process of change and effectiveness of family constellations: A mixed methods single case study on depression. The Family Journal, 27(4), 418–428. https://doi.org/10.1177/1066480719868706
  • Raulston, T. J., Zemantic, P. K., Machalicek, W., Hieneman, M., Kurtz-Nelson, E., Barton, H., & Frantz, R. J. (2019). Effects of a brief mindfulness-infused behavioral parent training for mothers of children with autism spectrum disorder. Journal of Contextual Behavioral Science, 13, 42–51. https://doi.org/10.1016/j.jcbs.2019.05.001
  • Renvall, K., & Nickels, L. (2019). Using treatment to improve the production of emotive adjectives in aphasia: A single-case study. Aphasiology, 33(11), 1348–1371. https://doi.org/10.1080/02687038.2019.1643001
  • Rochat, L., Manolov, R., Aboulafia-Brakha, T., & Berner-Burkard, C. (2019). Reducing anger outbursts after a severe TBI: A single-case study. Neuropsychological Rehabilitation, 29(1), 107–130. https://doi.org/10.1080/09602011.2016.1270837
  • Rubin, A., Parrish, D. E., & Washburn, M. (2016). Outcome benchmarks for adaptations of research-supported treatments for adult traumatic stress. Research on Social Work Practice, 26(3), 243–259. https://doi.org/10.1177/1049731514547906
  • Ruiz, F. J., Beltrán, D. M. G., & Cifuentes, A. M. (2019). Single-case experimental design evaluation of repetitive negative thinking-focused acceptance and commitment therapy in generalized anxiety disorder with couple-related worry. Revista Internacional de Psicología y Terapia Psicológica, 19(3), 261–276.
  • Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W. M. C., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM. Churchill Livingstone.
  • Saddler, B., Asaro-Saddler, K., Moeyaert, M., & Cuccio-Slichko, J. (2019). Teaching summary writing to students with learning disabilities via strategy instruction. Reading & Writing Quarterly, 35(6), 572–586. https://doi.org/10.1080/10573569.2019.1600085
  • Satake, E., Maxwell, D. L., & Jagaroo, V. (2008). Handbook of statistical methods: Single subject design. Plural Publishing.
  • Scheurich, J. A., Beidel, D. C., & Vanryckeghem, M. (2019). Exposure therapy for social anxiety disorder in people who stutter: An exploratory multiple baseline design. Journal of Fluency Disorders: Official Journal of the International Fluency Association, 59, 21–32. https://doi.org/10.1016/j.jfludis.2018.12.001
  • Sepers, A. J. W., van der Werff, V., de Roos, C., Mooren, T., & Maric, M. (2019). Increasing family safety and decreasing parental stress and child’s social-emotional problems with resolutions approach: A single-case experimental design study protocol. Journal of Family Violence, 35(5), 527–536. https://doi.org/10.1007/s10896-019-00057-z
  • Smith, J. A., & Osborn, M. (2015). Interpretative phenomenological analysis as a useful methodology for research on the lived experience of pain. British Journal of Pain, 9(1), 41–42. https://doi.org/10.1177/2049463714541642
  • Spence, C., Kellett, S., Totterdell, P., & Parry, G. (2019). Can cognitive analytic therapy treat hoarding disorder? An adjudicated hermeneutic single-case efficacy design evaluation. Clinical Psychology & Psychotherapy, 26(6), 673–683. https://doi.org/10.1002/cpp.2390
  • Spreen, M., Timmerman, M. E., Ter Horst, P., & Schuringa, E. (2010). Formalizing clinical decisions in individual treatments: Some first steps. Journal of Forensic Psychology Practice, 10(4), 285–299. https://doi.org/10.1080/15228932.2010.481233
  • Tate, R. L., Perdices, M., Rosenkoetter, U., Shadish, W., Vohra, S., Barlow, D. H., Horner, R., Kazdin, A., Kratochwill, T., McDonald, S., Sampson, M., Shamseer, L., Togher, L., Albin, R., Backman, C., Douglas, J., Evans, J. J., Gast, D., Manolov, R. … Wilson, B. (2016). The Single-Case Reporting Guideline in Behavioural Interventions (SCRIBE) 2016 Statement. Physical Therapy, 96(7), e1–10. https://doi.org/10.2522/ptj.2016.96.7.e1
  • Thigpen, C. (2019). Measurement validity of retrospective survey questions of bicycling use, attitude, and skill. Transportation Research Part F: Traffic Psychology and Behaviour, 60, 453–461. https://doi.org/10.1016/j.trf.2018.11.002
  • Thomas, D. R. (2017). Feedback from research participants: Are member checks useful in qualitative research? Qualitative Research in Psychology, 14(1), 23–41. https://doi.org/10.1080/14780887.2016.1219435
  • Tsao, L. (2019). Brothers as playmates for their siblings with developmental disabilities: A multiple-baseline design study. Child & Youth Care Forum, 49(3), 409–430. https://doi.org/10.1007/s10566-019-09534-4
  • Turner, M. J., & Davis, H. S. (2019). Exploring the effects of rational emotive behavior therapy on the irrational beliefs and self-determined motivation of triathletes. Journal of Applied Sport Psychology, 31(3), 253–272. https://doi.org/10.1080/10413200.2018.1446472
  • Van der Zwet, R. J. M., Kolmer, D. M., Schalk, R., & Van Regenmortel, M. R. F. (2020). Implementing evidence-based practice in a Dutch social work organisation: A shared responsibility. British Journal of Social Work, 50(7), 2212–2232. https://doi.org/10.1093/bjsw/bcz125
  • Wampold, B. E. (2015). How important are the common factors in psychotherapy? An update. World Psychiatry, 14(3), 270–277. https://doi.org/10.1002/wps.20238
  • Wen, X., Eiden, R. D., Justicia-Linde, F., Wang, Y., Higgins, S. T., Thor, N., Haghdel, A., Peters, A. R., & Epstein, L. H. (2019). A multicomponent behavioral intervention for smoking cessation during pregnancy: A nonconcurrent multiple-baseline design. Translational Behavioral Medicine, 9(2), 308–318. https://doi.org/10.1093/tbm/iby027
  • Withiel, T. D., Stolwyk, R. J., Ponsford, J. L., Cadilhac, D. A., & Wong, D. (2019). Effectiveness of a manualised group training intervention for memory dysfunction following stroke: A series of single case studies. Disability and Rehabilitation, 42(21), 3033–3042. https://doi.org/10.1080/09638288.2019.1579260
  • Zimmerman, K. N., Ledford, J. R., Gagnon, K. L., & Martin, J. L. (2020). Social stories and visual supports interventions for students at risk for emotional and behavioral disorders. Behavioral Disorders, 45(4), 207–223. https://doi.org/10.1177/0198742919874050

Appendix A

Table A1. Characteristics of SC(Q)ED-Studies in 2019 in the Social Domain.