Abstract
In 2020, the What Works Clearinghouse (WWC) released an updated version of its standards and procedures handbook (Version 4.1) for single-case experimental design (SCED) studies. These updates raise the question of what the changes mean for the field in terms of quality rating of the design and the subsequent synthesis of evidence at the study and meta-analytic levels. This study compares the previous SCED design and evidence standards with the updated standards published in 2020 as Version 4.1. We examine whether Version 4.1 results in differences in (1) quality rating of the design and (2) analysis and meta-analysis of research evidence. The results indicate no differences in quality rating of the design, but notable differences in how evidence is analyzed, synthesized, and reported. This is further illustrated using a selected publication on Pivotal Response Training, in which research evidence on the effectiveness of Pivotal Response Training as an intervention to increase communication in children (aged 2–18) with autism is meta-analyzed. Based on the findings, recommendations and implications are discussed.
Acknowledgement
The authors sincerely acknowledge Daniel Swan for providing initial feedback on this manuscript.
Disclosure Statement
The authors report no conflicts of interest and are solely responsible for the content and writing of this commentary.
Supplementary Material
Supplemental data for this article can be accessed online at https://doi.org/10.1080/17489539.2022.2139148.