EBP Advancement Corner

Is there a science to facilitate implementation of evidence-based practices and programs?

Abstract

Funding agencies, such as the National Institutes of Health and the Institute of Education Sciences, have recognized the need to improve the uptake of empirically supported practices into everyday service delivery in real-world settings. Implementation Science is a new discipline that seeks to remedy this problem through careful investigation using traditional and new research methods. We begin this Special Issue on Implementation Science with some of the definitions offered to help us understand the distinctions among the types of research that are fundamental to Implementation Science. This introductory article discusses some of the traditional assumptions that have been called into question by this new discipline. We then outline highlights of the four subsequent articles in this issue. The final section of this paper summarizes some of the ideas that are likely to shape the future of research that promises to advance evidence-based communication intervention practices.

This special issue on Implementation Science as it relates to communication assessment and intervention reflects growing awareness of the challenges in gaining acceptance and achieving widespread implementation of high-quality services and programs in real-life contexts. This preface seeks to highlight needs that exist and the changes we foresee for addressing those needs from two perspectives: those of the authors who have contributed to this special issue and the discipline of Implementation Science more generally.

Despite the growing popularity of terms like dissemination and implementation (D & I), the terminology remains murky. Implementation Science is defined as a field of study that investigates methods to promote the integration of research findings and evidence into policy and practice in systems such as health care or educational organizations. Implementation Science is meant to subsume dissemination and implementation research. Glasgow et al. (2012) offer definitions from an NIH perspective.

Dissemination research is the scientific study of targeted distribution of information and intervention materials to a specific public health or clinical practice audience. The intent is to spread knowledge and the associated evidence-based interventions. The active process of dissemination is distinguished from the more passive process of “naturalistic” diffusion that occurs without concerted promotion.

Implementation research is the scientific study of the use of strategies to adopt and integrate evidence-based interventions into clinical and community settings to improve patient outcomes and benefit population health.

Effectiveness research is similar to dissemination and implementation research in its emphasis on adaptation and testing in real-world settings and with diverse populations, but it does not explicitly focus on understanding the spread and adoption of these intervention strategies.

As Glasgow et al. (2012) point out, these types of research are much stronger in tandem than in isolation. In fact, we would argue that even efficacy research, which is not necessarily conducted in real-world settings, would be strengthened by anticipating and considering issues that relate to dissemination and implementation processes.

Traditional assumptions

Some may question whether there is a science of implementation. It is safe to say that it is a fledgling science, one that arguably got its start at the beginning of the twenty-first century, although many researchers wrote about related topics long before then (e.g. Rogers, 1962; Wolf, 1978). The journal Implementation Science began publishing in 2005. The seminal review of implementation research published by Fixsen, Naoom, Blase, and Friedman (2005) consists largely of research published since 2000. The advent of Implementation Science seems to stem from the questioning of a number of traditional assumptions in the scientific community.

First, one could argue that Implementation Science developed as a response to the failed assumption that the traditional research pipeline would lead to implementation of evidence-based practices. The assumption was that basic research would generate a knowledge base that would lead to clinical or educational research focusing on efficacy and eventually effectiveness, and that this research would lead to changes in clinical and community practice. Thus, we would expect evidence-based practices to have their genesis in research that might not be focused on application, but that would ultimately be manifested in improved health and educational outcomes. Various models of how this might come about, and of what types of research are suited to different steps in this process, have long been debated (e.g. Goldstein, 1990; Robey, 2004; Stokes, 1997). It is fair to say that this traditional research pipeline is exceedingly long, and perhaps too leaky, to produce solutions to our clients’ communication challenges and our clinicians’ practice needs in a timely manner (Green, Ottoson, García, & Hiatt, 2009). Schliep, Alonzo, and Morris (2017) make this point, noting “that after 17 years, only 14% of healthcare research was adopted into day-to-day clinical practice following the traditional clinical research pipeline (Balas & Boren, 2000).”

A second assumption is that evidence-based practices exist and that our shortcoming is the failure to get them implemented. Two fundamental challenges disrupt this assumption. The first is that calls for expanding the production of efficacy research in the discipline of Communication Sciences and Disorders predate the call for implementation research by only a few decades (e.g. Goldstein, 1990). One concern expressed by scientists is that a focus on Implementation Science may detract from the need to produce much-needed, high-quality efficacy research in the field. The second challenge is that evidence-based practices are not static. We should expect evidence and evidence-based practices to evolve and be refined with further research and implementation. We need to acknowledge that a healthy system of service delivery must accept and advocate for a dynamic, constantly changing discipline. Evidence-based practices and their implementation will not proceed one practice or test at a time. If procedures and policies for addressing communication disorders are to improve continually and with the urgency needed, a growing effort to produce efficacy research must be coupled with dissemination and implementation research.

The third assumption is that robust interventions will change the behavior of the vast majority of individuals with communication disorders for whom the interventions are designed, as well as of those responsible for implementing those practices. This assumes more homogeneity among consumers (be they clients, service providers, administrators, policy-makers, or other stakeholders) than is warranted. We need to understand the conditions under which interventions are effective. For example, scientific studies may show that interventions apply to more clients than originally expected; likewise, we may learn that individual participant characteristics indicate with whom interventions are less effective. Investigators also may learn that a host of contextual factors influence when, and under what conditions, robust interventions are effective or ineffective. Thus, adaptations to evidence-based practices may need to respond to differences in clients, settings, tasks, and dosage, as well as other refinements. Adaptations in practices may compete with the need to implement with fidelity. Scientific study is needed to sort out which variations are most effective. Overall, this third assumption seems to ignore the state of our clinical science as well as of Implementation Science.

Fourth, we assume that evidence-based practices, once well implemented, will enjoy sustainability because the success of clients will reinforce the continued implementation of those practices. The literature on professional development has long recognized that clinicians’ knowledge of effective interventions is not sufficient. Modeling, scaffolded practice, and/or feedback are typically needed to prepare clinicians to implement with appropriate fidelity; even then, their organizations must support changes in procedures and policies within their service delivery systems for changes to be sustained. The assumption of sustainability ignores the many organizational factors and external contingencies that can impinge on practices and policies in real-world settings.

These are among the issues that motivated the articles contributed to this special issue on Implementation Science. Each of the articles previewed below offers perspectives on ways to move the field forward as we strive to develop, evaluate, and implement evidence-based practices in communication assessment and intervention.

Preview of articles

Olswang and Goldstein (2017) discuss the benefits and challenges of establishing researcher–stakeholder collaborations. They argue that such collaborations will have benefits for advancing science as well as practice. Stakeholders help ensure that the research undertaken has relevance to practical problems. In turn, researchers who are seeking to solve practical problems can capitalize on their efforts to help us understand mechanisms underlying behavior change. This is consistent with what Stokes (1997) refers to as “use-inspired basic research.” This notion of generating fundamental understanding while addressing practical problems applies as well to advancing the science of implementation, as we seek to identify factors that facilitate and impede implementation, acceptance, and maintenance. Olswang and Goldstein highlight the potential of partnerships that include scientific clinicians and clinical scientists who together can engender a stimulating environment of inquisitiveness and practicality.

Campbell and Douglas (2017) discuss the challenges of getting practitioners to adopt and implement evidence-based practices. They argue that Implementation Science can inform those processes, especially if interactive approaches are used to overcome the ineffectiveness of passive implementation strategies, such as didactic educational sessions. Interactive implementation strategies found to be effective include audit and feedback, educational outreach, and reminders. Campbell and Douglas draw upon the Cochrane Effective Practice and Organisation of Care Group’s reviews of the literature. They highlight 19 implementation strategies characterized as involving (a) evaluating and monitoring the quality of services, (b) educating clinicians or patients, and (c) planning or preparing for practice change. How best to conceptualize these strategies and package them in effective ways to optimize implementation efforts represents a future research need for Implementation Science. A key feature of these interactive approaches is that they seek to establish new expectations within organizational systems through information sharing, prompting, and feedback that reflect new “norms” for sustaining effective practices in real-world settings.

Schliep et al. (2017) acquaint readers with innovative research designs and methods that represent advances in implementation research. Pragmatic designs, hybrid designs, qualitative methods, and mixed methods go beyond the standard designs used to study efficacy and effectiveness, such as randomized controlled trials. These research approaches hold promise in helping us develop an understanding of implementation processes that address barriers and facilitators to adoption by providers and organizational systems. The authors discuss many of the variations and other considerations of these designs and methods in the context of the RE-AIM Implementation Science framework (i.e. Reach, Effectiveness, Adoption, Implementation, and Maintenance). Schliep et al. provide illustrations of these variations using hypothetical cases: implementation of a swallowing screening protocol in a healthcare setting and of a literacy intervention in an educational setting.

The final paper, by Kincaid and Horner (2017), draws upon their extensive experience in scaling up an evidence-based educational intervention (i.e. school-wide Positive Behavior Interventions and Supports, or PBIS) in over 23,000 schools across the U.S. They highlight the importance of Implementation Science for scaling up PBIS and offer lessons learned that might be applied to scaling up other innovative communication assessment and intervention approaches. Over the past few decades they have learned that sustaining significant educational outcomes takes more than effective innovations and effective implementation with high fidelity; it also requires enabling contexts that predict where PBIS will succeed. Leadership teams need to be established to manage systems issues (e.g. funding, political realities, dissemination, policy alignment) as well as practical issues (e.g. professional development, evaluation and performance feedback, and content expertise). Perhaps the most illuminating lesson for Implementation Science is their questioning of the assumption of a “tipping point,” at which social systems change is sustained through momentum. Kincaid and Horner argue that the complexities of educational systems at the school, district, and state levels call for constant vigilance to adapt to their ever-changing nature, because numerous internal and external factors can threaten implementation and sustainability even in the face of compelling outcome data.

Acknowledging realities of implementation

The evidence-based practice movements within health care and education have been responding, at least in part, to the call for greater accountability and cost-effective services. Perhaps this motivation has resulted in an overemphasis on evaluating existing intervention protocols, which may be shortchanging the scientific process. Scientific progress might best be served through a process of iterative development and evaluation of communication interventions to maximize their effects. From this perspective, one could argue that too many resources have gone into prematurely using randomized controlled trials to compare interventions that have not been adequately developed and refined through careful research.

A recognition that evidence and evidence-based practices are not static phenomena is needed to advance our science. Perhaps researcher–stakeholder collaborations will inform the process of improving practices and developing new, more effective practices. This evolution of evidence-based practices will require new implementation efforts. Better alternatives or findings that dispute the efficacy of existing practices should also result in de-implementation of less effective or ineffective practices. Researchers and clinicians alike should embrace and accept a dynamic process of continuous discovery that is empirically based. Yet, this special issue helps us understand that knowledge of evidence-based practices is not sufficient.

The authors of articles in this issue stress the need to focus on how we get clinicians and the agencies responsible for service delivery to implement evidence-based practices. New skill sets are needed if we are to mobilize enlightened researchers and stakeholders who can nurture productive partnerships and are equipped to navigate the complex steps in the implementation process. For example, incorporating the pragmatic and hybrid designs and expanding our use of the mixed methods discussed by Schliep et al. represent new paradigms for advancing knowledge generation for efficacy and implementation concurrently. Incorporating research on implementation as early as possible into the intervention development and evaluation process holds tremendous promise. Researcher–stakeholder partnerships (as discussed by Olswang and Goldstein) and approaches to identifying successful implementation strategies used in combination (as discussed by Campbell and Douglas) will change behavior and organizations’ expectations.

We must recognize that there is a great deal of heterogeneity in all these systems at a local level. However, the scale-up of evidence-based practices is not restricted to small-scale ventures. Indeed, our intention is for highly effective and feasible practices to be implemented in healthcare and education systems in ways that meet communication needs across the globe. It is difficult to know how this process will proceed, especially if Kincaid and Horner are right in arguing that there is “no tipping point” when it comes to implementation in complex service delivery systems. We live in a market economy in which there is competition and promotion of alternative interventions even when existing programs are effective. Implementation Science may have to address how to keep evidence-based practices fresh and marketable from a dissemination and implementation standpoint. Notably, sustainability should not be taken for granted.

Taken collectively, the articles in this special issue present alternatives to the traditional research pipeline. The authors offer alternative approaches that are meant to overcome the persistently long timeline for research to be assimilated into practice. We posit that researchers who consider implementation during intervention development will advance science and narrow the research to practice gap more quickly. We hope that this special issue will provide new insights into the future of evidence-based communication assessments and interventions. These articles make us aware of new complexities in conducting research that must be addressed for researchers to impact the discipline in significant ways. This issue also reveals new opportunities for forming exciting collaborations and partnerships that have the potential to make our research better. It is our hope that readers will find new inspiration to advance science and ensure that science contributes to solving important societal problems more efficiently.

Declaration of interest: No potential conflict of interest was reported by the authors.

Notes from the Editors

We are grateful to Drs. Howard Goldstein and Lesley Olswang, who served as Guest Editors for this Special Issue on Implementation Science. They have recruited an excellent group of authors and have brought to bear their outstanding scholarly and editing skills so that our readership can benefit from these contributions in their current form. We would also like to express our appreciation to each of the authors who contributed to the special issue.

Notes

This work was supported in part by a Research Partnership grant from the Institute of Education Sciences, U.S. Department of Education [grant number R305H160034] to the University of South Florida. The opinions expressed are those of the authors and do not represent the views of the Institute or the U.S. Department of Education.

References

  • Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care improvement. Yearbook of medical informatics. Stuttgart, Germany: Schattauer Verlagsgesellschaft mbH.
  • Campbell, W. N., & Douglas, N. F. (2017). Supporting evidence-based practice in speech-language pathology: A review of implementation strategies for promoting health professional behavior change. Evidence-Based Communication Assessment and Intervention, 11(3–4), 1–10.
  • Fixsen, D. L., Naoom, S. F., Blase, K. A., & Friedman, R. M. (2005). Implementation research: A synthesis of the literature (FMHI Publication #231). Tampa: University of South Florida, The National Implementation Research Network.
  • Glasgow, R. E., Vinson, C., Chambers, D., Khoury, M. J., Kaplan, R. M., & Hunter, C. (2012). National Institutes of Health approaches to dissemination and implementation science: Current and future directions. American Journal of Public Health, 102(7), 1274–1281. doi:10.2105/AJPH.2012.300755
  • Goldstein, H. (1990). The future of language science: A plea for language intervention research. ASHA Reports, 20, 41–50.
  • Green, L., Ottoson, J., García, C., & Hiatt, R. (2009). Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annual Review of Public Health, 30, 151–174. doi:10.1146/annurev.publhealth.031308.100049
  • Kincaid, D., & Horner, R. H. (2017). Changing systems to scale up an evidence-based educational intervention. Evidence-Based Communication Assessment and Intervention, 11(3–4).
  • Olswang, L. B., & Goldstein, H. (2017). Collaborating on the development and implementation of evidence-based practices: Advancing science and practice. Evidence-Based Communication Assessment and Intervention, 11(3–4).
  • Robey, R. R. (2004). A five-phase model for clinical-outcome research. Journal of Communication Disorders, 37(5), 401–411. doi:10.1016/j.jcomdis.2004.04.003
  • Rogers, E. (1962). Diffusion of innovations. New York, NY: Free Press of Glencoe.
  • Schliep, M. E., Alonzo, C. N., & Morris, M. A. (2017). Beyond RCTs: Innovation in research design and methods to advance implementation science. Evidence-Based Communication Assessment and Intervention, 11(3–4).
  • Stokes, D. E. (1997). Pasteur’s quadrant—Basic science and technological innovation. Washington, DC: Brookings Institution.
  • Wolf, M. (1978). Social validity: The case for subjective measurement. Journal of Applied Behavior Analysis, 11(2), 203–214. doi:10.1901/jaba.1978.11-203
