Research Article

Reflections on project ECHO: qualitative findings from five different ECHO programs

Article: 1936435 | Received 26 Mar 2021, Accepted 26 May 2021, Published online: 02 Jun 2021

ABSTRACT

Project ECHO (Extension for Community Healthcare Outcomes) was developed in 2003 as an innovative model to facilitate continuing education and professional development. ECHO emphasizes ‘moving knowledge, not people.’ To accomplish this, ECHO programs use virtual collaboration and case-based learning to allow practitioners, including those in rural and underserved areas, to receive specialist training. The ECHO model has expanded rapidly and is now used in 44 countries. Preliminary research on ECHO’s efficacy and effectiveness has shown promising results, but evidence remains limited and appropriate research outcomes have not been clearly defined. To improve the evidence basis for ECHO, this study of 5 ECHO programs (cancer prevention/survivorship, integrated pain management, hepatitis C, HIV, and LGBTQ+ health care) elucidated actionable insights about the ECHO programs and directions in which future evaluations and research might progress. This was a qualitative study following COREQ standards. A trained interviewer conducted 10 interviews and 5 focus groups with 25 unique, purposively sampled ECHO attendees (2 interviews and 1 focus group for each of the 5 programs). Data were transcribed verbatim and analyzed using the general inductive approach, then reviewed for reliability. We identified four major categories (reasons to join ECHO, value of participating in ECHO, ways to improve ECHO, and barriers to participation) composed of 23 primary codes. We suggest that thematic saturation was achieved, and a coherent narrative about ECHO emerged for discussion. Participants frequently indicated they received valuable learning experiences and thereby changed their practice; rigorous trials of learning and patient-level outcomes are warranted. This study also found support for the idea that the ECHO model should be studied for its role in convening communities of practice and reducing provider isolation as an outcome in itself. Additional implications, including for interprofessional education and model evolution, were also identified and discussed.

Introduction

Project ECHO (Extension for Community Healthcare Outcomes) was developed at the University of New Mexico by what is now the ECHO Institute[Citation1]. Though originally couched as a ‘disruptive innovation’ in medical education[Citation2], ECHO is on the way to becoming ubiquitous. ECHO links specialists and other experts with non-specialist practitioners, including those in rural and underserved areas[Citation3], using Zoom (Zoom Video Communications, Inc., San Jose, CA) or similar teleconferencing technology. The ECHO model is facilitated by ECHO Hubs, which are sites outside the ECHO Institute that utilize the model, must be approved and receive training from the Institute, and have to adhere to multiple procedural guidelines[Citation4]. Hubs offer programs or tracks (vernacular differs by Hub) that provide telementoring and case-based learning to support continuing education, specialty training, and transfer of knowledge and skills needed to resolve complex clinical conditions[Citation5]. A typical program might offer 60–90 minute sessions that include both a brief didactic learning presentation and collaborative review of 1 or more participant-submitted cases[Citation6]. The case-based learning approach provides an exemplar of shared problem-solving and allows facilitators to ‘understand participants’ clinical roles and [call] on [them] to share their expertise and experiences.’[Citation7]

Enthusiasm for the ECHO model has grown substantially; as of March 2021, there were 920 ECHO programs in 44 countries[Citation8]. Evaluations of early program adoptions beyond the ECHO Institute suggested high levels of promise and implementation feasibility [Citation9–11]. ECHO’s digital approach and standardized format were particularly well suited to the COVID-19 pandemic, during which the ECHO Institute was able to rapidly create 10 program tracks that included infectious diseases, critical care, and education[Citation12].

To some extent, the evidence basis for ECHO is still catching up with the rapid proliferation of programs. In a 2016 systematic review of ECHO[Citation13], most studies reported outcome data from levels 1 through 4 of Moore’s evaluation framework[Citation14], often using surveys to examine participation levels, provider satisfaction, changes in knowledge, and competence. Some studies also captured objective knowledge (multiple choice questions) or used interviews to assess self-confidence. Unrelated to Moore’s framework, some studies also examined motivators and barriers to participating in ECHO[Citation13]. A 2019 systematic review produced similar findings, which emphasized the importance of continuing to study ECHO[Citation15]. Research conducted since the latter review has included a randomized trial of ECHO for the care of patients with autism, which reported mixed findings on learning outcomes[Citation16], and a study suggesting that hepatitis C cure rates for non-specialists attending ECHO were not inferior to rates for specialists[Citation17].

ECHO is a complex education innovation, so it is reasonable that studies have focused on disparate outcomes and topics and have used different approaches. In 2020, an expert panel identified challenges to building the evidence basis for Project ECHO and other ECHO-like models (EELM), including the need to ‘develop a clear understanding of EELM, what they are intended to accomplish, and the critical components of EELM that are necessary to meet their goals’ as well as ‘reporting on a broader set of EELM program characteristics.’[Citation18]

Continued study of the ECHO model is warranted in multiple domains. While distal (patient-level) outcomes are important to study, research on ECHO qua medical education must clarify the outcomes that Hubs and participants hope to obtain from ECHO and the granular components that support programmatic efficacy. Multi-stage mixed methods approaches are likely to be useful at this stage; qualitative data can be used intentionally to develop and provide a framework for quantitative research[Citation19]. Analyses of qualitative findings can then be used to inform the kinds of questions that can be asked, and the hypotheses that should be tested[Citation20], as well as providing additional actionable information.

Thus far, the preponderance of qualitative ECHO studies has focused on single programs (e.g. pain management). However, our ECHO Hub used a uniform approach to collect data from five different ECHO projects in 2020 and early 2021. Those programs were offered by the IUPUI ECHO Center at the Richard M. Fairbanks School of Public Health at Indiana University-Purdue University Indianapolis (IUPUI). Each of the programs had a different expert panel but shared key staff members, such as the project managers and the evaluator. The ECHO programs independently focused on cancer prevention/survivorship, integrated pain management, hepatitis C (HCV), human immunodeficiency virus (HIV), and health care for members of the LGBTQ+ community. By using a standardized approach to collect data from ECHO programs focused on disparate topics, we aimed to elucidate both actionable insights about the ECHO programs and directions in which future evaluations and research might reasonably progress. In doing so, we investigated several broad questions: (1a) How do ECHO attendees perceive ECHO, including general, positive, and negative perceptions? (1b) How would ECHO attendees propose to change the program to correct perceived deficits? (2) Have ECHO attendees changed their professional practice since participating in ECHO, and if so, how? And finally, (3) How would an ‘ideal’ ECHO program appear and function?

Methods

Study methods are reported according to the Consolidated Criteria for Reporting Qualitative Studies (COREQ)[Citation21].

Research team and reflexivity

Interviews and focus groups were conducted by JD (DPT, MSW, MHS, NBC-HWC), a training specialist and research associate at Prevention Insights in the Indiana University School of Public Health Bloomington. JD has extensive experience conducting interviews and focus groups and is also a member of the Motivational Interviewing Network of Trainers (MINT)[Citation22]. JD did not have relationships with the participants/interviewees but was a participant in the ECHO program, and so was familiar to some respondents. They introduced themself and their role at the outset of each instance of data collection. They were selected to conduct data collection due to their experience with the ECHO program and conversational expertise through MINT.

Study design

Participants and data collection

Participants were identified purposively by one author (AJ, Director of the IUPUI ECHO Center), separately by type of ECHO. Though purposive sampling typically is used to gather data from hard-to-reach populations[Citation23], here the purpose was to ensure that invitees were sufficiently active in their ECHO program that they could provide in-depth feedback. Once participants had been identified, they were recruited to participate in either a focus group or an individual interview by another author (JD) using e-mail.

The decision to use both focus groups and individual interviews was made prior to the initiation of the study. This decision was both philosophical, as data gathered in the context of participant interaction conceptually differs from data gathered 1:1[Citation24], and pragmatic, since a large, randomized comparison of the two approaches found that interviews may produce a wider range of categories but focus groups may elicit more sensitive themes[Citation25]. As such, recruitment into interviews or focus groups was not differentiated and was based on scheduling and availability of participants.

A total of 15 data collection instances were completed (2 interviews and 1 small focus group for each of 5 ECHO programs). A total of 25 unique individuals provided data for the project; an additional 3 appeared to agree to participate but did not attend. Data collection occurred from May 2020 through February 2021; all recruitment was digital, and all conversations occurred over Zoom. Data collection was scheduled for each program to correspond with the end of the annual ‘program cycle.’

Topics and prompts used in the semi-structured guide were developed a priori by the study authors based on their experience with running ECHO programs since 2018. JD authored the first drafts, JA and AJ revised them, and then the team reviewed them several times for clarity and consistency (see Supplement 1). Partway through the project, several additional prompts were added to the guide to support quality improvement (noted in Supplement 1); however, qualitative review indicated that these points were typically addressed by participants without needing a prompt. The same guide was used for focus groups and interviews.

Theoretical assumptions

The research team used the general inductive approach [Citation26] to develop categories and codes. In contrast to approaches using deductive analysis, which would focus on exploring predefined theories or frameworks, the general inductive approach emphasizes ‘allow[ing] findings to arise directly from analysis of raw data’ in a way that is ‘relevant to evaluation or research objectives.’[Citation26] Procedurally, the approach included ‘preparation of raw data files,’ ‘close reading of the text,’ ‘creation of categories from multiple readings,’ and ‘continuing refinement.’[Citation26] It also bore similarity to, but was not fully consistent with, inductive constant comparison analysis[Citation27]. Finally, the modification of the question set partway through the study in response to emergent themes was consistent with grounded theory, but this was not a grounded theory analysis[Citation28].

Transcription and saturation

All conversations were videorecorded (with permission) and were scheduled for 60 minutes, with some variability in actual length. All data were professionally transcribed by a vendor. Data saturation was not discussed at the outset of the project due to the nature of the original purpose of data collection. However, in inductive qualitative research, the standard for saturation typically is ‘theoretical saturation,’ meaning that analyzing additional data does not result in the identification of new categories or codes[Citation29]. For this study, no additional categories or codes were identified while analyzing the final three transcripts out of 15. Thus, inductive thematic saturation might be inferred[Citation29].

Data analysis

All data from interviews and focus groups were coded in aggregate by JA. This was an inductive study focused on broad elucidation of ideas, so all relevant categories and codes that were identified were included in the emergent codebook. Iterative coding included a review for themes by TJ (three data sets) and JR (all data sets), and a final concordance review by AJ (three data sets). These secondary reviews included two individuals (TJ and JR) who were familiar with but not stakeholders in ECHO, and one individual (AJ) who was both a stakeholder and member of the ECHO program, providing a triangulated assessment of analytic credibility[Citation26]. The final codebook consisted of four categories comprising 23 primary codes (see Table 1) that were developed during analysis. Consistent with the general inductive approach, instances of text were not required to belong exclusively to one code[Citation30]. Rather, the codes and the categories were reflective of the overall concepts identified in the data that were collected. However, for clarity, specific statements that exemplify each of the codes within each category are provided in Table 2.

Table 1. Codebook and definitions

Table 2. Exemplar quotes

Results

Motivations to join ECHO

General continuing education (CE)

Participants most often indicated that they sought out one or more ECHO programs because they wanted to pursue continuing education. They thought that ‘the format was convenient and low stakes’ (HIV Focus Group) and expressed ‘interest … to pick up on ancillary support’ (IPM Focus Group), meaning areas outside of their current specialty.

Help for rural/remote providers

Other participants, especially those attending the HIV and HCV programs, noted that access to training and resources was limited in rural and remote areas, contributing to low rates of treatment, and that ECHO was a convenient way to begin providing that access.

Intention to develop networking

A few providers also specifically sought out ECHO to build their practice networks (e.g. ‘working with care coordinators and providers from all different corners of the state’ [HIV Focus Group]).

Need for CE credits

Others identified free continuing education credits as a motivator to explore ECHO.

Value of ECHO

Networking

Many participants indicated that the networking afforded by ECHO was valuable. For some participants, it was the fact that ‘I can reach out to the ECHO … ’ (LGBTQ+ Interview 2), whereas others saw ECHO as ‘kind of the 21st century version of the old doctor’s lounge’ (HIV Focus Group). In certain cases, the networking was described as extending beyond the participant to colleagues who did not attend ECHO. Such non-attenders were described as reaching out to an ECHO attendee to obtain information, with the content spreading more generally beyond the group itself (‘Even though the person wasn’t on ECHO themselves, those tools are getting shared and spread … ’ [IPM Focus Group]).

Met a technical, legal, or CE requirement

Some participants appreciated that CE was available for ‘the different professions each time’ (HIV Focus Group) or that it counted, in some cases, as professional supervision.

Structure of ECHO

ECHO Hubs are trained by the ECHO Institute, and so programs have a similar ebb and flow. Some participants ‘like that rhythm and the outline’ (HIV Interview 2) and felt that the whole program setup was ‘well organized [with a] casual and yet professional, friendly milieu’ (LGBTQ+ Interview 2).

Information from didactics

Although ECHO is a multifaceted program, participants noted that the lecture components of the program were valuable, and that ‘the variety of presenters and perspectives … [was] very helpful’ (Cancer Interview 1).

Being able to present/address difficult cases

Participants expressed that ‘[they] have been able to present some of [their] clients and get really good feedback’ (HIV Interview 1) and that ECHO is great support because ‘for difficult cases’ it can be a ‘hand to hold’ (HCV Interview 1).

Interprofessional nature of the program and lack of hierarchy

A substantial portion of the discussion about ECHO’s value revolved around the wide variety of disciplines and perspectives involved. Participants felt that ‘it’s wonderful because you get interaction with all these other people … not only just the other doctors because [they] all kind of are taught from the same book’ (Cancer Interview 2). This was seen as leveraging ‘different skill sets’ (IPM Interview 2) and enabling practitioners to understand the ‘hurdles [others] face’ (HCV Focus Group). In many cases, these comments extended specifically to the lack of hierarchy in ECHO programs, and the sense that ‘everybody’s comments are valued’ (Cancer Interview 1).

Access to expert opinion

Some comments noted the value of having highly skilled participants attend the ECHO or serve as experts. This included both practical outcomes, such as identifying when a client was taking contraindicated medications (HIV Interview 1), and emphasis on best practices: ‘Well, nobody’s talking about 2018. We’re talking about July 2020 … it’s the most up-to-date’ (Cancer Focus Group).

Changes to professional practice

Many respondents indicated either that their personal practice had changed or improved or that their overall practice environment had changed because of ECHO, such as ‘doing [their] own case presentations or case conferences’ (HIV Focus Group).

Ways ECHO can improve

Facilitate networking

There was some interest in having ECHO programs facilitate networking by publishing or sharing attendee information with other attendees.

Structural changes

There were four different sub-codes related to structural changes that could potentially be made to the ECHO program. These included:

(i) Record didactic presentations. There was a high level of interest in accessing recorded didactics for reference. While recordings were always passively available for participants, this feedback led the Hub to directly disseminate recordings via a follow-up communication after each ECHO and select a new cloud service to distribute this resource.

(ii) Modify didactic presentations. Some participants were interested in longer didactic sessions on occasion, perhaps to ‘go in depth about what we can do to help our clients’ (HIV Interview 1), because ‘even if it’s a very specific topic, 20 minutes [for a didactic session] … is not enough time’ (LGBTQ+ Focus Group).

(iii) Allow more time for case management. Conversely, other participants felt that ‘cases get cut off at the end because the didactics can sometimes go … over’ (HIV Interview 1) and hoped for more time, while acknowledging that ‘everyone’s busy … ’ (HCV Focus Group).

(iv) Reconceptualize or minimize introductions. A few participants wanted the introductions to be configured to take less time. Jokingly, one person noted, ‘as we added more and more people, by the time we get done with introductions, [the ECHO] is going to be over’ (LGBTQ+ Interview 2).

Expanding the ECHO community

We observed a high volume of conversation about ways to extend ECHO outside of the sessions themselves. Typically, this included shared workspaces or message boards, but also broader concepts like ‘local ECHOs’ paired with ‘larger ECHOs’ (LGBTQ+ Focus Group).

Consider braiding telemedicine with ECHO

Although it was generally seen as difficult to achieve, some participants wanted to see fewer barriers and distance between patients and the ECHO itself, including live, digital case management.

Work actively to facilitate follow-up

A few participants were interested in more intentional follow-up, both on case management and on understanding gained from the didactic sessions.

Barriers to participating

Time

Although some participants elsewhere indicated that certain components could be longer, many participants found it difficult to locate 60 or 90 minutes during a workday, especially when they had to justify the use of that time to an employer (HCV Interview 1).

Scheduling

Many participants indicated barriers related to the time of day or day of the week the programs were offered.

Discussion

Collecting and analyzing data from five different ECHO programs produced a coherent and saturated examination of the ECHO model, leading further toward the desired ‘clear understanding’ and enumeration of ‘necessary components’[Citation18] for study. We note several implications for future research and ideas that ECHO programs might consider exploring, though these do not exhaustively reflect the Results.

(1) Researchers should study one or more outcome measures directly related to the formation of a professional network of practice and learning as an important end in itself. This might also include measures related to job satisfaction or feelings of isolation.

Many practitioners joined and valued ECHO because of the networking that it offered; this was not just a matter of accessing expert opinion but was seen, sometimes, as a replacement for traditional gathering places and a means to reduce practitioner isolation. This finding mirrors qualitative ECHO studies emphasizing community of practice and practitioner isolation [Citation31–35]. It also reflects prior work on andragogical spaces for medical professional identity[Citation36]. Not only did study participants indicate the value of networking, but they also suggested different ways that ECHO could further facilitate such networking and build a community of practice and learning.

(2) Randomized trials of andragogical and patient-level outcomes are warranted, and the scope of outcomes in such studies potentially should be broadened.

Consistent with prior work, our findings suggest that ECHO participants are motivated by and receive quality continuing education [Citation31,Citation32,Citation34,Citation35,Citation37,Citation38] and report changes to their own practice [Citation31,Citation33,Citation35,Citation38]. Indeed, these outcomes often have been foci of the quantitative studies captured in systematic reviews [Citation13,Citation15], but the strength of evidence should be increased. Further, our data suggest that both learning and practice change may extend beyond attendees to second- and third-degree contacts. This has been noted previously, but infrequently, in other qualitative work [Citation32–34]. Studies exclusively focused on attendees and their patients/clients may underestimate ECHO’s aggregated impact on outcomes.

(3) ECHO Hubs should consider measuring perceptions of interprofessional collaboration and education (IPC/IPE) when appropriate.

Some ECHO programs clearly offer IPE [Citation39–41] and have explored benefits of IPE[Citation40]. Our study, somewhat uniquely, identified wide-ranging support for the interprofessional components of ECHO, not only for the purposes of IPE but also as a means of leveling practitioner hierarchy in the healthcare education space. Use of a standardized IPE tool [Citation42] to measure participant perceptions may prove informative.

(4) Consider and experiment with ways that barriers to access can be overcome without diluting the model.

Unsurprisingly [Citation32,Citation34,Citation37], time commitment and scheduling were identified as barriers to participating in ECHO; however, the programs in this study were offered for different lengths (60/90 minutes) and on different days and times of day, lending credence to one participant’s comment that it may just ‘be the nature of the beast’ (Cancer Interview 2). Multiple amendments to ECHO were suggested to address this, including recording didactics and minimizing time spent on introductions. However, no granular assessment of the ‘necessary’ components of the ECHO model exists, so the degree to which altering core elements of the model affects outcomes must be studied.

Strengths and limitations

This study combined qualitative data from five different ECHO programs, thereby reducing the impact of any single program’s idiosyncratic experience on the results. In addition, given the large amount of published ECHO-based research focused on the first four levels of Moore’s framework, this study explicitly was designed to be generative – suggesting new trajectories – rather than focused on learning outcomes and practice change, per se.

However, these data should not, in and of themselves, drive any specific programmatic decisions. Since active ECHO participants were purposively sampled, these data are less likely to reflect perspectives of infrequent attenders. At the same time, there is some preliminary evidence that providers who attend ECHO programs infrequently may primarily be limited by scheduling difficulties[Citation43]. It is also possible that the authors’ own biases unduly affected the results; this risk was reduced by not utilizing the interviewer as a coder or analyst. The incorporation of different ECHO programs meant a wider diversity in perspectives about ECHO as a model, but some broad concepts (e.g. IPE) may be less relevant to an ECHO program focused tightly on a single topic or profession.

Conclusions

Most research on Project ECHO has focused on identifying participant-level or, more rarely, patient-level outcomes from Moore’s evaluation framework. This study responds to calls to better understand ECHO by analyzing standardized qualitative data from five different ECHO programs with a focus on understanding the ECHO model itself. We suggest four new or modified areas for future research and exploration of this promising andragogical approach.

Data availability

Please contact the authors regarding availability of the raw data in text format only. Sharing any transcript will be possible pending complete de-identification of both the participants and the interviewer, as well as removal of other identifying details. For reasons of confidentiality, video recordings of interviews and focus groups will not be made available.

Ethical approval

This study was approved by the Indiana University Institutional Review Board (#10643).

Author contributions

JA, JD, AJ, and AC conceptualized and designed the study. JD collected the data. JA, AJ, JR, and TJ analyzed and interpreted the data. GM and AJ obtained funding for the study. JA produced the first draft of the paper, but all authors contributed to drafting and refinement. All authors approved the final version of the manuscript.

Disclosure

No potential conflict of interest was reported by the author(s).

Acknowledgments

We would like to thank Ms. TiAura Jackson for her work in reviewing qualitative coding for a set of interviews and a focus group. We also wish to acknowledge all ECHO expert panelists, participants, and facilitators who were not part of this specific study but without whom the Fairbanks School of Public Health could not offer their ECHO programs.

Supplemental Material

Supplemental data for this article can be accessed online.

Additional information

Funding

The IUPUI ECHO Center programs described in this report received funding or in-kind support from the Indiana University Grand Challenge: Responding to the Addictions Crisis; the Indiana Department of Health (specifically the Division of HIV, Viral Hepatitis and Harm Reduction; the Division of Trauma & Injury Prevention; and the Division of Chronic Disease, Primary Care, Rural Health); the Health Foundation of Greater Indianapolis; the Indiana Immunization Coalition; Eskenazi Health; IU Health; Riley Children’s Hospital; the American Cancer Society; the Indiana Cancer Consortium; and the Indiana Clinical and Translational Sciences Institute. The content is solely the responsibility of the authors and does not necessarily represent the official views of any of the listed organizations or programs.

References

  • Arora S, Geppert CMA, Kalishman S, et al. Academic health center management of chronic diseases through knowledge networks: project ECHO. Acad Med. 2007;82(2):154–159.
  • Arora S, Kalishman S, Thornton K, et al. Expanding access to hepatitis C virus treatment–Extension for Community Healthcare Outcomes (ECHO) project: disruptive innovation in specialty care. Hepatology. 2010;52(3):1124–1133.
  • Francis E, Kraschnewski J, Hogentogler R, et al. 4060 A telehealth approach to improving healthcare to rural and underserved populations. J Clin Transl Sci. 2020;4(S1):56.
  • UNM. Become an ECHO hub: run your own ECHO programs. University of New Mexico, ECHO Institute. 2021. Available from: https://hsc.unm.edu/echo/get-involved/start-a-hub/. Cited 2021 Mar 10.
  • Arora S, Kalishman SG, Thornton KA, et al. Project ECHO: a telementoring network model for continuing professional development. J Contin Educ Health Prof. 2017;37(4):239–244.
  • Agley J, Adams ZW, Hulvershorn LA. Extension for Community Healthcare Outcomes (ECHO) as a tool for continuing medical education on opioid use disorder and comorbidities. Addiction. 2019;114(3):573–574.
  • Doherty M, Rayala S, Evans E, et al. Using virtual learning to build pediatric palliative care capacity in South Asia: experiences of implementing a teleteaching and mentorship program (Project ECHO). JCO Glob Oncol. 2021;7:210–222.
  • UNM. ECHO impact and initiatives: touching one billion lives by 2025. University of New Mexico, ECHO Institute. 2021. Available from: https://hsc.unm.edu/echo/echos-impact/. Cited 2021 Mar 10.
  • Catic AG, Mattison MLP, Bakaev I, et al. ECHO-AGE: an innovative model of geriatric care for long-term care residents with dementia and behavioral issues. J Am Med Direct Assoc. 2014;15(12):938–942.
  • Khatri K, Haddad M, Anderson D. Project ECHO: replicating a novel model to enhance access to hepatitis C care in a community health center. J Health Care Poor Underserved. 2013;24(2):850–858.
  • Scott JD, Unruh KT, Catlin MC, et al. Project ECHO: a model for complex, chronic care in the Pacific Northwest region of the USA. J Telemed Telecare. 2012;18(8):481–484.
  • Katzman JG, Tomedi LE, Thornton K, et al. Innovative COVID-19 programs to rapidly serve New Mexico: project ECHO. Public Health Rep. 2021;136(1):39–46.
  • Zhou C, Crawford A, Serhal E, et al. The impact of Project ECHO on participant and patient outcomes: a systematic review. Acad Med. 2016;91(10):1439–1461.
  • Moore DE, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1–15.
  • McBain RK, Sousa JL, Rose AJ, et al. Impact of Project ECHO models of medical tele-education: a systematic review. J Gen Intern Med. 2019;34:2842–2857.
  • Mazurek MO, Parker RA, Chan J, et al. Effectiveness of the Extension for Community Health Outcomes Model as applied to primary care for autism: a partial stepped-wedge randomized clinical trial. JAMA Pediatr. 2020;174(5):e196306.
  • Rojas SA, Godino JG, Northrup A, et al. Effectiveness of a decentralized hub and spoke model for the treatment of hepatitis C virus in a federally qualified health center. Hepatol Commun. 2021;5(3):412–423.
  • Faherty LJ, Rose AJ, Chappel A, et al. Assessing and expanding the evidence base for project ECHO and ECHO-like models: findings of a technical expert panel. J Gen Intern Med. 2020;35(3):899–902.
  • Guetterman TC, Fetters MD, Creswell JW. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Ann Fam Med. 2015;13(6):554–561.
  • Noyes J, Booth A, Moore G, et al. Synthesising quantitative and qualitative evidence to inform guidelines on complex interventions: clarifying the purposes, designs and outlining some methods. BMJ Glob Health. 2019;4(S1):e000893.
  • Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–357.
  • MINT. Welcome to the Motivational Interviewing Network of Trainers (MINT). 2021. Available from: https://motivationalinterviewing.org/. Cited 2021 Mar 3.
  • Barratt MJ, Ferris JA, Lenton S. Hidden populations, online purposive sampling, and external validity: taking off the blindfold. Field Methods. 2014;27(1):3–21.
  • Baillie L. Exchanging focus groups for individual interviews when collecting qualitative data. Nurse Res. 2019;27(2):e1633.
  • Guest G, Namey E, Taylor J, et al. Comparing focus groups and individual interviews: findings from a randomized study. Int J Social Res Methodol. 2017;20(6):693–708.
  • Thomas DR. A general inductive approach for analyzing qualitative evaluation data. Am J Eval. 2006;27(2):237–246.
  • Leech NL, Onwuegbuzie AJ. An array of qualitative data analysis tools: a call for data analysis triangulation. School Psychol Quart. 2007;22(4):557–584.
  • Charmaz K. Grounded Theory. In: Smith JA, Harre R, VanLangenhove L, editors. Rethinking Methods in Psychology. London: Sage Publications; 1996. p. 27–49.
  • Saunders B, Sim J, Kingstone T, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant. 2018;52:1893–1907.
  • Liu L. Using generic inductive approach in qualitative educational research: a case study analysis. J Educ Learn. 2016;5(2):129–135.
  • Bikinesi L, O’Bryan G, Roscoe C, et al. Implementation and evaluation of a Project ECHO telementoring program for the Namibian HIV workforce. Human Resources Health. 2020;18(1):61.
  • Cheallaigh CN, O’Leary A, Keating S, et al. Telementoring with project ECHO: a pilot study in Europe. BMJ Innov. 2017;3(3):144–151.
  • Tiyyagura G, Asnes AG, Leventhal JM, et al. Impact of Project ECHO on community ED providers’ perceptions of child abuse knowledge and access to subspecialists for child abuse and neglect. Acad Pediatr. 2019;19(8):985–987.
  • Carlin L, Zhao J, Dubin R, et al. Project ECHO telementoring intervention for managing chronic pain in primary care: insights from a qualitative study. Pain Med. 2018;19(6):1140–1146.
  • Damian AJ, Robinson S, Manzoor F, et al. A mixed methods evaluation of the feasibility, acceptability, and impact of a pilot project ECHO for community health workers (CHWs). Pilot Feasibil Stud. 2020;6:132.
  • Clandinin DJ, Cave M-T. Creating pedagogical spaces for developing doctor professional identity. Med Educ. 2008;42(8):765–770.
  • White C, McIlfatrick S, Dunwoody L, et al. Supporting and improving community health services—a prospective evaluation of ECHO technology in community palliative care nursing teams. BMJ Support Palliat Care. 2015;9(2):202–208.
  • Katzman JG, Comerci G, Boyle JF, et al. Innovative telementoring for pain management: project ECHO pain. J Contin Educ Health Prof. 2014;34(1):68–75.
  • Tantillo M, Starr T, Kreipe R. The recruitment and acceptability of a project ECHO eating disorders clinic: a pilot study of telementoring for primary medical and behavioral health care practitioners. Eat Disord. 2020;28(3):230–255.
  • Hassan S, Carlin L, Zhao J, et al. Promoting an interprofessional approach to chronic pain management in primary care using Project ECHO. J Interprof Care. 2020;35(3):464–467.
  • Lalloo C, Osei-Twum J-A, Rapoport A, et al. Pediatric Project ECHO: a virtual community of practice to improve palliative care knowledge and self-efficacy among interprofessional health care providers. J Palliat Med. 2020. DOI:https://doi.org/10.1089/jpm.2020.0496.
  • Schwindt R, Agley J, McNelis AM, et al. Assessing perceptions of interprofessional education and collaboration among graduate health professions students using the Interprofessional Collaborative Competency Attainment Survey (ICCAS). J Interprof Educ Pract. 2017;8:23–27.
  • Agley J, Henderson C, Adams Z, et al. Provider engagement in Indiana’s opioid use disorder ECHO programme: there is a will but not always a way. BMJ Open Qual. 2021;10(1):e001170.