The critical role of infrastructure and organizational culture in implementing competency-based education and individualized pathways in undergraduate medical education

Kimberly D. Lomis, George C. Mejicano, Kelly J. Caverzagie, Seetha U. Monrad, Martin Pusic & Karen E. Hauer

Abstract

In 2010, several key works in medical education predicted the changes necessary to train modern physicians to meet current and future challenges in health care, including the standardization of learning outcomes paired with individualized learning processes. The reframing of a medical expert as a flexible, adaptive team member and change agent, effective within a larger system and responsive to the community’s needs, requires a new approach to education: competency-based medical education (CBME). CBME is an outcomes-based developmental approach to ensuring each trainee’s readiness to advance through stages of training and continue to grow in unsupervised practice. Implementation of CBME with fidelity is a complex and challenging endeavor, demanding a fundamental shift in organizational culture and investment in appropriate infrastructure. This paper outlines how member schools of the American Medical Association Accelerating Change in Medical Education Consortium developed and implemented CBME, including common challenges and successes. Critical supporting factors include adoption of the master adaptive learner construct, longitudinal views of learner development, coaching, and a supportive learning environment.

Introduction

Medical education exists to serve the health care needs of patients and populations. However, despite rapid changes in biomedical science, population health, and health care systems, the structure of medical education remained largely uniform and steeped in tradition for over 100 years. Health care in the United States has failed to optimize the nation’s health, with issues such as poor-quality care, medical errors, health care disparities and health inequities, complex and burdensome systems, and escalating costs demonstrating important shortcomings in the health care delivery system (IOM Citation2001). New educational strategies are warranted to ensure physicians are better trained to meet the needs of society.

Sentinel works in medical education in 2010 forecast the types of changes that would be needed to train modern physicians to meet current and future challenges in health care. The Lancet Commission on Education of Health Professionals for the 21st Century, a global, independent, and interprofessional committee, published its recommendations that year (Frenk et al. Citation2010). This group advocated for backward design (Wiggins and McTighe Citation1998) to align curricula and assessment programs to achieve desired educational outcomes across professions (Figure 1). Similar recommendations arose from an evidence-based report of medical education in the United States, commissioned by the Carnegie Foundation, which called for the standardization of learning outcomes – clearly articulated in a developmental manner across the educational continuum – coupled with individualized learning processes (Cooke et al. Citation2010).

Figure 1. Competency-based education. Reprinted by permission from Frenk et al. (Citation2010).

Of the Lancet Commission’s final proposed reforms, the first is:

Adoption of competency-based curricula that are responsive to rapidly changing needs rather than being dominated by static coursework. Competencies should be adapted to local contexts and be determined by national stakeholders while harnessing global knowledge and experiences. Simultaneously, the present gaps should be filled in the range of competencies that are required to deal with 21st-century challenges common to all countries—e.g., the response to global health security threats or the management of increasingly complex health systems. (Frenk et al. Citation2010)

Competency-based medical education (CBME) represents a fundamental shift from focusing on curricular time spent on certain topics or experiences to more explicitly emphasizing the outcomes of training (Harden Citation1999). By defining what knowledge, skills, and attitudes trainees must demonstrate to signify readiness to enter the next stages of training and eventual unsupervised practice, and by outlining how they will continue to expand their competence while in practice, CBME constitutes an outcomes-based approach to curricular design (Frank et al. Citation2010). The Lancet Commission highlighted three critical components of transformative medical education: (1) cultivation of skills in adaptive expertise and lifelong learning rather than rote memorization of medical knowledge; (2) achievement of a range of competencies for effective practice within interprofessional health care teams; and (3) application of sound educational models adapted to the local context (Frenk et al. Citation2010). Current realities of practice demand a reframing of the medical expert not as a solo ‘hero physician’ but rather as a flexible, adaptive team member and change agent, responsive to the community’s needs and effective within a larger system (Lesser et al. Citation2010). CBME approaches can accommodate such evolving expectations of physicians. Actualization of CBME should ensure trainees’ readiness for escalating care responsibilities and equip them to engage in lifelong learning throughout their continuing education in practice. Implementation, however, is arduous, as the Lancet Commission acknowledged:

No different than a century ago, educational reform is a long and difficult process that demands leadership and requires changing perspectives, work styles, and good relationships between all stakeholders. (Frenk et al. Citation2010)

In this manuscript, we outline shared lessons learned regarding the critical role of infrastructure and organizational culture in implementing CBME.

Practice points

  • Successful implementation of CBME

    • requires safe learning environments and infrastructures that support growth orientation over performance orientation.

    • must be paired with the promotion of self-regulated learning, which can be supported by coaching programs.

    • requires longitudinal experiences and/or structures to enable longitudinal views of performance that transcend courses.

    • requires robust data management and access to data for students, coaches, and program leaders responsible for decision-making about advancement.

    • requires flexible educational systems that promote individualized pathways and allow time-variable advancement designed around a continuum of education.

The American Medical Association Accelerating Change in Medical Education Consortium

The American Medical Association (AMA) recognized a malalignment between physician education and the realities of practice, demonstrated by concerns related to medical error, high costs of care, and the prevalence of burnout among providers (Skochelak and Stack Citation2017). To spur needed changes to medical education to better meet the needs of patients and populations, the AMA launched the Accelerating Change in Medical Education initiative in 2013. Through a competitive process, initial grants were awarded to eleven U.S. medical schools, and funding was extended in 2016 to an additional twenty-one U.S. schools. The AMA convened these schools to create the Accelerating Change in Medical Education Consortium, providing an unprecedented opportunity for cross-institutional partnerships to implement and disseminate groundbreaking ideas. The consortium has since grown and extended to include graduate medical education (GME). This article focuses on activities related to CBME in undergraduate medical education (UME) during the consortium’s first five years among the original 32 schools; the final paper of this supplement discusses the continuation of these efforts into GME.

One core objective of the initiative was to promote new methods for teaching and assessing key competencies for medical students and to foster more flexible, individualized learning pathways. Recognizing that CBME is a complex intervention heavily dependent upon context and institutional culture, the AMA acknowledged that local solutions would be necessary to support this objective. There was no expectation of standardized implementation across consortium sites. In this manuscript, we describe how different schools participating in the AMA’s consortium implemented broad curricular and structural innovations to support CBME and reflect upon the collective lessons learned as members shared successes and struggles in the process.

In 2019, Van Melle and colleagues proposed a core components framework for evaluating the fidelity of CBME implementation (Van Melle et al., International Competency-Based Medical Education Collaborators Citation2019). The five components are: clearly articulated outcome competencies; a sequenced progression of developmental steps, such as milestones that characterize the graded development of competence; tailored learning experiences, not only in the classroom but also in the workplace; competency-focused instruction, so that authentic learning takes place with the needed teaching, coaching, and role modeling from more experienced providers; and programmatic assessment, the coordinated approach to gathering and synthesizing learner assessment data, typically coupled with group review and decision-making, so that learners are assessed using a variety of tools to ensure their achievement of the expected progressive development of competence (van der Vleuten et al. Citation2012; Lockyer et al., ICBME Collaborators Citation2017). In a CBME system that achieves these components, learners can progress at different rates. As such, time becomes a curricular resource rather than the defining structure of medical education (Frank et al. Citation2010; Lucey et al. Citation2018). Although this core components framework had not yet been published when the consortium schools embarked on their efforts, the components provide a useful lens through which to review the implementation they accomplished.

Implementing CBME at consortium institutions

Consortium institutions tackled the challenge of implementation with varied approaches, sharing lessons learned along the way. The degree to which core components of CBME have been achieved varies significantly among member institutions. A review of the objectives described in the initial grant proposals from the 32 member institutions found that 16 specifically mentioned CBME as an outcome for their grant and 14 mentioned individualized pathways; only one explicitly named programmatic assessment as a goal, and three sought to create time-variable systems. Programmatic changes commonly accomplished across consortium institutions included making competency expectations more explicit and broadening opportunities for competency development via more active learning formats and early meaningful clinical roles, consistent with recommendations of the Lancet Commission. To define outcomes and assess learner progression, consortium schools applied existing and novel frameworks. Many aligned with U.S. national frameworks such as the Accreditation Council for Graduate Medical Education Milestones Project (ACGME Citation2021) and the Association of American Medical Colleges Core Entrustable Professional Activities (EPAs) for Entering Residency (AAMC Citation2021). Many institutions implemented explicit assessments of readiness for internship to strengthen the evidence that each learner attained desired competencies. A few sites accomplished even deeper implementation of CBME, with programmatic competency assessment, data-driven learner portfolios, time-variable progression, and flexible individualized learning plans.

Exemplars in implementing CBME

A few consortium institutions were able to implement CBME for all of their students. These sites undertook a comprehensive overhaul of programming to apply a CBME approach to their entire student population across all years of training. Table 1 elaborates how these sites accomplished elements of the core components framework, actualizing the ‘reform position’ (Van Melle et al., International Competency-Based Medical Education Collaborators Citation2019) that focuses on ‘flexible, individually tailored programs that can adapt to variable rates of competence attainment’ (Hodges Citation2010).

Table 1. Implementation of CBME at exemplar sites aligned with the core components framework.

Other consortium schools applied CBME strategies to support special tracks involving limited numbers of students. Both the University of California (UC), Davis, School of Medicine and the Ohio University Heritage College of Osteopathic Medicine offered accelerated pathways into primary care and applied CBME strategies to ensure students attained satisfactory learning outcomes despite a shortened timeline of training. At UC Davis, each student works with a dedicated clinician mentor and coach to translate classroom learning into everyday clinical practice skills. The program uses EPAs to assess competence and determine appropriate advancement. Ohio Heritage developed a new osteopathic competency-based program in which students must achieve didactic and clinical milestones that are not fixed to a specific timeframe. Advancement is based solely on the attainment of competencies as determined by objective assessment, not by the number of years in the program.

Collaborative work of the consortium to support fidelity in implementation

As consortium institutions delved into the hard work of CBME implementation, teams recognized that attaining fidelity would truly require a transformation. CBME is a complex intervention that is highly context-dependent. The success of a CBME program relies heavily upon infrastructure to ensure that graduates are supported and assessed on their progress toward achievement of expected competencies (Holmboe et al. Citation2010). McGaghie and colleagues cautioned in 1978 that ‘implementation of such a system demands a substantial redefinition of faculty and student roles and responsibilities’ (McGaghie et al. Citation1978). The consortium provided a venue for schools engaged in the work of transformation to discuss shared challenges and barriers encountered along the way. Transparency among members regarding common struggles, coupled with a systems orientation of the consortium’s work, informed institutional approaches to change management that address people, workflows, technology, and culture. Culture change to support CBME entails not only acceptance by faculty and students but also leadership committed to a widespread institutional shift from a focus on achievement to a focus on growth and lifelong learning (Alman et al. Citation2013). Although schools implemented CBME independently, collaborative efforts emerged around key elements of infrastructure that were deemed necessary to support fidelity of implementation.

The following areas reflect the collective efforts of consortium members across institutions to support fidelity in implementation of CBME.

The master adaptive learner model

CBME relies on learners who are actively engaged in their own education. Though gaining admission to medical school requires significant academic success, that success is rarely of the self-directed and self-regulated nature required throughout one’s career as a physician (Sandars and Cleary Citation2011). Students must shift from a performance orientation, in which they strive to appear competent and achieve extrinsic rewards such as scores, grades, and medical school admission, to a mastery orientation, in which they learn for the sake of developing competency and improving to achieve individual goals and optimal function within larger teams and systems (Dweck Citation1986; Pintrich et al. Citation2003). Explaining to accomplished students that they need to learn how to learn is a challenge. This tension led members of the consortium to collaborate in articulating the construct of the Master Adaptive Learner (Cutrer et al. Citation2017). This model pushes beyond the Dreyfus model of routine expertise (Dreyfus et al. Citation1986) to strive for adaptive expertise. Iterative cycles of planning, learning, assessing, and adjusting support individualized, developmental competency progression and illustrate more clearly how such development may vary within an individual across different domains of competency. These steps are analogous to efforts in health system quality improvement, reminiscent of the ‘plan, do, study, act’ approach (Deming Citation1986). This parallel helps learners embrace a growth orientation within a system that is also growing and adapting, presenting continual individual self-improvement as matching the continual quality improvement of systems. To further advance the implementation of the Master Adaptive Learner model, members of the consortium collaborated to publish an instructor-focused guide on training future clinicians to develop adaptive skills (Cutrer et al. Citation2019).

Coaching

The expectation in CBME for tailored learning experiences requires a structured process to gather and review assessment evidence, understand gaps, and identify needed experiences. Though the focus of coaching in medicine can vary (Lovell Citation2018), a coach in a CBME program guides a learner, in the context of a longitudinal relationship, through the process of reviewing performance ratings, creating learning goals, planning strategies to achieve goals, and reflecting on personal and professional development (Deiorio et al. Citation2016). Evidence that people in general, and perhaps physicians in particular, are not effective at self-assessment argues that explicit training, together with evidence about one’s performance to promote informed self-assessment, is necessary to support a developmental approach to competency (Davis et al. Citation2006; Sargeant et al. Citation2010). Schools prioritizing the cultivation of a mastery orientation toward learning and adaptive expertise have created coaching programs of varying forms to foster students’ skills in evidence-driven self-assessment, which in turn inform the design of individualized learning pathways. As expected from existing literature (Kruger and Dunning Citation1999), high-performing students often underrated their own performance, and lower-performing students sometimes exhibited overconfidence. Coaches can assist in calibration; one consortium member school demonstrated that repeated sessions in which the student and coach discussed the interpretation of performance feedback led to increasing concordance between students’ self-assessments and coaches’ reviews of performance. To advance the utilization of coaching in medical education, consortium members collaborated to create a faculty guidebook and a companion guide for learners (Deiorio and Hammoud Citation2017; Wolff et al. Citation2019). The consortium has also offered multiple faculty development workshops in coaching.

The learning environment

Consortium members recognized the tremendous impact of the learning environment on the implementation of CBME. A Learning Environment Study across consortium institutions (Skochelak et al. Citation2016) subsequently led to the description of individual and institutional drivers of the medical school educational environment (Gruppen and Stansfield Citation2016). That study demonstrated differences among institutions in student perceptions of the learning environment, but even more striking was the significant individual variance in experience within a given institution, which highlights the need to be mindful of the diversity of learners in addressing environmental issues. A conceptual framework for describing the learning environment outlined by Gruppen and colleagues includes both a psychosocial dimension and a material dimension. Members of the consortium shared strategies to address the interplay of the personal, social, physical, and virtual spaces and organizational factors in supporting optimal learning (Gruppen et al. Citation2019).

CBME demands a true developmental approach, which creates a level of vulnerability for learners that must be acknowledged. Shifting from a performance orientation to a mastery orientation requires the learner to expose areas needing further development (Sawatsky et al. Citation2020). Although exposing such areas is uncomfortable for many, this same behavior is necessary in the clinical realm to support safe and effective care. Schools attaining deeper implementation of CBME articulated this rationale to students and focused on empowering them to take charge of their own development.

To move from traditional assessment of learning to emphasize assessment for learning (van der Vleuten et al. Citation2012), the institutional culture must gain students’ trust. Explicit training for students about programmatic assessment approaches and the longitudinal view of development proved essential. Role modeling was helpful as well; students responded best when supervising residents and faculty members openly reflected on their own gaps and learning needs. Edmondson demonstrates that learning is most effective when a student experiences a combination of psychological safety, motivation, and accountability (Edmondson Citation2018). To realize the vision of CBME, students must perceive a safe environment (Tsuei et al. Citation2019).

The issue of a safe learning environment has proved critical in faculty development as well. Faculty assessors found a criterion-based approach challenging, since they have historically compared students at a given level of training for the purposes of grade assignment and ranking rather than reporting on their progress toward ultimate competency for graduation. In reality, that traditional normative approach has been found to be fraught with structural biases and to create educational inequities with significant downstream consequences (Hauer and Lucey Citation2019; Teherani et al. Citation2018). Many assessors expressed concern that their ratings of student performance were ‘lower’ in the context of CBME, even if developmentally appropriate, and that this would harm students’ ability to secure a residency position in their desired clinical discipline. Some faculty found the educational lingo around competencies, milestones, and EPAs overly complex (Dath and Iobst Citation2010; Holmboe et al. Citation2011). One approach to faculty development was to tier faculty training by the ‘need to know’. Demonstrating how data dashboards aid in formulating a longitudinal view of each student’s performance, and explaining that competency committees monitor trends in performance over time and across settings, helped frontline assessors (those faculty members and residents supervising learners in the clinical environment) understand that any single rating from them would not harm a student. Coaches assisting learners to engage in a developmental mindset and faculty members serving on competency committees needed a deeper, more nuanced understanding of the institution’s programmatic assessment process.

Monitoring longitudinal progress

Investment in educational informatics is critical to support programmatic assessment (Thoma et al. Citation2020). Competency develops across courses and over time; thus, centralized mechanisms to track progress are required. Data capture, organization, and visualization tools are necessary to make performance evidence interpretable and actionable (Boscardin et al. Citation2018). Consortium members shared key features of developing informatics platforms and challenges encountered (Santen et al. Citation2020). Although competency milestones rely on narrative descriptors, a common pitfall of dashboards noted in implementation was conversion of narratives to numeric data for reporting and display. This practice is harmful in two ways. Numeric data provides no feedback to help a student move, for example, from level 2 to level 3 on a scale; one needs the wording of competencies and milestones to understand what specific behaviors lead to growth. Numeric representation also creates the illusion that criterion-based ratings represent continuous data, although in reality, they are discrete descriptors. The frequency with which, and importantly the contexts in which, a student receives one rating or another is more helpful to educational planning than an ‘average’ score. Most programs went through repeated iterations of their dashboard designs to best support meaningful interpretation. Further, some schools incorporated narratives associated with student performance in clerkships to ground decisions on student progression and competency achievement. Members of the consortium collaborated to articulate critical technology needs in support of transformation (Stuart and Triola Citation2015; Spickard et al. Citation2016).
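To make this pitfall concrete, the minimal sketch below (in Python, with entirely hypothetical rating records and field names, not drawn from any consortium school’s actual platform) contrasts a misleading ‘average’ milestone level with a frequency-by-context display of the same discrete, criterion-based ratings:

```python
from collections import Counter, defaultdict

# Hypothetical milestone ratings for one student; the fields and values
# are illustrative only, not an actual consortium data model.
ratings = [
    {"competency": "Patient Care", "level": 2, "context": "inpatient medicine"},
    {"competency": "Patient Care", "level": 3, "context": "outpatient clinic"},
    {"competency": "Patient Care", "level": 2, "context": "inpatient medicine"},
    {"competency": "Patient Care", "level": 3, "context": "emergency department"},
]

# A naive dashboard collapses discrete milestone descriptors into a mean,
# producing a value (here 2.5) that corresponds to no defined milestone
# and offers no guidance on which behaviors would lead to growth.
mean_level = sum(r["level"] for r in ratings) / len(ratings)
print(f"Misleading 'average' level: {mean_level}")

# Tallying the frequency of each rating by context preserves the discrete
# nature of the data and shows where each level was observed.
by_context = defaultdict(Counter)
for r in ratings:
    by_context[r["context"]][r["level"]] += 1

for context, counts in by_context.items():
    summary = ", ".join(f"level {lvl} x{n}" for lvl, n in sorted(counts.items()))
    print(f"{context}: {summary}")
```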

Educational handovers between courses and phases in a UME curriculum were deemed necessary by consortium schools to support a developmental, sequenced learning progression. In a historical ‘performance mindset’ (Dweck Citation1986), some institutions have discouraged communication across courses due to concerns about anticipatory bias regarding a student’s performance, and many students would prefer to enter each course with a clean slate. However, a mastery orientation relies upon trends in individual performance across courses and settings to identify developmental needs, enabling a more purposeful use of time as a resource for learning. Consortium schools that focused on longitudinal development found that individualized learning plans could often be executed within whatever course or rotation the student was already engaged in by focusing attention on specific competencies. In some cases, students could be directed toward targeted experiences as appropriate. Supporting a continuum of learner development across the transition from UME to GME requires similar support and trust. Several consortium members participated in pilots of communicating competency development near the time of graduation (Schiller et al. Citation2018). Consortium member Oregon Health & Science University School of Medicine (OHSU) implemented a process of communication after students have been matched that provides the receiving GME program an update on student competency achievement prior to the start of residency. One of the consortium’s annual conferences devoted several sessions to this topic, resulting in a collaborative publication regarding key elements of an envisioned post-selection UME-GME handover process to engage learners in a continuum of growth (Morgan et al. Citation2020).

Bringing it all together

Members of the consortium collaborated to describe the intersection of CBME, master adaptive learning, coaching, and the learning environment, as represented in Figure 2. Centered around iterative cycles of master adaptive learning, CBME provides guidance regarding desired learning outcomes and generates evidence of progress over time. Coaches support the learner’s advancement through each cycle and provide social support that positions the learner for increasing self-direction. The learning environment must offer appropriate physical and virtual spaces and tools for mastery learning and must validate mastery orientation at an organizational level.

Figure 2. The integral relationships between competency-based medical education, the Master Adaptive Learner model, coaching practices, and the learning environment. Competency-based medical education (left) provides clarity of competency expectations and evidence of performance. The Master Adaptive Learner model (center circle) provides a repeating, cyclical process for learners to reflect on performance and plan learning. Coaches (outer circle) wrap around the Master Adaptive Learner, leveraging questioning, accountability, and other strategies to guide them. The entire process relies upon a supportive learning environment (right), with appropriate technology to track and visualize data, as well as a social and organizational culture that encourages a mastery orientation. Used with permission of the American Medical Association.

This integration of models is now being promoted in the consortium’s faculty development programs on coaching.

Prioritizing time for competency development

Because CBME treats time as a resource for learning rather than a measure of learning, and because development progresses in a time-variable manner, creating flexibility in timelines is a critical element to support fidelity of implementation. The traditional Flexnerian model of medical education in the U.S., with two years of basic science training followed by two years of clinical work, has a rigidity that artificially separates development in these two spheres. Incorporating active learning modalities and clinical opportunities starting in the first year of medical school enables students to begin developing the full breadth of competency domains much earlier. It is important to make this connection clear to students: their knowledge learning is for application with patients, their behavior in their educational teams is a precursor to their behavior on clinical teams, and the classroom is a safe space to practice all the domains. Successful CBME programs incorporate meaningful clinical roles from the first year to better integrate students’ learning across competencies.

An external barrier in the U.S. that essentially robs students of time for development has been a heavy emphasis on scores on the United States Medical Licensing Examination (USMLE) Step 1 examination (which aims to assess whether medical school students or graduates can apply important concepts of the foundational sciences fundamental to the practice of medicine) in selecting students for residency positions. Use of this metric has driven students to de-prioritize other domains of competency until that exam has been taken. It is likely that high-stakes exams in other countries have similar impacts. Students experience anxiety when changes in curriculum and assessment are perceived to create a risk to performance on licensing examinations or competitiveness for residency selection (Yengo-Kahn et al. Citation2017). Some consortium schools piloted a shift in timing, encouraging students to take this examination after completing their core clerkships to emphasize the integrated nature of developing knowledge and clinical skill sets; students at these schools attained higher scores (Daniel et al. Citation2017; Jurich et al. Citation2019). A consortium conference in 2018, which included representatives of the USMLE, highlighted concerns about the impact of Step 1 scoring on competency development. This helped to spur the Invitational Conference on USMLE Scoring (InCUS), a stakeholder conference in 2019 jointly sponsored by the AMA, the Association of American Medical Colleges (AAMC), the Educational Commission for Foreign Medical Graduates (ECFMG), and the USMLE parent organizations, the Federation of State Medical Boards and the National Board of Medical Examiners (USMLE Citation2019). There was consensus among participants that the use of scores on this examination as a screening metric is only one of many problems associated with the current process of selection for residency in the U.S.; nevertheless, the conference resulted in plans to transition score reporting from a three-digit score to a pass/fail model starting in 2022. Hopefully, that transition will effectively recapture time for development across all competency domains early in students’ experiences.

Implementing time-variable CBME in the United States faces another significant barrier related to the existing competitive process of selection for residency positions. Although processes of transition differ in other countries, similar pressures likely influence student behavior everywhere. Students feel pressured to assemble a competitive resume, which may paradoxically hamper development as learners feel compelled to prioritize looking good on paper over revealing their learning needs toward becoming a better physician. The competitive process for selection disrupts 3–6 months of educational time for each student, between completing multiple away ‘audition’ rotations in one’s desired specialty and devoting several months to interviews. Numerous concerns about this transition point were articulated during the InCUS conference, where there was consensus that the current process is not serving any stakeholders well. The Coalition for Physician Accountability has recently charged a UME-GME Review Committee to examine the deleterious effects of the current residency application and selection process (CPA Citation2021). It is notable that accelerated 3-year tracks typically seek exemption from the formal U.S. matching process and guarantee participating students residency positions at the home institution; by avoiding time lost to the selection process, training time in these accelerated pathways effectively becomes only a few months less than in the traditional pathway.

One strategy used by many consortium schools was to reduce the length of the pre-clerkship phase and position the core clerkships earlier in the students’ experience. Rather than completing core clerkships in June of the third year and then rushing to complete a couple of advanced clinical rotations before submitting residency applications in September, students at these schools gain an additional 4–6 months for career exploration after the core clerkships and can focus on rounding out competency development before the application cycle begins. This structure provides more flexibility in the post-clerkship phase for individualized pathways driven by performance evidence and individual interests.

The single time point of transition into GME is an added challenge. On reflection, it does seem odd to have a system in which over 30,000 trainees transition simultaneously across the country; perhaps, with better data, multiple standardized time points for transition could be implemented. Consortium schools that implemented robust CBME programs were able to identify a subset of students who were ready for the duties of internship at earlier time points in training. One school in the consortium, OHSU, did graduate these students early, reducing their tuition burden and even allowing some graduates to begin GME training within their own institution ahead of schedule (Mejicano and Bumsted Citation2018). Other schools did not graduate such candidates early because there was nowhere for them to go, and months of inactivity did not seem beneficial to sustaining performance. Loss of financial aid and loan repayment deferral for graduated students is also a risk. Those schools instead sought to provide advanced, value-added experiences within the home institution to promote ongoing growth. The upcoming work of the Coalition for Physician Accountability to explore changes to the UME-GME transition offers an opportunity to address challenges with the current selection process and to better support the continued learning trajectory valued in a CBME model. The AMA has taken on an advocacy role, encouraging systemic changes, such as multiple time points of transition across institutions from medical school into GME and from GME into fellowship or practice, to create a structure that supports full fidelity of implementation of CBME (Nousiainen et al. Citation2020).

Conclusion

Implementing CBME in UME is an immense and ongoing change process for individuals, systems, and cultures. The AMA consortium schools’ experiences highlight how each institution will encounter different challenges and opportunities in implementation based on institutional mission, characteristics, readiness for change, and resources. Yet there is benefit to collaboration around shared challenges and opportunities. Members of the consortium benefitted from a commitment to transparency in sharing struggles, lessons learned, and resources. Concrete examples of implementation strategies at specific institutions, some described in this manuscript, may help other schools considering implementation of CBME in UME. Rather than being intimidated, and perhaps even paralyzed, by the significant investments in faculty effort, informatics infrastructure, and culture change needed to implement CBME, some institutions might embrace an iterative approach while striving for transformative change (Borkan et al. Citation2018).

Maintaining transparency about the fidelity of implementation is necessary to understand outcomes; we cannot dismiss the conceptual value of CBME because of our challenges in bringing the construct to reality. Perceived immediate benefits of CBME, based on the experience of consortium member institutions, include the ability to identify and intervene with learners requiring more support, as well as to recognize and enable those ready to advance to greater responsibility for patient care. Ongoing improvements will include strengthening assessment processes and supporting a true continuum of development from UME to GME via the collective development of competency assessment frameworks and data management tools that are comparable and translatable across institutions. Institutions must continue to evaluate outcomes regarding the downstream performance of learners. Continued examination of the culture of medicine, and advocacy for infrastructures and environments that truly support professional development across one’s career, will be essential to realizing the full potential of CBME.

Acknowledgments

The authors would like to thank their home institutions, fellow members of the American Medical Association’s (AMA) Accelerating Change in Medical Education Consortium, and the AMA for their support of innovation.

The AMA Supplement is sponsored and supported by The American Medical Association.

Disclosure statement

This supplement is wholly funded by the American Medical Association. All authors are either employed by the American Medical Association or affiliated with an institution that is a member of the American Medical Association Accelerating Change in Medical Education Consortium.

Disclaimer

The opinions expressed in this article are those of the authors and do not necessarily reflect American Medical Association policy.

Additional information

Funding

This work was funded in part by the American Medical Association.

Notes on contributors

Kimberly D. Lomis

Kimberly D. Lomis, MD, Medical Education Outcomes, American Medical Association, Chicago, IL, USA.

George C. Mejicano

George C. Mejicano, MD, MS, School of Medicine, Oregon Health & Science University, Portland, OR, USA.

Kelly J. Caverzagie

Kelly J. Caverzagie, MD, College of Medicine, University of Nebraska, Omaha, NE, USA.

Seetha U. Monrad

Seetha U. Monrad, MD, Medical School, University of Michigan, Ann Arbor, MI, USA.

Martin Pusic

Martin Pusic, PhD, MD, Department of Pediatrics, Harvard Medical School, Boston, MA, USA.

Karen E. Hauer

Karen E. Hauer, MD, PhD, School of Medicine, University of California, San Francisco, San Francisco, CA, USA.

References

  • [AAMC] Association of American Medical Colleges. 2021. The core entrustable professional activities (EPAs) for entering residency. Washington (DC): Association of American Medical Colleges; [accessed 2021 April 16]. https://www.aamc.org/what-we-do/mission-areas/medical-education/cbme/core-epas.
  • [ACGME] Accreditation Council for Graduate Medical Education. 2021. Milestones. Chicago (IL): Accreditation Council for Graduate Medical Education; [accessed 2021 April 16]. https://www.acgme.org/What-We-Do/Accreditation/Milestones/Overview.
  • Alman BA, Ferguson P, Kraemer W, Nousiainen MT, Reznick RK. 2013. Competency-based education: a new model for teaching orthopaedics. Instr Course Lect. 62:565–569.
  • Borkan JM, George P, Tunkel AR. 2018. Curricular transformation: the case against global change. Acad Med. 93(10):1428–1430.
  • Boscardin C, Fergus KB, Hellevig B, Hauer KE. 2018. Twelve tips to promote successful development of a learner performance dashboard within a medical education program. Med Teach. 40(8):855–861.
  • Cooke M, Irby DM, O’Brien BC. 2010. Educating physicians: a call for reform of medical school and residency. San Francisco, CA: Jossey-Bass.
  • [CPA] Coalition for Physician Accountability. 2021. Reviewing the transition from UME to GME. Coalition for Physician Accountability; [accessed 2021 April 16]. https://physicianaccountability.org/ume-gme/.
  • Cutrer WB, Miller B, Pusic MV, Mejicano G, Mangrulkar RS, Gruppen LD, Hawkins RE, Skochelak SE, Moore DE. Jr. 2017. Fostering the development of master adaptive learners: a conceptual model to guide skill acquisition in medical education. Acad Med. 92(1):70–75.
  • Cutrer WB, Pusic MV, Gruppen L, Hammoud M, Santen S. 2019. The master adaptive learner. Philadelphia (PA): Elsevier.
  • Daniel M, Fleming A, Grochowski CO, Harnik V, Klimstra S, Morrison G, Pock A, Schwartz ML, Santen S. 2017. Why not wait? Eight institutions share their experiences moving United States Medical Licensing Examination Step 1 after core clinical clerkships. Acad Med. 92(11):1515–1524.
  • Dath D, Iobst W. 2010. The importance of faculty development in the transition to competency-based medical education. Med Teach. 32(8):683–686.
  • Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. 2006. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 296(9):1094–1102.
  • Deiorio NM, Carney PA, Kahl LE, Bonura EM, Juve AM. 2016. Coaching: a new model for academic and career achievement. Med Educ Online. 21:10.3402/meo.v21.33480. [accessed 2021 April 16]. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5136126/.
  • Deiorio NM, Hammoud MM. 2017. Coaching in medical education: a faculty handbook. Chicago (IL): American Medical Association; [accessed 2021 April 16]. https://www.ama-assn.org/system/files/2019-09/coaching-medical-education-faculty-handbook.pdf.
  • Deming WE. 1986. Out of the crisis. Cambridge (MA): Massachusetts Institute of Technology Center for Advanced Engineering Study.
  • Dreyfus HL, Dreyfus SE, Athanasiou T. 1986. Mind over machine: the power of human intuition and expertise in the era of the computer. New York (NY): Free Press.
  • Dweck CS. 1986. Motivational processes affecting learning. American Psychologist. 41(10):1040–1048.
  • Edmondson A. 2018. The fearless organization: creating psychological safety in the workplace for learning, innovation and growth. Hoboken (NJ): John Wiley & Sons.
  • Frank JR, Snell LS, Ten Cate O, Holmboe ES, Carraccio C, Swing SR, Harris P, Glasgow NJ, Campbell C, Dath D, et al. 2010. Competency-based medical education: theory to practice. Med Teach. 32(8):638–645.
  • Frenk J, Chen L, Bhutta ZA, Cohen J, Crisp N, Evans T, Fineberg H, Garcia P, Ke Y, Kelley P, et al. 2010. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 376(9756):1923–1958.
  • Gruppen LD, Irby DM, Durning SJ, Maggio LA. 2019. Conceptualizing learning environments in the health professions. Acad Med. 94(7):969–974.
  • Gruppen LD, Stansfield RB. 2016. Individual and institutional components of the medical school educational environment. Acad Med. 91(11 Association of American Medical Colleges Learn Serve Lead: Proceedings of the 55th Annual Research in Medical Education Sessions):S53–S57.
  • Harden RM. 1999. AMEE Guide No. 14: outcome-based education: part 1-an introduction to outcome-based education. Med Teach. 21(1):7–14.
  • Hauer KE, Lucey CR. 2019. Core clerkship grading: the illusion of objectivity. Acad Med. 94(4):469–472.
  • Hauer KE, O'Sullivan PS, Fitzhenry K, Boscardin C. 2018. Translating theory into practice: implementing a program of assessment. Acad Med. 93(3):444–450.
  • Hodges BD. 2010. A tea-steeping or i-Doc model for medical education? Acad Med. 85(9 Suppl):S34–S44.
  • Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. 2010. The role of assessment in competency-based medical education. Med Teach. 32(8):676–682.
  • Holmboe ES, Ward DS, Reznick RK, Katsufrakis PJ, Leslie KM, Patel VL, Ray DD, Nelson EA. 2011. Faculty development in assessment: the missing link in competency-based medical education. Acad Med. 86(4):460–467.
  • [IOM] Institute of Medicine: Committee on Quality of Health Care in America. 2001. Crossing the quality chasm: a new health system for the 21st century. Washington (DC): National Academies Press.
  • Jurich D, Daniel M, Paniagua M, Fleming A, Harnik V, Pock A, Swan-Sein A, Barone MA, Santen SA. 2019. Moving the United States Medical Licensing Examination Step 1 after core clerkships: an outcomes analysis. Acad Med. 94(3):371–377.
  • Keeley MG, Gusic ME, Morgan HK, Aagaard EM, Santen SA. 2019. Moving toward summative competency assessment to individualize the postclerkship phase. Acad Med. 94(12):1858–1864.
  • Kruger J, Dunning D. 1999. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 77(6):1121–1134.
  • Lesser CS, Lucey CR, Egener B, Braddock CH, 3rd, Linas SL, Levinson W. 2010. A behavioral and systems view of professionalism. JAMA. 304(24):2732–2737.
  • Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, Holmboe ES, Frank JR, ICBME Collaborators. 2017. Core principles of assessment in competency-based medical education. Med Teach. 39(6):609–616.
  • Lomis KD, Russell RG, Davidson MA, Fleming AE, Pettepher CC, Cutrer WB, Fleming GM, Miller BM. 2017. Competency milestones for medical students: Design, implementation, and analysis at one medical school. Med Teach. 39(5):494–504.
  • Lovell B. 2018. What do we know about coaching in medical education? A literature review. Med Educ. 52(4):376–390.
  • Lucey CR, Thibault GE, Ten Cate O. 2018. Competency-based, time-variable education in the health professions: crossroads. Acad Med. 93(3S Competency-Based, Time-Variable Education in the Health Professions):S1–S5.
  • McGaghie WC, Sajid AW, Miller GE, Telder TV, Lipson L. 1978. Competency-based curriculum development in medical education: an introduction. Geneva (Switzerland): World Health Organization; [accessed 2021 April 16]. https://apps.who.int/iris/handle/10665/39703.
  • Mejicano GC, Bumsted TN. 2018. Describing the journey and lessons learned implementing a competency-based, time-variable undergraduate medical education curriculum. Acad Med. 93(3S Competency-Based, Time-Variable Education in the Health Professions):S42–S48.
  • Monrad SU, Mangrulkar RS, Woolliscroft JO, Daniel MM, Hartley SE, Gay TL, Highet A, Vijayakumar N, Santen SA. 2019. Competency committees in undergraduate medical education: approaching tensions Using a Polarity Management Framework. Acad Med. 94(12):1865–1872.
  • Morgan HK, Mejicano GC, Skochelak S, Lomis K, Hawkins R, Tunkel AR, Nelson EA, Henderson D, Shelgikar AV, Santen SA. 2020. A responsible educational handover: improving communication to improve learning. Acad Med. 95(2):194–199.
  • Nousiainen M, Scheele F, Hamstra SJ, Caverzagie K. 2020. What can regulatory bodies do to help implement competency-based medical education? Med Teach. 42(12):1369–1373.
  • Pintrich PR, Conley AM, Kempler TM. 2003. Current issues in achievement goal theory and research. Int J Educ Res. 39(4-5):319–337.
  • Sandars J, Cleary TJ. 2011. Self-regulation theory: applications to medical education: AMEE Guide No. 58. Med Teach. 33(11):875–886.
  • Santen SA, Myklebust L, Cabrera C, Patton J, Grichanik M, Bibler Zaidi NL. 2020. Creating a learner performance dashboard for programmatic assessment. Clin Teach. 17(3):261–266.
  • Sargeant J, Armson H, Chesluk B, Dornan T, Eva K, Holmboe E, Lockyer J, Loney E, Mann K, van der Vleuten C. 2010. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med. 85(7):1212–1220.
  • Sawatsky AP, Huffman BM, Hafferty FW. 2020. Coaching versus competency to facilitate professional identity formation. Acad Med. 95(10):1511–1514.
  • Schiller JH, Burrows HL, Fleming AE, Keeley MG, Wozniak L, Santen SA. 2018. Responsible milestone-based educational handover with individualized learning plan from undergraduate to graduate pediatric medical education. Acad Pediatr. 18(2):231–233.
  • Skochelak SE, Stack SJ. 2017. Creating the medical schools of the future. Acad Med. 92(1):16–19.
  • Skochelak SE, Stansfield RB, Dunham L, Dekhtyar M, Gruppen LD, Christianson C, Filstead W, Quirk M. 2016. Medical student perceptions of the learning environment at the end of the first Year: a 28-medical school collaborative. Acad Med. 91(9):1257–1262.
  • Spickard A, III, Ahmed T, Lomis K, Johnson K, Miller BM. 2016. Changing medical school IT to support medical education transformation. Teach Learn Med. 28(1):80–87.
  • Stuart G, Triola M. 2015. Educational technologies in health professions education: current state and future directions. Josiah Macy Jr. Foundation Conference on Enhancing Health Professions Education through Technology; 2015 April; Arlington (VA). p. 71–111.
  • Teherani A, Hauer KE, Fernandez A, King TE, Jr, Lucey C. 2018. How small differences in assessed clinical performance amplify to large differences in grades and awards: a cascade with serious consequences for students underrepresented in medicine. Acad Med. 93(9):1286–1292.
  • Thoma B, Warm E, Hamstra SJ, Cavalcanti R, Pusic M, Shaw T, Verma A, Frank JR, Hauer KE. 2020. Next steps in the implementation of learning analytics in medical education: consensus from an international cohort of medical educators. J Grad Med Educ. 12(3):303–311.
  • Tsuei SH, Lee D, Ho C, Regehr G, Nimmon L. 2019. Exploring the construct of psychological safety in medical education. Acad Med. 94(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 58th Annual Research in Medical Education Sessions):S28–S35.
  • [USMLE] United States Medical Licensing Examination. 2019. Summary report and preliminary recommendations from the Invitational Conference on USMLE Scoring (InCUS), March 11-12, 2019. Philadelphia (PA): United States Medical Licensing Examination®; [accessed 2021 April 16]. https://www.usmle.org/pdfs/incus/InCUS_summary_report.pdf.
  • Van der Vleuten CP, Schuwirth LW, Driessen EW, Dijkstra J, Tigelaar D, Baartman LKJ, van Tartwijk J. 2012. A model for programmatic assessment fit for purpose. Med Teach. 34(3):205–214.
  • Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J, International Competency-Based Medical Education Collaborators. 2019. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 94(7):1002–1009.
  • Wiggins G, McTighe J. 1998. Understanding by design. Alexandria (VA): Association for Supervision & Curriculum Development.
  • Wolff M, Jackson J, Hammoud MM. 2019. It takes two: a guide to being a good coachee. Chicago (IL): American Medical Association. [accessed 2021 April 16]. https://www.ama-assn.org/system/files/2019-09/coaching-medical-education-learner-handbook.pdf.
  • Yengo-Kahn AM, Baker CE, Lomis K. 2017. Medical students' perspectives on implementing curriculum change at one institution. Acad Med. 92(4):455–461.