Research Article

Assessing readiness: the impact of an experiential learning entrustable professional activity-based residency preparatory course

Article: 2352217 | Received 08 Jan 2024, Accepted 02 May 2024, Published online: 17 May 2024

ABSTRACT

As medical schools move to integrate the Core Entrustable Professional Activities for Entering Residency (EPAs) into curricula and address the transition from student to resident, residency preparatory courses have become more prevalent. The authors developed an experiential learning, EPA-based capstone course for assessment and studied its impact on learner self-assessed ratings of readiness for residency and on acquisition of medical knowledge. All fourth-year students from the classes of 2018–2020 completed a required course in the spring for assessment of multiple EPAs, including managing core complaints, performing basic procedures, obtaining informed consent, and providing patient handoffs. Learners selected one of three specialty-based parallel tracks: adult medicine, surgery, or pediatrics. Students completed a retrospective pre-post questionnaire to provide self-assessed ratings of residency preparedness and comfort in performing EPAs. Finally, the authors studied the impact of the course on knowledge acquisition by comparing student performance in the adult medicine track on multiple-choice pre- and post-tests. Four hundred and eighty-one students were eligible for the study and 452 (94%) completed the questionnaire. For all three tracks, there was a statistically significant change in learner self-assessed ratings of preparedness for residency from pre- to post-course (moderately or very prepared: adult medicine 61.4% to 88.6% [p-value < 0.001]; surgery 56.8% to 81.1% [p-value < 0.001]; pediatrics 32.6% to 83.7% [p-value 0.02]). A similar change was noted in all tracks in learner self-assessed ratings of comfort from pre- to post-course for all studied EPAs. Of the 203 students who participated in the adult medicine track from 2019–2020, 200 (99%) completed both the pre- and post-test knowledge assessments. The mean performance improved from 65.0% to 77.5% (p-value < 0.001). An experiential capstone course for the assessment of EPAs can be effective in improving learner self-assessed ratings of readiness for residency training and acquisition of medical knowledge.

Introduction

The transition from medical student to resident is a significant one in the development of a physician. The trainee's responsibility increases both literally (in volume) and figuratively, as first-day interns have the authority to make decisions that could have lasting consequences for patients. Given the high stakes involved in this transition, efforts have been made to define the skills and level of competency that graduating medical students should possess. In 2014, the Association of American Medical Colleges (AAMC) developed the Core Entrustable Professional Activities for Entering Residency (EPAs) to provide further guidance to medical schools for the preparation of medical students for post-graduate training [Citation1]. In 2021, the AAMC published results of its 10-school pilot study focusing on the feasibility, utility, and accuracy of incorporating EPAs into medical schools' curricula. For all of the EPAs, the percentage of graduating students adjudged to be entrustable fell well short of 100% [Citation2]. These findings demonstrate the need for medical schools to teach these skills more effectively. However, there is a relative dearth of formal guidance on the optimal way to address the integration and assessment of these skills in Undergraduate Medical Education (UME) curricula.

To address the AAMC's call to implement specific skills into UME, we created a required capstone course for senior medical students specifically to assess proficiency in the performance of 8 of the EPAs. Upon reviewing the literature, we found that a number of medical schools across the nation are piloting similar residency preparatory courses (i.e., bootcamps), but the vast majority do not focus on assessment, and only a few have reported outcomes that might relate to preparedness for starting residency training [Citation3–6]. Based on the Kirkpatrick model of evaluation [Citation7], these bootcamps have demonstrated improvements in short-term Level 1 (reaction) and Level 2 (learning) outcomes. For example, one school implemented a bootcamp as an optional course composed mainly of didactic review, and reported longer-term, positive Kirkpatrick Level 1 outcomes upon surveying former students three months into their PGY-1 year [Citation8]. Because of the inherent difficulty of tracking student performance of specific skills into residency training, providing evidence that these resource-intensive bootcamps lead to meaningful change in Kirkpatrick Level 3 (behavior) or 4 (results) endpoints is challenging. Additionally, the existing literature describing residency preparatory courses does not address whether students arrive at these courses, after completing a 3-plus year curriculum, already proficient in the skills that are being reviewed or taught.

This study was designed to investigate two major questions: 1) Does an experiential learning EPA-based course for assessment improve learner self-assessed ratings of readiness for day-to-day resident activities at course completion and after completing several months of internship? 2) Does a simulation-based assessment of common core clinical conditions improve performance on an assessment of medical knowledge at course completion?

Here, we describe the creation and implementation of an experiential learning EPA-based capstone course for assessment, results for short- and longer-term Kirkpatrick Level 1 and 2 educational outcomes, and lessons learned from the continued iterative development of a required EPA-based UME capstone. This study is unique in that the capstone course is solely for assessment, utilizing the preceding 3-plus year curriculum for core instruction and preparation of the students in the performance of these core clinical activities.

Materials and methods

Description and development of curriculum

We introduced a required residency preparation bootcamp course, entitled Assessment for Internship, starting with the graduating class of 2017. The course content, pedagogy and assessment methods were created by the planning committee, which consisted of faculty educators from Internal Medicine, Pediatrics, and General Surgery, the education director and operations manager of the Simulation Center, and the operations manager of the Standardized Patient program. The planning committee was supported by a course coordinator.

We focused the initial discussion of course content on a review of the 13 AAMC EPAs with further exploration of additional important activities for consideration [Citation1]. Ultimately, the planning committee decided on addressing a limited number of core EPAs based on a judgment of importance and the resources necessary to allow for direct observation and assessment of an entire medical school class (about 175 students). While the original concept of this capstone course was for summative assessment, the planning committee quickly agreed that until assessment instruments were validated, all of the sessions would be used primarily for formative assessment.

The EPAs included in the course were the initial triage and management of core complaints and diagnoses, collaboration with the interprofessional team, performance of basic procedures, obtaining of informed consent, and provision of patient handoffs. Because of the notable differences among various specialties in the relevant conditions and skills performed, the committee made the decision to create curricula for three separate tracks: 1) Adult Medicine; 2) Surgery; 3) Pediatrics. Senior students were required to select their track roughly three months prior to the course start date to allow for more specific course planning.

We created the sessions addressing initial triage and management of core complaints and diagnoses using a standard process for all three tracks. The assessment methods utilized were high-fidelity simulation and components of the Mock Pages curriculum originally developed at Southern Illinois University (SIU) School of Medicine [Citation9]. First, we defined a core group of conditions by track using a modified Delphi method [Citation10], which included multiple faculty with formal program leadership roles in residency programs from relevant specialties. Second, we created critical action item checklists for the management and treatment of each of the core diseases (Appendix A). These checklists were also developed using a modified Delphi method including multiple faculty experts from relevant specialties. Third, we created simulation scenarios for each of the core conditions. The critical action item checklists were then used during the simulation scenarios and completed by the faculty instructors playing the role of nursing confederates. Each simulation case was immediately followed by a debriefing session during which formative feedback was provided using the critical action checklists, along with any additional feedback at the discretion of the instructor. For the Mock Pages session, we adapted 2–5 cases from the original SIU curriculum for each of the three tracks. Using these cases, the students were assessed on medical decision-making and interprofessional communication skills by faculty instructors calling in as the bedside nurse, with formative feedback given immediately after each case.

We created the sessions addressing performance of basic procedures using a similar framework. We determined a list of core procedures for assessment by track based on consensus from the planning committee. Thereafter, we defined procedural checklists either by adapting existing checklists described in the literature or via a modified Delphi method [Citation11–14]. Students were provided asynchronous learning materials related to each procedure prior to their in-person sessions. The procedures selected by track were as follows: Adult Medicine – lumbar puncture, arterial blood gas, bag-valve mask ventilation, and Cardio-Pulmonary Resuscitation (CPR); Surgery – ultrasound-guided central venous catheter placement, arterial blood gas, bag-valve mask ventilation, and CPR; Pediatrics – lumbar puncture, neonatal bag-valve mask ventilation, and neonatal CPR. The checklists were completed by faculty instructors while observing students one-on-one performing the procedural skills on task trainers and manikins. After students performed each procedure, they received immediate feedback from the observing faculty while reviewing the checklist together.

We developed the session on obtaining informed consent in collaboration with the Standardized Patient program as an Objective Structured Clinical Examination (OSCE). We selected the cases via consensus discussion: the committee chose blood product transfusion, because of the prevalence and cross-cutting nature of this treatment across all specialties, and one of the selected core procedures, to connect these sessions. A faculty-created video reviewing the key components of the informed consent process was provided to students prior to the session. We created cases using the case structure template, including the assessment checklist, from the Clinical Performance Examination encounters that are administered to all students at the end of the core clerkship period. A minimum passing standard for each case was defined by six clinical skills faculty experts using the modified Angoff method [Citation15]. Each student completed two patient encounters, with completion of the assessment checklist and immediate feedback provided by the standardized patient.
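To illustrate the modified Angoff computation in concrete terms, the sketch below shows the basic arithmetic: each judge estimates the probability that a borderline student would perform each checklist item correctly, and the passing standard is the mean of those estimates across items and judges. This is a minimal sketch with invented ratings, not our standard-setting data or software.

```python
# Minimal sketch of a modified Angoff cut-score calculation.
# Ratings are invented for illustration: rows = judges, columns = checklist
# items; each value is a judge's estimate of the probability (0-1) that a
# borderline student would perform that item correctly.
ratings = [
    [0.70, 0.60, 0.80, 0.50, 0.65],
    [0.75, 0.55, 0.85, 0.45, 0.60],
    [0.65, 0.60, 0.75, 0.55, 0.70],
    [0.70, 0.50, 0.80, 0.50, 0.65],
    [0.80, 0.65, 0.90, 0.55, 0.70],
    [0.60, 0.55, 0.75, 0.45, 0.60],
]

# Each judge's expected borderline score is the mean of their item ratings;
# the minimum passing standard is the mean across judges.
judge_means = [sum(row) / len(row) for row in ratings]
passing_standard = sum(judge_means) / len(judge_means)
print(f"Minimum passing standard: {passing_standard:.1%} of checklist items")
```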

We structured the sessions addressing the performance of patient handoffs to include two separate activities, one involving group handoffs and the other involving individual handoffs. We created a high-fidelity simulation session that entailed alternating groups of students caring for the same patient who develops different conditions over time. At the conclusion of each scenario, two students would provide a verbal signout using the I-PASS structure [Citation16] to another group of students, who would then proceed to enter the simulation room and care for the same patient with a new complaint or condition. The individual patient handoff activity entailed students viewing one of two videos detailing a simulated patient encounter. Students watched these videos asynchronously, and then worked in pairs to provide and receive signout using the I-PASS structure under direct observation by a faculty instructor who then provided formative feedback. We used the handoff clinical evaluation exercise tool created by Horwitz and colleagues for both of these sessions [Citation17].
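For readers unfamiliar with the I-PASS structure, the sketch below lays out its five elements (illness severity, patient summary, action list, situation awareness and contingency planning, and synthesis by the receiver) as a simple data structure. The clinical content is hypothetical and the field names are our own shorthand for illustration; they are not part of the published I-PASS or handoff CEX tools [Citation16,Citation17].

```python
# Hypothetical signout organized by the five I-PASS elements.
# All clinical details are invented for illustration only.
handoff = {
    "illness_severity": "Watcher",
    "patient_summary": (
        "68-year-old admitted with community-acquired pneumonia, "
        "improving on antibiotics, still requiring 2 L/min oxygen."
    ),
    "action_list": [
        "Follow up evening basic metabolic panel",
        "Reassess oxygen requirement overnight",
    ],
    "situation_awareness_and_contingency": (
        "If the oxygen requirement rises above 4 L/min, obtain a chest "
        "radiograph and notify the senior resident."
    ),
    "synthesis_by_receiver": (
        "The receiver restates the summary, pending tasks, and "
        "contingencies to confirm shared understanding."
    ),
}
```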

Setting

The study was performed at the medical school and on-campus simulation center. All graduating senior medical students were required to successfully complete this course at its inception in 2017 and every year thereafter. We defined the passing criteria as attending and participating in all scheduled sessions, and achieving a score at or above the minimum passing standard for the informed consent OSCE, which was introduced in 2019. This study was adjudged to meet standards for exemption from consent for human subjects research by the Institutional Review Board (IRB#20–000094).

Study design

We investigated whether our EPA-based course for assessment improved learner self-assessed ratings of readiness for day-to-day resident activities using a retrospective pre-post questionnaire at the conclusion of the course [Citation18,Citation19]. As a part of the formal course evaluation, we asked students to rate how confident and how prepared they felt to start internship prior to the course and at course completion on a 4-point Likert scale: 1 = not at all [confident/prepared], 2 = slightly [confident/prepared], 3 = moderately [confident/prepared], and 4 = very [confident/prepared]. We also asked them to rate how comfortable they felt performing each of the assessed EPAs prior to the course and at course completion using the same 4-point Likert scale. We developed and sent another survey to the learners midway through their PGY-1 year asking them to rate how well the capstone course had prepared them for internship and for each of the EPAs that were assessed. We included in this survey an optional open-text question allowing the respondents to identify the most effective components of the course.

We also examined the ability of our course to enhance medical knowledge using a pretest-posttest design with a high-fidelity simulation curriculum provided to all students in small groups working with individual faculty instructors. All students in the Adult Medicine track completed a 23-item multiple choice question (MCQ) pre-test prior to participating in the simulation curriculum intervention, and the same MCQ assessment (with items reordered) as a post-test immediately at the conclusion of the week-long course. We developed this 23-item examination using the critical action item checklists for the core conditions covered by the simulation curriculum, and each of the items was reviewed and refined with iterative feedback from multiple faculty educator experts until consensus on all items was achieved.

Statistical analysis

We used the paired Symmetry test to evaluate differences between retrospective pre- and post-course self-assessed ratings by specialty track. To determine if any differences between pre- and post-course self-assessed ratings existed between each year of data collected (3 course iterations in 2018, 2019, and 2020), we used a mixed effects logistic regression model with terms for time (prior to the course and after the course), year, the interaction between time and year, and a random intercept for the respondent. Due to small sample sizes, this analysis could not be conducted for the pediatric track. We used the paired Student’s t-test to evaluate differences in student performance from pretest to posttest. All analyses were performed using SAS statistical software, version 9.4 (SAS Institute, Cary, NC).
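Although the analyses were performed in SAS, the sketch below illustrates, in Python, the general form of the paired comparisons described above: a Bowker symmetry test on paired 4-point pre/post ratings, a paired t-test on pre/post MCQ scores, and the Cronbach's alpha reported in the Results. The data are randomly generated placeholders and the code is an approximation for illustration, not our analysis scripts; the mixed effects logistic regression is omitted.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.contingency_tables import SquareTable

rng = np.random.default_rng(0)

# --- Paired symmetry (Bowker) test on retrospective pre/post ratings ---
# 4x4 table of placeholder counts: rows = pre-course rating (1-4),
# columns = post-course rating (1-4) for the same respondents.
table = np.array([
    [2,  5,  8,  1],
    [1, 10, 40, 12],
    [0,  3, 60, 55],
    [0,  0,  5, 50],
])
symmetry = SquareTable(table, shift_zeros=False).symmetry(method="bowker")
print(f"Symmetry test: chi2 = {symmetry.statistic:.2f}, p = {symmetry.pvalue:.4f}")

# --- Paired t-test on pre/post MCQ percentage scores ---
pre = rng.normal(65, 12, size=200)
post = pre + rng.normal(12.5, 8, size=200)
t_stat, p_val = stats.ttest_rel(post, pre)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_val:.3g}")

# --- Cronbach's alpha for a k-item, dichotomously scored MCQ ---
def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_students, k_items) matrix of 0/1 item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

item_scores = (rng.random((200, 23)) < 0.7).astype(float)
print(f"Cronbach's alpha: {cronbach_alpha(item_scores):.2f}")
```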

Results

Our findings demonstrate that learners' self-assessed readiness for residency increased after the course. During our data collection period from 2018–2020 (3 course iterations), a total of 481 graduating seniors participated in the capstone course, with 311 students in the Adult Medicine track, 126 students in the Surgery track, and 44 students in the Pediatric track. Four hundred and fifty-two students (94%) completed the course evaluation. In the Adult Medicine track, 298 students (96% response) completed the course evaluation. There was a statistically significant change in their self-assessed ratings of preparation to start internship from pre- to post-course, with the percentage of students rating themselves as moderately or very prepared increasing from 61.4% (n = 183) to 88.6% (n = 264) with a p-value of <0.001 (Figure 1). Of the students who participated in the Surgery track, 111 (88% response) completed the course evaluation. There was a similar statistically significant change, with the percentage of students rating themselves as moderately or very prepared increasing from 56.8% (n = 63) to 81.1% (n = 90) with a p-value of <0.001. In the Pediatric track, 43 students (98% response) completed the course evaluation, which revealed a similar trend, with those moderately or very prepared to start internship increasing from 32.6% (n = 14) to 83.7% (n = 36) with a p-value of 0.02. The analysis of each year of data did not demonstrate any significant differences in the change in self-assessed ratings between course iterations by year for the adult medicine and surgery tracks.

Figure 1. Retrospective pre-post learner self-assessed ratings of preparedness for residency and the performance of entrustable professional activities.

A similar change was noted for participants in all three tracks in self-assessed ratings of comfort from prior to after the course for performing basic procedures, managing core complaints, performing the initial triage of nursing pages, performing a patient handoff, and obtaining informed consent for basic procedures (Figure 1 and Appendix B). To highlight a few of the findings, for obtaining informed consent for basic procedures, in the adult medicine track, the change in the total percentage of students who rated themselves as moderately or very comfortable from pre- to post-course was 65.8% (n = 189) to 93.7% (n = 268) with a p-value of <0.001; in the surgery track, the change was 73.2% (n = 79) to 95.3% (n = 103) with a p-value of <0.001; in the pediatric track, the change was 53.5% (n = 23) to 93.0% (n = 40) with a p-value of 0.08. For the initial triage of nursing pages, in the adult medicine track, the change in the total percentage of students who rated themselves as moderately or very comfortable from before to after the course was 49.1% (n = 142) to 84.8% (n = 244) with a p-value of <0.001; in the surgery track, the change was 52.4% (n = 56) to 80.3% (n = 86) with a p-value of <0.001; in the pediatric track, the change was 44.2% (n = 19) to 81.4% (n = 35) with a p-value of 0.07. The analysis of each year of data did not demonstrate any significant differences in the change in self-assessed ratings for each EPA between course iterations by year for the adult medicine and surgery tracks.

For the questionnaire that was sent to graduates 6 months into their PGY-1 year, 75 (23%) of 325 possible respondents completed the survey (Table 1). Of the participants, 69.3% (n = 52) responded that the capstone course prepared them moderately or very well for day-to-day patient care as a resident, 69.3% (n = 52) responded that it prepared them moderately or very well for the management of basic complaints, 77.3% (n = 58) responded that it prepared them moderately or very well for the performance of patient handoffs, and 86.6% (n = 65) responded that it prepared them moderately or very well to obtain informed consent for basic procedures.

Table 1. Results of questionnaire to learners midway through the PGY-1 year.

For the open text question regarding the most effective components of the bootcamp course, 46 (61.3%) of the 75 respondents specified at least one course component: 21 individuals (45.7%) identified the basic procedures sessions, 13 individuals (28.3%) noted the patient handoff activities, 11 (23.9%) highlighted the informed consent activity, and 11 (23.9%) identified the high-fidelity simulation sessions addressing the management of basic complaints.

Of the 203 students who participated in the adult medicine track from 2019–2020, 200 students (99%) completed both the pre- and post-test knowledge assessments (Figure 2). The mean performance improved from 65.0% (SD 12.0%) to 77.5% (SD 9.2%) with a p-value of <0.001. The Cronbach's alpha for the knowledge assessment was 0.48.

Figure 2. Medical knowledge multiple choice question assessment, adult medicine track.

Discussion

This study showed that a wholly experiential EPA-based capstone course for assessment improved learner self-assessed preparedness for residency and provided valuable quantitative feedback for UME programs. The experiences and results from this course at our institution can inform the parallel efforts of many others across the nation to optimally prepare students for the high-stakes transition to residency training. In the literature, another, shorter capstone course addressing 3 EPAs also focused on assessment and demonstrated success in student mastery of these EPAs using a hands-on curriculum [Citation3]. One other medical school published a descriptive study without formal assessment that aimed to address mastery of all EPAs during the third and fourth years of medical school using simulation; their experience was well received among students and educators [Citation20]. One notable result of our study was the high degree of comfort that students felt obtaining informed consent after the course, with at least 93% in each track specifying that they were moderately or very comfortable with this activity. The AAMC pilot study had identified informed consent as one of the core EPAs least ready for trainee entrustment at the start of residency [Citation2].

The positive results from our study inform the question of how best to structure residency preparatory courses. We developed our bootcamp as a completely experiential learning course with assessment as the focal purpose. We provided formative assessment with each activity, but only after the students performed each EPA under direct observation. The rationale for this pedagogical approach is that the antecedent 3-plus years of medical school curriculum should have equipped students with the skills to perform many core EPAs by the start of the capstone course.

Using the retrospective pre-post approach in assessing students’ self-assessed ratings of confidence and preparedness to start internship, and of comfort with performing multiple EPAs, our results demonstrate clear improvement in these short-term Kirkpatrick Level 1 outcomes. However, it is both interesting and concerning that a large percentage of the senior students retrospectively rated themselves as slightly or not at all confident (adult medicine 42.6% [n = 127]; surgery 43.2% [n = 48]; pediatric 65.1% [n = 28]) or prepared (adult medicine 36.9% [n = 110]; surgery 42.3% [n = 47]; pediatric 67.5% [n = 29]) to start internship pre-course. We observed a similar finding with the students’ self-assessed ratings of comfort with performing the specific EPAs addressed by the course.

While these Kirkpatrick Level 1 outcomes rely on student self-assessment, the results of the MCQ knowledge assessment provide a more objective measure (Kirkpatrick Level 2) that aligns with the aforementioned retrospective pre-post findings. The students in the adult medicine track received a high-fidelity simulation curricular intervention that led to a statistically significant improvement in performance on the posttest. However, the mean performance of the students on the pretest was only 65% (SD 12%) with a range of 39% to 91%. This MCQ knowledge assessment addressed diagnostic and management decision-making for common adult complaints and conditions that all of the students in the adult medicine track will encounter during their PGY-1 year. Some of the variability in student performance may be a product of the diversity of specialties that are represented in the adult medicine track, which usually includes students matching into psychiatry, neurology, anesthesiology, emergency medicine, family medicine, ophthalmology, and internal medicine.

There are several additional explanations that might help to interpret these findings. A likely contributing factor is increasing student anxiety as the start of residency training draws closer. Additionally, as we situated this required bootcamp course in early March of the 4th year, and a majority of students had completed their more rigorous clinical requirements several months prior to the course, there could be a component of skills decay. A number of curricular groups have shared recommendations on specific coursework that should be completed by students entering different specialties [Citation21,Citation22], but the timing of such coursework during the 4th year may be equally important. Rather than the senior year building to a crescendo just before the residency interview period, followed by an extended duration of lower-intensity curricular experiences culminating in graduation, a case can be made for a biphasic approach to clinical intensity, with a ramping up toward the end of the 4th year in preparation for the start of residency. A final explanation is that our clinical curriculum needs to be improved in both content and structure to ensure that students are proficient and confident in key clinical skills by the start of the capstone course. This relates to the integration of the core EPAs into all years of the medical school curriculum in a graduated manner with frequent direct observation and assessment. Attempting to provide core education for the performance of EPAs during a capstone course without an integrated curriculum will likely be insufficient and too late. Thus, we believe that for core clinical skills addressed during capstone courses, the focus should be on assessment and not solely on content review.

Results from the survey administered to learners midway through the PGY-1 year may also be useful in informing curricular decisions. Because of the low response rate (23%), it is very difficult to make any clear inferences. However, the respondents noted that the capstone course prepared them most for the following clinical skills: obtaining informed consent, performing a patient handoff, and performing the initial triage of nursing pages. All of these are advanced skills integrating communication skills, medical knowledge, and clinical reasoning, requiring guided practice for mastery. These and other advanced skills should be identified and addressed earlier in the medical school curriculum utilizing principles of experiential learning with intentional direct observation and assessment. One such example is described by Vermylen and colleagues in the embedding of simulation-based mastery learning for breaking bad news into a medicine sub-internship [Citation23], a session that we have incorporated into later versions of this course.

There are several limitations to our study. The findings reflect the experience at only a single institution, albeit with multiple years of data. Additionally, the measurement involves mainly short-term, lower-level Kirkpatrick outcomes. The longer-term Kirkpatrick Level 1 outcome relied on a survey sent to all of the learners midway through their PGY-1 year, but the response rate was poor, and the findings could be biased. A more useful measure would be the performance of our graduating medical students, as assessed by core residency faculty using the ACGME milestones during their PGY-1 year, compared to interns from other medical schools. We hope to explore this measure with future iterations of our capstone course. Another limitation relates to the high resource requirement of the course; because the major aim of this course was assessment, we required a large number of instructors to provide direct observation of all participating students. Finally, we utilized checklists for formative feedback but did not study their interrater reliability, which limits the reliability of these results.

There are a number of future directions for consideration based on our study and experience. The first relates to the timing of the course: we anticipate moving the capstone course as close to graduation as possible to ensure that its impact on the transition to residency is maximal. The second consideration involves structuring the clinical curriculum to include spaced repetition of the performance and assessment of many of the core EPAs to prevent skills decay. This could include recurring simulation sessions scattered throughout the 4th year, of which students would be required to attend a certain number. The third consideration is to develop and adapt additional specialty-specific EPAs for the course, with sessions led by faculty experts in each specialty working with students who have matched into that specialty. Finally, another important consideration is performing an assessment of interrater reliability using the course checklists, particularly if summative assessment is identified as an important priority in the future.

In summary, we have demonstrated that an experiential learning capstone course to assess EPAs can be effective in both the short and long term in improving learner self-assessed ratings of readiness for residency training and the acquisition of medical knowledge. Based on our findings, we also emphasize the importance of integrating EPAs throughout the medical school curriculum as well as providing robust assessment during these capstone experiences to optimize the transition from student to resident.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

Data are available on Figshare, DOI 10.6084/m9.figshare.24956904.

Additional information

Funding

Data analysis was supported by the National Center for Advancing Translational Sciences [UCLA CTSI Grant Number UL1TR001881].

References

  • Englander R, Aschenbrener CA, Call SA, et al. Core entrustable professional activities for entering residency. MedEdPORTAL. 2014 [cited 2019 Dec 5]. Available from: https://www.mededportal.org/icollaborative/resource/887
  • Amiel JM, Andriole DA, Biskobing DM, et al. Revisiting the core entrustable professional activities for entering residency. Acad Med. 2021;96(7S):S14–S21. doi: 10.1097/ACM.0000000000004088
  • Salzman DH, McGaghie WC, Caprio TW, et al. A mastery learning capstone course to teach and assess components of three entrustable professional activities to graduating medical students. Teach Learn Med. 2019;31(2):186–10. doi: 10.1080/10401334.2018.1526689
  • Tischendorf J, O’Connor C, Alvarez M, et al. Mock paging and consult curriculum to prepare fourth-year medical students for medical internship. MedEdPORTAL. 2018;14:10708. doi: 10.15766/mep_2374-8265.10708
  • Lerner V, Higgins EE, Winkel A. Re-boot: simulation elective for medical students as preparation bootcamp for obstetrics and gynecology residency. Cureus. 2018;10(6):e2811. doi: 10.7759/cureus.2811
  • Elnicki DM, Gallagher S, Willett L, et al. Course offerings in the fourth year of medical school: how U.S. medical schools are preparing students for internship. Acad Med. 2015;90(10):1324–1330. doi: 10.1097/ACM.0000000000000796
  • Kirkpatrick DL, Kirkpatrick JD. Evaluating training programs: the four Levels. San Francisco: Berrett-Koehler Publishers; 2006.
  • Bontempo LJ, Frayha N, Dittmar PC. The internship preparation camp at the University of Maryland. Postgrad Med J. 2017;93(1095):8–14. doi: 10.1136/postgradmedj-2015-133882
  • Boehler ML, Schwind CJ, Markwell SJ, et al. Mock pages are a valid construct for assessment of clinical decision making and interprofessional communication. Ann Surg. 2017;265(1):116–121. doi: 10.1097/SLA.0000000000001575
  • Humphrey-Murto S, Varpio L, Wood TJ, et al. The use of the Delphi and other consensus group methods in medical education research: a review. Acad Med. 2017;92(10):1491–1498. doi: 10.1097/ACM.0000000000001812
  • Berg K, Riesenberg LA, Berg D, et al. The development of a validated checklist for adult lumbar puncture: preliminary results. Am J Med Qual. 2013;28(4):330–334. doi: 10.1177/1062860612463225
  • ACS/ASE medical student simulation-based surgical skills curriculum. [cited 2020 May 7]. Available from: http://web2.facs.org/medicalstudents/landing.cfm
  • Airway skills station competency checklist. Published 2007. [cited 2020 May 7]. Available from: http://ahainstructornetwork.americanheart.org/idc/groups/ahaecc-public/@wcm/@ecc/documents/downloadable/ucm_315939.pdf
  • Basic Life Support for Healthcare Providers. Published 2015. [cited 2020 May 7]. Available from: https://www.redcross.org/content/dam/redcross/atg/Landing_Pages/BLS/BLS_Handbook__Final_.pdf
  • De Champlain AF. Standard setting methods in medical education. In: Swanwick T, editor. Understanding medical education: evidence, theory and practice. 2nd ed. Oxford, UK: John Wiley & Sons; 2014. pp. 305–316.
  • Starmer AJ, Spector ND, Srivastava R, et al. Changes in medical errors after implementation of a handoff program. N Engl J Med. 2014;371(19):1803–1812. doi: 10.1056/NEJMsa1405556
  • Horwitz LI, Rand D, Staisiunas P, et al. Development of a handoff evaluation tool for shift-to-shift physician handoffs: the handoff CEX. J Hosp Med. 2013;8(4):191–200. doi: 10.1002/jhm.2023
  • Skeff KM, Stratos GA, Bergen MR. Evaluation of a medical-faculty development program - a comparison of traditional pre/post and retrospective pre/post self-assessment ratings. Eval Health Prof. 1992;15(3):350–366. doi: 10.1177/016327879201500307
  • Bhanji F, Gottesman R, de Grave W, et al. The retrospective pre-post: a practical method to evaluate learning from an educational program. Acad Emerg Med. 2013;19(2):189–194. doi: 10.1111/j.1553-2712.2011.01270.x
  • Herrigel DJ, Donovan C, Goodman E, et al. Simulation as a platform for development of entrustable professional activities: a modular, longitudinal approach. Cureus. 2020 [cited 2020 Oct 22];12(10):e11098. doi: 10.7759/cureus.11098
  • Lyss-Lerman P, Teherani A, Aagaard E, et al. What training is needed in the fourth year of medical school? Views of residency program directors. Acad Med. 2009;84(7):823–829. doi: 10.1097/ACM.0b013e3181a82426
  • Angus S, Vu TR, Halvorsen AJ. What skills should new internal medicine interns have in July? A national survey of internal medicine residency program directors. Acad Med. 2014;89(3):432–435. doi: 10.1097/ACM.0000000000000133
  • Vermylen JH, Wayne DB, Cohen ER, et al. Promoting readiness for residency: embedding simulation-based mastery learning for breaking bad news into the medicine sub-internship. Acad Med. 2020;95(7):1050–1056. doi: 10.1097/ACM.0000000000003210

Appendices

Appendix A. Example critical action item checklist for adult medicine track core condition

Appendix B.

Additional results of retrospective pre-post learner self-assessed ratings of preparedness for residency and the performance of Entrustable Professional Activities