
Feasibility and validity aspects of Entrustable Professional Activity (EPA)-based assessment in general practice training

Pages 69-76 | Received 26 Nov 2020, Accepted 26 Jun 2021, Published online: 20 Aug 2021

ABSTRACT

Introduction

Entrustable Professional Activities (EPAs) were developed to support the practical application of competency frameworks in postgraduate medical education (PGME) programmes. EPAs are used to assess trainees’ competence development by means of an entrustment decision, aiming to stimulate learning and independent practice. In this pilot study, we explore the feasibility and validity of EPA-based assessment in a General Practice (GP) training programme.

Methods

We used questionnaires to evaluate trainers’ and trainees’ experiences with the use of six EPAs for trainee learning, assessment and independent practice at the Out-of-Hours GP Center. Data were analysed quantitatively and qualitatively. Additionally, we examined the inter-item correlation between scores on EPA-based assessment and competency-based assessment using Spearman’s rho.

Results

EPA-based assessment provided opportunities for giving concrete feedback and substantiating competency-based assessment. No consistent correlation between EPA-based assessment and competency-based assessment could be detected. Only later in the training programme was a correlation found between the EPA scores and trainees’ degree of independence.

Discussion

Results of this pilot study confirm the theories behind EPAs, as well as earlier research on EPAs in the workplace regarding trainee learning, assessment and independent practice. An important limitation of this study was the COVID-19 pandemic, as it influenced the results through reduced inclusion and follow-up, and through the impact on the workplace and trainee learning possibilities. Further research is needed to determine how EPAs support independent practice of trainees, as well as the assessment of trainee competency development.

Introduction

Postgraduate medical education (PGME) programmes prepare trainees to become independently working medical specialists [Citation1]. Working independently in the workplace is important for the professional development of trainees [Citation2,Citation3]. Trainees should therefore increasingly be entrusted with performing clinical activities unsupervised during the course of their training programme [Citation4].

The process of gradually decreasing supervision and increasing the autonomy and responsibility of trainees can be supported by using Entrustable Professional Activities (EPAs) [Citation5–7]. EPAs were developed to support the practical application of a PGME programme’s competency framework in the workplace. Every EPA describes a clinical task that is performed in daily practice, with reference to the competencies needed to perform the task, and allows for concrete feedback on that task [Citation5,Citation7,Citation8]. Assessment of EPAs takes place through an entrustment decision, allowing trainees to grow from working under full supervision, via more ‘remote supervision’, to performing patient care independently without supervision from their trainer [Citation6–8]. Since EPAs help to prepare trainees for unsupervised practice, their description of the clinical task should provide a clear picture of the performance expectations and cover all aspects of (the competencies involved in) performing the task [Citation8,Citation9]. Only then can EPAs support learning, by helping trainees to formulate learning goals and supporting trainers in providing meaningful feedback [Citation8,Citation9].

So far, many PGME programmes have introduced EPAs in their curricula, but high-quality evidence on the actual use of EPAs by trainers and trainees in the workplace is lacking [Citation10–12]. Our aim with this pilot study is to explore the application of EPAs in practice. By focusing on feasibility aspects, related to the usability of EPAs in the workplace, and validity aspects, covering the correlation between EPA scores and competency scores and the value of EPAs for trainers and trainees, we hope to provide a basis for further research into the practical application of EPAs for trainee learning, assessment and independent practice. We conducted our study in a General Practice (GP) PGME programme.

Methods

Context

Dutch GP training is a three-year, competency-based programme that uses an adapted version of the CanMEDS competency framework: the ComBeL [Citation13,Citation14]. The ComBeL is an assessment tool that is applied on a regular basis to assess trainees’ competency development. In addition, there are 81 EPAs across 10 themes, reflecting the competencies against the background of daily clinical activities [Citation15]. The EPAs are currently used to support trainee learning and assessment [Citation13,Citation16–18].

In the Dutch healthcare system, GPs play a central role as gatekeepers [Citation19]. GP care is available 24 hours a day, 7 days a week. During the day, patients can see their own GP in the GP’s office. During evenings, nights and weekends, patients with urgent medical conditions can see a GP on duty at the Out-of-Hours General Practice Center (OOH GP Center). At the OOH GP Center, patients frequently present with urgent medical conditions and serious illnesses, while their medical history and medication use are largely unknown, as they are not the on-duty GP’s own patients [Citation20]. This makes working at an OOH GP Center more complex than working in the GP’s office. Dutch GP trainees are therefore specifically prepared for working at the OOH GP Center, using the EPAs in the theme ‘Emergency Care’ (Appendix 1, available online) as the basis for both their learning and their assessment [Citation17]. During the first year of training, trainees learn to work independently as consulting doctors at the OOH GP Center. Once trainees are deemed competent, usually by the end of their first year, trainers issue a declaration that allows trainees to perform consultations at the OOH GP Center independently [Citation20].

Participants

The study was executed at three of the eight Dutch GP training institutes (Leiden, Maastricht and Amsterdam). Trainees from the first year of training were eligible for participation, since they learn to perform consultations at the OOH GP Center independently during their first year of training. Evaluation of trainees’ fitness for independently performing consultations at the OOH GP Center at the end of the first year of training is based on six EPAs from the theme ‘Emergency Care’ (Appendix 1, available online). All trainees who started GP training in March 2019 and September 2019, as well as their trainers, were eligible for participation. Between July 2019 and March 2020, trainers and trainees from both cohorts received information and an invitation to participate by email from their local training institute. Once trainees had agreed to participate, their trainer was invited to participate. If the trainer also agreed to participate, the trainer-trainee couple was included in the study. Since this study focuses on the effects of EPAs on trainee learning, assessment and working independently, only trainer-trainee couples were eligible for participation.

Study design and data collection

To explore the application of EPAs in practice, we evaluated feasibility and validity aspects. For feasibility aspects, we used an evaluation questionnaire to collect data on trainers’ (Appendix 2, available online) and trainees’ (Appendices 3 and 4, available online) experiences with EPAs. For validity aspects, we used responsiveness, educational validity, and relations with other variables. Responsiveness was evaluated by comparing scores on the EPAs for the 6th and the 9th month of training (Appendix 1, available online). Educational validity was evaluated using data from the evaluation questionnaires (Appendices 2, 3 and 4, available online) on learning goals formulated after EPA-based assessment and on the usability of EPAs in workplace-based medical education. For the relations with other variables, the EPA-based assessment (Appendix 1, available online) was compared with the overall level of independence at the OOH GP Center (Appendix 1, available online) to evaluate whether EPA scores aligned with trainees’ level of independence there. Additionally, the EPA scores were compared with the competency-based assessment (ComBeL) to evaluate the correlation between scores on EPAs and related competencies. Data collection for this study was aligned with the standard assessment programme, in which formal ComBeL assessment takes place in the 6th and the 9th month. Data were collected using Castor EDC [Citation21].

EPAs and the overall level of independence at the OOH GP Center were assessed on a 4-point scale ranging from one (‘observing the trainer’) to four (‘performing consultations independently with supervision from the trainer remotely available’) [Citation20]. For the assessment of the ComBeL, scores range from one (‘very important point for attention’) to seven (‘hold on to’).

Data analysis

Quantitative data regarding feasibility and validity aspects were analysed using SPSS [Citation22]. Since data were not normally distributed, we reported medians and performed a comparative analysis using non-parametric tests. Participant characteristics were analysed using descriptive statistics.

Qualitative data were evaluated using MAXQDA [Citation23]. Within this study, we adopted a constructivist approach, meaning that we consider truth to be subjective, constructed by individuals based on experiences, context and interactions with others [Citation24]. For the qualitative analysis on feasibility and validity, we performed a conventional content analysis [Citation25]. This analysis was performed by LB; data for trainers and trainees were analysed separately. An inductive coding process was applied to all free-text answers in the questionnaires. Next, a codebook was created to identify all the emerging themes regarding the feasibility and educational validity of EPA-based assessment. The research team extensively discussed the codebook in order to group the themes into categories. This process was supported by the use of memos and diagrams [Citation25–27].

Feasibility

Time invested was analysed using descriptive statistics, complemented with the Wilcoxon signed-rank test to test for differences between the 6th and the 9th month of the training programme. Other data on feasibility were analysed qualitatively.
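
As a minimal illustration of this analysis (not the authors’ SPSS procedure), the paired comparison could be run as follows in Python with scipy; the scores, category labels and variable names below are hypothetical, invented for this sketch.

    # Minimal sketch of a paired Wilcoxon signed-rank test on time invested
    # per EPA-based assessment at the 6th versus the 9th month of training.
    # The data are hypothetical ordinal categories (e.g. 1 = <5 min,
    # 2 = 5-10 min, 3 = 10-15 min, 4 = >15 min), not the study data.
    from scipy.stats import wilcoxon

    time_month6 = [3, 3, 2, 4, 3, 2, 3, 3, 4, 2, 3, 3, 2, 3, 3, 4, 3]
    time_month9 = [2, 3, 3, 3, 2, 2, 4, 2, 3, 2, 2, 3, 3, 2, 3, 3, 2]

    # Pairs with no change are dropped by the default zero_method.
    stat, p = wilcoxon(time_month6, time_month9)
    print(f"Wilcoxon W = {stat}, p = {p:.3f}")
    # A p-value at or above 0.05, as the study found for time invested
    # (p = 0.608), means the decrease did not reach significance.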

Validity

Responsiveness was determined by comparing the median EPA scores and the median overall level of independence at the OOH GP Center between the 6th and the 9th month, using a Wilcoxon signed-rank test. Data related to the proposed use and interpretation of EPA-based assessment were accommodated under the rubric of educational validity, following Cook [Citation28,Citation29] and Downing [Citation30]. Educational validity was analysed qualitatively, based on the feedback from trainers and trainees on the use of EPAs. The inter-item correlation between the individual EPA scores and the independency score for trainees working at the OOH GP Center was determined using Spearman’s rho. Additionally, the inter-item correlation between the EPA scores and the ComBeL scores was calculated, also using Spearman’s rho.

A correlation >0.2 was considered weak, >0.3 moderate, >0.4 strong, and >0.7 very strong [Citation31].
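
As a minimal sketch of how such an inter-item correlation could be computed and labelled against these thresholds (using Python’s scipy in place of the SPSS procedure; all scores and names below are hypothetical):

    # Minimal sketch: Spearman's rho between an EPA score and the overall
    # independency score, labelled with the thresholds from the text
    # (>0.2 weak, >0.3 moderate, >0.4 strong, >0.7 very strong).
    # The 4-point scores below are invented, not the study data.
    from scipy.stats import spearmanr

    epa_scores          = [2, 3, 3, 4, 2, 3, 4, 4, 3, 2, 3, 4]
    independency_scores = [2, 3, 4, 4, 2, 3, 4, 4, 3, 3, 3, 4]

    rho, p = spearmanr(epa_scores, independency_scores)

    def strength(r):
        """Map an absolute correlation to the verbal labels used above."""
        r = abs(r)
        if r > 0.7:
            return "very strong"
        if r > 0.4:
            return "strong"
        if r > 0.3:
            return "moderate"
        if r > 0.2:
            return "weak"
        return "negligible"

    print(f"rho = {rho:.2f}, p = {p:.3f} ({strength(rho)})")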

Results

We included a total of 21 trainer-trainee couples. Quotes that support the results are shown in Table 1.

Table 1. Quotes from participants

In total, 34 EPA-based assessments were completed, 17 trainers completed them at the 6th month of training and 17 trainers at the 9th month of training. In total, 20 ComBeL assessments were available, 8 at the 6th month of training and 12 at the 9th month of training.

Feasibility

At the 6th month of training, most trainers (52.9%, n = 9) needed 10–15 minutes to complete the EPAs. At the 9th month, most trainers (52.9%, n = 9) needed 5–10 minutes; this decrease did not reach statistical significance (p = 0.608). Some trainers needed time to prepare the assessment before it could be discussed with the trainee. They preferred to complete the assessment shortly after a shift at the OOH GP Center, so that the shift could be evaluated in detail.

In 10 out of 34 EPA assessments (29.4%), trainers had difficulty assessing one or more EPAs. The perceived difficulties were related either to an unclear EPA or to a lack of sufficient information for assessing trainee functioning. Some trainers also mentioned that specific situations had not yet been sufficiently encountered due to the COVID-19 pandemic.

Other trainers reported difficulties in assigning independence in specific circumstances, since they felt that independence should be restricted to more experienced trainees. As a result, some tasks were not practised with trainees. Additionally, trainers felt that the independency score scale was not sufficiently nuanced, which made it difficult for them to assess the EPAs.

Validity

Responsiveness

Trainees in the 9th month of the programme showed an increase in independency scores on all EPAs compared with the 6th month.

Educational validity

About half of the trainers thought that EPAs provided a more specific evaluation of trainee functioning at the OOH GP Center, offering more concrete examples and situations for assessment and thereby allowing for more depth in trainee assessment and more targeted feedback. Some trainers felt that EPA-based assessment provided them with additional information for competency-based assessment. Trainees recognised that EPA-based assessment was more concrete, leading to more specific feedback that adapts to conditions and circumstances. This helped trainees to set more concrete and specific learning goals and to recognise their strengths and weaknesses.

Relation with other variables

Table 2 shows the correlations between the independency score for trainees working at the OOH GP Center and the scores for the individual EPAs. In the 6th month of training, three EPAs showed a strong to very strong correlation with the independency score, meaning that scores on these three EPAs correlated well with the trainees’ level of independence at the OOH GP Center. In the 9th month, all EPAs showed a strong to very strong correlation with the independency score. Four correlation coefficients (CCs) in the 6th month and three CCs in the 9th month showed a tendency towards statistical significance.

Table 2. Correlation between independency score for working at OOH GP Center and the EPA scores

Table 3 (available online) shows the correlation between the EPA scores and the ComBeL scores on the competencies related to the EPAs, displayed as CCs. There was no consistency among the CCs in the 6th and 9th month of training, meaning that scores on EPAs did not consistently correlate with scores on the related competencies. Seven CCs (13.5%) showed a tendency towards statistical significance.

Table 3. Correlation between independency scores per EPA and the ComBeL scores for the CanMEDS roles related to the EPA

Discussion

EPA-based assessment provides a more specific evaluation of trainee functioning, offering opportunities for more targeted feedback and more specific learning goals than competency-based assessment. EPA scores correlated with the degree of independence of trainees’ functioning at the OOH GP Center. The results of this pilot study confirm the theories behind EPAs, as well as earlier evidence on EPAs in the workplace with regard to trainee learning, assessment and independent practice. However, a relation between EPA scores and scores on the related competencies could not be detected. Further research is therefore needed to assess whether EPAs accurately reflect the assessment of competency development.

Trainee learning

Trainees indicated that feedback based on EPAs is more concrete, leading to more specific learning goals that move with conditions and circumstances, providing them with opportunities to adapt their learning to the work performed in the workplace and to their own stage of competency development. These results support the assumption that EPAs improve feedback on task performance [Citation9] and the quality of learning goals, and so improve workplace-based learning (37). It has been established that high-quality learning goals are important, specific, measurable, accountable, realistic and performable within a given time frame (ISMART) [Citation32–34].

Assessment

The amount of time needed to complete the EPAs decreased, but this decrease did not reach significance. About a third of the trainers experienced difficulties due to unclear EPAs, an insufficiently nuanced EPA score scale, or a lack of insight into trainee functioning on specific EPAs. The majority of the trainers, however, did think that EPAs provided them with opportunities to formulate more targeted feedback for their trainees. Additionally, both trainers and trainees found EPAs helpful in substantiating the assessment of trainee competency development. This study thereby underscores the literature on EPAs supporting the practical application of competency frameworks in the workplace [Citation7]. However, in this pilot study with only a limited number of participants, no clear correlation was detected between the scores on the EPAs and the scores on the competencies related to those EPAs. As a result, it remains unclear whether EPA-based assessment actually gives a good reflection of trainee competency development, even though this should be another main feature of EPAs [Citation7]. Trainees in this study were only in their first year of training; in more advanced trainees, a correlation between EPA scores and scores on the related competencies could become more clearly recognisable.

Independent practice

In the 6th month of the programme, only three EPAs showed a strong correlation between the EPA score and the overall independency score for trainees working at the OOH GP Center, while in the 9th month all EPAs showed a strong correlation with this score. Trainers in our study stated that they had difficulties entrusting non-advanced (i.e. first-year) trainees with unsupervised practice at the OOH GP Center, or involving them in specific parts of the patient care described in the EPAs, even when their trainee showed sufficient skills. It therefore seems that trainees’ proficiency levels, as determined using the EPAs, are not always taken into account when a trainer decides whether or not to entrust trainees with unsupervised practice at the OOH GP Center. This, though, is exactly what EPAs are meant for: providing trainees with the right amount of independence for their proficiency level [Citation5,Citation7,Citation8]. Although ample literature is available on how trainers entrust trainees with independent patient care and how EPAs could support this process [Citation4,Citation35–40], it has also been recognised that entrusting independent patient care to trainees is a complex process in which many factors are involved [Citation4,Citation37,Citation40–44]. Task-related factors, which are the factors that may be influenced by EPA-based assessment, form only a small part of the entrustment decision [Citation41,Citation44,Citation45], which may explain the relatively low impact of EPAs on the opportunities for trainees to perform patient care independently. Sebok-Syer et al. [Citation46] also recognised earlier that independent practice is not an accomplished fact for many trainees, even with the use of EPAs. This means that trainees’ opportunities to work independently during training are still hampered. When trainees do not get the opportunity to experience what it is like to bear responsibility for patient care during training, it is difficult to determine whether they are actually ready for unsupervised practice [Citation3,Citation47].

Limitations

During the course of this study, we were confronted with the COVID-19 pandemic. The pandemic not only limited the number of participants and the follow-up in our study, making data interpretation and the recognition of underlying connections more tentative, but also affected trainees’ learning in the workplace. We therefore assume that our results have been strongly influenced by the COVID-19 pandemic.

Other limitations might also have influenced the results. The study was performed within the Dutch GP training programme and evaluated only EPAs related to emergency care. Only first-year trainees were included, which may mean that not all aspects of EPA-based assessment were properly addressed and that variation was low. Even though this study provides relevant information on the applicability of EPAs in a PGME programme, the generalisability of the findings to other training programmes may be limited.

Future research

Further research is needed to evaluate the feasibility and validity of EPA-based assessment. By including trainees in all years of training, and from various PGME programmes, a better understanding of the value of EPAs in all phases of PGME can be gained. Qualitative research on the use of EPAs could provide a richer insight into the feasibility of EPA-based assessment, as well as a better understanding of its educational validity.

Implications for practice

Implementing EPAs in a PGME curriculum enriches trainee learning and assessment by creating opportunities for a more specific evaluation of trainee functioning and more targeted feedback. Trainees are stimulated to formulate more specific learning goals, adapted to their personal situation. Although it remains unclear how EPAs contribute to providing patient care independently, this enrichment of trainee learning and assessment may be a welcome addition to the currently available assessment methods.

Conclusion

Trainers and trainees find EPAs of added value for trainee learning and for substantiating competency-based assessment. Since EPA-based assessment is more specific to particular tasks, feedback for trainees is more targeted and specific, helping them to develop more specific learning goals. As a result, trainee learning improves. The added value of EPAs for trainees’ independent practice, as well as for the assessment of competency development, could not be determined from our pilot study results.

Ethical approval

This study was approved by the Ethical Review Board of the Netherlands Association for Medical Education (NVMO, file number 2018.6.4). The participants were fully informed of the purpose of the study, participation was entirely voluntary, and individuals were able to withdraw at any time without having to give a reason. All data were collected, stored and processed in pseudonymised form. Prior to starting the questionnaire, all participants signed an informed consent form. Participants were offered a gift card worth €15.00, which they could donate to a charity of their own choice.

Supplemental material

Supplemental data for this article can be accessed online.

Acknowledgments

The authors thank the participating trainers and trainees for their willingness to contribute to the study.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

Data are available from the authors upon reasonable request.


Funding

This work was supported by the Netherlands Organisation for Health Research and Development [839130004].

References

  • Dornan T, Boshuizen H, King N, et al. Experience-based learning: a model linking the processes and outcomes of medical students’ workplace learning. Med Educ. 2007 Jan;41(1):84–91.
  • Van der Zwet J, Zwietering PJ, Teunissen PW, et al. Workplace learning from a socio-cultural perspective: creating developmental space during the general practice clerkship. Adv Health Sci Educ. 2011;16:359–373.
  • Hirsh DA, Holmboe ES, Ten Cate O. Time to trust: longitudinal integrated clerkships and entrustable professional activities. Acad Med. 2014 Feb;89(2):201–204.
  • Ten Cate OTJ, Hart D, Ankel F, et al. Entrustment decision making in clinical training. Acad Med. 2016;91(2).
  • Ten Cate OTJ. Entrustability of professional activities and competency-based training. Med Educ. 2005 Dec;39(12):1176–1177.
  • Ten Cate OTJ. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013 Mar;5(1):157–158.
  • Ten Cate OTJ, Scheele F. Viewpoint: competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542–547.
  • Ten Cate OTJ, Chen HC, Hoff RG, et al. Curriculum development for the workplace using Entrustable Professional Activities (EPAs): AMEE guide No. 99. Med Teach. 2015 Nov;37(11):983–1002.
  • Peters H, Holzhausen Y, Boscardin C, et al. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Med Teach. 2017;39(8):802–807.
  • O’Dowd E, Lydon S, O’Connor P, et al. A systematic review of 7 years of research on entrustable professional activities in graduate medical education, 2011–2018. Med Educ. 2019;53:234–249.
  • Shorey S, Ching Lau T, Tiang Lau S, et al. Entrustable professional activities in health care education: a scoping review. Med Educ. 2019;53:766–777.
  • Shaughnessy AF, Sparks J, Cohen-Osher M, et al. Entrustable professional activities in family medicine. J Grad Med Educ. 2013 Mar;5(1):112–118.
  • Huisartsopleiding Nederland. Nationwide assessment plan. (Available in Dutch: Landelijk Toetsplan 2016). Huisartsopleiding Nederland; 2016.
  • Competency profile and final objectives of the general practitioner. (Available in Dutch: Competentieprofiel en eindtermen van de huisarts). The Netherlands: Werkgroep Actualiseren Eindtermen en Competenties.
  • Huisartsopleiding Nederland. What is a theme, and what is an Entrustable Professional Activity (EPA)? (Available in Dutch: Wat is een thema en wat is een Kenmerkende Beroepsactiviteit (KBA)?). 2016 [cited 2016 Dec 13]. Available from: https://www.huisartsopleiding.nl/regelgeving/landelijk-opleidingsplan/thema-s-en-kba-s/19-regelgeving/98-1-wat-is-een-thema-en-wat-is-een-kenmerkende-beroepsactiviteit-kba
  • Huisartsopleiding Nederland. Towards a new national training plan for general practice. (Available in Dutch: Naar een Nieuw Landelijk Opleidingsplan Huisartsgeneeskunde) [cited 2016 Apr 26]. Available from: http://www.huisartsopleiding.nl/regelgeving/landelijk-opleidingsplan
  • Huisartsopleiding Nederland. Themes and EPA’s. (Available in Dutch: Thema’s en KBA’s) 2016 [cited 2017 Jan 27]. Available from: https://www.huisartsopleiding.nl/opleiding/thema-s-en-kba-s
  • Huisartsopleiding Nederland. National training plan for the General Practice training programme. (Available in Dutch: Landelijk Opleidingsplan voor de opleiding tot huisarts). Utrecht: Huisartsopleiding Nederland; 2017.
  • Zorgprisma Publiek. General practice care. (Available in Dutch: Huisartsenzorg). 2015 [cited 2018 Aug 20]. Available from: https://www.zorgprismapubliek.nl/informatie-over/huisartsenzorg/huisartsenzorg/
  • Huisartsopleiding Nederland. Trainees at the out-of-hours GP center. (Available in Dutch: AIOS op de huisartsenpost. Leidraad voor het leren dienstdoen.). Utrecht: Huisartsopleiding Nederland; 2016.
  • Castor EDC. Castor electronic data capture. 2019.
  • IBM Corp. IBM SPSS Statistics for Windows, version 24.0. Armonk, NY: IBM Corp; 2016.
  • MAXQDA, software for qualitative data analysis. Berlin, Germany: VERBI Software – Consult – Sozialforschung GmbH; 1989–2016.
  • Tavakol M, Sandars J. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: part 1. Med Teach. 2014;36(9):746–756.
  • Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–1288.
  • Pope C, Ziebland S, Mays N. Qualitative research in health care. Analysing qualitative data. BMJ. 2000;320:114–116.
  • Boeije H. Analysis in qualitative research. (Available in Dutch: Analyseren in kwalitatief onderzoek. Denken en doen). 2nd ed. Den Haag: Boom Lemma uitgevers; 2014.
  • Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119:166.
  • Cook DA, Zendejas B, Hamstra SJ, et al. What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract. 2014;19:233–250.
  • Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ. 2003;37:830–837.
  • Cohen J. Statistical power analysis for the behavioral sciences. Mahwah, USA: Lawrence Erlbaum Associates, Inc; 1977.
  • Lawlor KB, Hornyak MJ. SMART goals: how the application of SMART goals can contribute to achievement of student learning outcomes. Dev Bus Simul Experiential Learn. 2012;39:259–267.
  • Morrison M. History of SMART objectives. Rapid Bus Improv. 2010. [cited 2020 Nov 03]. Available from: https://rapidbi.com/history-of-smart-objectives/
  • Lockspeiser TM, Schmitter PA, Lane L, et al. A validated rubric for scoring learning goals. MedEdPORTAL. 2013. DOI:10.15766/mep_2374-8265.9369.
  • Dijksterhuis MG, Voorhuis M, Teunissen PW, et al. Assessment of competence and progressive independence in postgraduate clinical training. Med Educ. 2009 Dec;43(12):1156–1165.
  • Sterkenburg A, Barach P, Kalkman C, et al. When do supervising physicians decide to entrust residents with unsupervised tasks? Acad Med. 2010;85(9):1408–1417.
  • Choo KJ, Arora VM, Barach P, et al. How do supervising physicians decide to entrust residents with unsupervised tasks? A qualitative analysis. J Hosp Med. 2014 Mar;9(3):169–175.
  • Warm EJ, Mathis BR, Held JD, et al. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med. 2014 Aug;29(8):1177–1182.
  • Ten Cate OTJ. Entrustment as assessment: recognizing the ability, the right, and the duty to act. J Grad Med Educ. 2016;8(2):261–262.
  • Sagasser MH, Fluit CRMG, van Weel C, et al. How entrustment is informed by holistic judgements across time in a family medicine residency program: an ethnographic nonparticipant observational study. Acad Med. 2017;92(6):792–799.
  • Hauer KE, Ten Cate O, Boscardin C, et al. Understanding trust as an essential element of trainee supervision and learning in the workplace. Adv Health Sci Educ Theory Pract. 2014 Aug;19(3):435–456.
  • Ten Cate OTJ. Entrustment decision: bringing the patient into the equation. Acad Med. 2017;92(6):736–738.
  • Ten Cate OTJ, Chen HC. The ingredients of a rich entrustment decision. Med Teach. 2020;42(12):1413–1420. Epub.
  • Bonnie LHA, Visser MRM, Kramer AWM, et al. The mutual trust relationship between trainers and trainees in a workplace-based medical training program. BMJ Open. 2020;10:e036593.
  • Hauer KE, Oza SK, Kogan JR, et al. How clinical supervisors develop trust in their trainees: a qualitative study. Med Educ. 2015 Aug;49(8):783–795.
  • Sebok-Syer SS, Chahine S, Watling CJ, et al. Considering the interdependence of clinical performance: implications for assessment and entrustment. Med Educ. 2018;52(9):970–980.
  • Touchie C, Ten Cate OTJ. The promise, perils, problems and progress of competency-based medical education. Med Educ. 2016;50(1):93–100.