Research Article

Chinese doctors’ views on workplace-based assessment: trainee and supervisor perspectives of the mini-CEX

Article: 1869393 | Received 03 Sep 2020, Accepted 22 Dec 2020, Published online: 31 Dec 2020

ABSTRACT

Purpose: This study investigated whether the mini-clinical evaluation exercise (mini-CEX) has been successfully integrated into the Chinese context, following its introduction as part of the national general training programme.

Materials and methods: Online questionnaires (N = 91) and interviews (N = 22) were conducted with Year 1 trainee doctors and clinical supervisors at a cancer hospital in China to explore users’ experiences, attitudes and opinions of the mini-CEX.

Results: Trainees were more likely than supervisors to report understanding the purpose of the mini-CEX and to agree that it encouraged reflection and helped improve overall performance. Both trainees and supervisors felt that it provided a framework for learning, that it was useful in identifying underperformance, and that it informed learning progression. Groups were equally positive about the commitment of their counterpart in the process and valued the focus on detailed feedback. It was perceived as cultivating the learner–teacher relationship. Overall, both groups felt they ‘bought in’ to using the mini-CEX. However, concerns were raised about the subjectivity of ratings and the lack of benchmarking against expected standards of care.

Conclusions: Chinese trainees and supervisors generally perceived the mini-CEX as an acceptable and valuable medical training tool, although both groups suggested enhancements to improve its efficacy.

Introduction

The nationwide implementation of the Standardised Resident Training Program in 2017 [Citation1] introduced a range of workplace-based assessments (WBAs) into Chinese postgraduate medical education [Citation2]. The ‘mini-CEX’ (mini clinical evaluation exercise) is one such assessment [Citation3,Citation4]: it involves direct observation of a trainee–patient encounter, with ratings and feedback given by the supervisor across domains such as history-taking, physical examination, communication, clinical judgement, professionalism and organisation. With strong theoretical validity, the mini-CEX is used internationally [Citation5,Citation6].

When implemented properly, the mini-CEX has positive educational value in clinical training [Citation7,Citation8], but outcomes can be constrained by trainee and supervisor knowledge and attitudes [Citation9]. Acceptance of the mini-CEX differs among users [Citation10,Citation11], but few studies have examined concordance of trainee and supervisor perspectives [Citation12,Citation13].

Furthermore, the acceptability of educational innovation may determine the utility of the mini-CEX in new cultural contexts [Citation14,Citation15]. Studies in Southeast Asia have indicated variability in shared understanding and acceptance of the mini-CEX [Citation16,Citation17]. Little is known about how the mini-CEX is implemented in China, as few studies have discussed its validity and reliability there [Citation18,Citation19]. Because the mini-CEX was designed for use in Western healthcare education, its acceptability and use in practice in the Chinese context warrant further investigation.

This study therefore aimed to investigate perceptions of the mini-CEX among Chinese trainees and supervisors, particularly considering users’ understanding of this assessment and attitudes towards it.

Materials and methods

Participants

All Year 1 trainees and supervisors of Year 1 trainees at a teaching hospital in southeast China were invited to participate. Ninety-one doctors participated: 43/75 (57%) trainees and 48/120 (40%) supervisors; the groups were 53% and 63% male, with median ages of 25 and 52 years, respectively. Participants were based in radiation oncology (38%), surgery (31%), medicine (13%), anaesthetics (12%), gynaecology (3%), radiology (1%) and ultrasound (1%). Supervisors comprised 26 speciality registrars (years 4–6 following primary medical qualification), 17 consultants (minimum 7 years post-qualification) and 5 chief consultants (minimum 10 years post-qualification). A convenience sub-sample of 22 participants was recruited for interviews: 11 trainees and 11 supervisors; these groups were 5/11 and 6/11 male, with median ages of 26 and 38 years, respectively. The interviewed supervisors were six speciality registrars and five consultants (Table 1).

Table 1. Participant demographics

Setting

The host institution was a teaching hospital specialising in the radiological, surgical and medical treatment of adult cancer, situated in one of the most developed cities in mainland China. It served a mixed urban and rural population comprising demographically and ethnically diverse communities. In 2017, the institution introduced WBAs as a supplement to the annual examination, which consisted of a knowledge test and an Objective Structured Clinical Examination (OSCE). The teaching faculty was therefore offered training in WBAs, including the mini-CEX, on an annual basis.

Measures

A 15-item questionnaire was designed to examine trainees’ and supervisors’ experiences, attitudes and understanding of the mini-CEX, including questions with established high reliability adapted from previous studies [Citation20,Citation21]. The questionnaire showed good internal consistency, with a Cronbach’s alpha of 0.95 for both groups [Citation22].
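
As an illustration only (this is not the authors’ analysis code, and the response matrix is a fabricated placeholder), the sketch below shows how the internal consistency of a multi-item Likert questionnaire can be estimated with Cronbach’s alpha.

```python
# Minimal sketch, assuming a respondents x items matrix of Likert scores.
# Data here are synthetic; they do not reproduce the study's alpha of 0.95.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point responses: 10 respondents, 15 items
rng = np.random.default_rng(0)
base = rng.integers(2, 5, size=(10, 1))                       # per-respondent tendency
responses = np.clip(base + rng.integers(-1, 2, size=(10, 15)), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```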

Semi-structured interview schedules comprising 9–12 questions, designed for a 30-minute interview, were devised to explore participants’ perspectives in more depth [Citation9]. The measures were pilot tested with 12 volunteers (6 trainees, 6 supervisors).

Procedure

Participant recruitment was conducted through advertisements (via all-staff email, teaching sessions and staff noticeboards) providing a link to the online questionnaire. An invitation to participate in the interviews was included at the end of the questionnaire; volunteers were recruited until the requisite number was reached. Interviews were conducted in a meeting room at the host institution, scheduled as convenient for the participants and audio-recorded with permission.

Qualitative analysis of the interview data was undertaken using the principles of thematic analysis described by Braun and Clarke [Citation23]. The interviews were transcribed verbatim by a researcher (YL) to capture the richness of the data set, and the transcripts were read and re-read to identify latent themes reflecting participants’ perceptions, interpretations and attitudes. Initial codes were extracted and free-coded in NVivo before the data were actively searched for themes, and codes were grouped into themes by consensus (YL and LN).

Ethics and data processing

The project was approved by the UCL Ethics Committee and the Ethics Committee of the Affiliated Cancer Hospital and Institute of Guangzhou Medical University. All participants provided written informed consent. Questionnaires were returned anonymously. Audio recordings of the interviews were deleted immediately after transcription. The analysed data contained no identifiable information.

Results

Trainee and supervisor experience of the mini-CEX

The majority of trainees (35/43, 81%) reported having had no more than two previous mini-CEX assessments, whilst supervisors had more experience (Table 2).

Table 2. Participants’ experience of the mini-CEX

Trainee and supervisor perceptions of the mini-CEX from the questionnaire

Trainees and supervisors differed in their responses (Mann-Whitney U, z = 3.6, p < 0.001). Trainees were more likely to report a clear understanding of the purpose of the mini-CEX (60% ‘agree’ or ‘strongly agree’) compared to supervisors (48%) (Table 3). On a practical level, trainees were also more likely to report that supervisors initiated the mini-CEX (79% trainees, 48% supervisors) and that there was sufficient time for each assessment (70% trainees, 48% supervisors). The groups showed similar levels of agreement that the patient encounters were directly observed (65% trainees, 67% supervisors) and that feedback was provided after each encounter (70% trainees, 69% supervisors), although supervisors were more likely to report that feedback was immediate (75% supervisors, 65% trainees). The majority of participants in both groups reported that feedback was specific (81% trainees, 75% supervisors) and highlighted weaknesses in performance (74% trainees, 69% supervisors).
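
For readers unfamiliar with the test reported above, the sketch below shows how a Mann-Whitney U comparison of two groups’ Likert-scale responses can be run and expressed as a z score. The scores are fabricated placeholders, not the study data, and the group sizes simply mirror the sample sizes mentioned in the Methods.

```python
# Minimal sketch, assuming ordinal (Likert) agreement scores for two groups.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
trainees = rng.integers(3, 6, size=43)     # hypothetical 5-point scores
supervisors = rng.integers(2, 6, size=48)  # hypothetical 5-point scores

u_stat, p_value = mannwhitneyu(trainees, supervisors, alternative="two-sided")

# Normal approximation to express U as a z score (tie correction omitted)
n1, n2 = len(trainees), len(supervisors)
mu_u = n1 * n2 / 2
sigma_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
z = (u_stat - mu_u) / sigma_u

print(f"U = {u_stat:.0f}, z = {z:.2f}, p = {p_value:.4f}")
```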

Trainees were more likely to report that the mini-CEX encouraged them to reflect (77% trainees, 60% supervisors) and helped to improve their performance (74% trainees, 56% supervisors). The majority of both groups felt that the score reflected the quality of the trainee’s performance (65% trainees, 69% supervisors), that the mini-CEX was useful in identifying underperformance (77% trainees, 70% supervisors), and that it informed learning progression (74% trainees, 69% supervisors). The groups showed similar levels of agreement that their counterpart in the process showed commitment during the assessment (67% trainees, 69% supervisors) and that overall, they ‘bought in’ to the use of the mini-CEX (70% trainees, 69% supervisors).

Trainee and supervisor perceptions of the mini-CEX from the interviews

The themes related to perceptions of the mini-CEX, its perceived educational impact, and attitudes towards its acceptability (Table 4).

Table 3. Trainee and supervisor perspectives of the mini-CEX

Table 4. Themes identified from the trainee and supervisor interviews

Perceptions of the mini-CEX

Participants reported diverse perceptions of the purpose of the mini-CEX, in terms of its various roles as an aid to learning, trainee progression or administrative monitoring. Trainees and supervisors expressed concerns about the reliability of scoring, noting the subjectivity of ratings, and that overly generous scores may be given as a means of encouraging trainees. Interviewees commonly highlighted the need for further training and benchmarking of scores to expected standards.

There was general consensus that initiation of assessment by mini-CEX was the responsibility of the clinical supervisor, although there were diverse views about its timing. This partly reflected supervisors’ views on its formative role, e.g. as a periodic check on progress, or to review the standard of performance attained towards the end of an attachment. However, service pressures also presented challenges to supervisors’ availability.

Almost every interviewee confirmed that trainees were always observed and that observations were used to generate feedback. While supervisors used observation to identify specific areas for improvement, trainees felt that being watched helped to create a sense of safety for learning.

Similarly, almost all interviewees agreed that feedback was given after each patient encounter, and all confirmed that feedback was honest. The majority of interviewees reported that feedback was detailed and specific, often with explicit examples. It was accepted that a key aim of feedback was to correct errors in practice, delivered within a positive and encouraging approach.

Educational impact of mini-CEX

Interviewees consistently described the mini-CEX as a framework for learning and teaching, particularly noting its comprehensiveness and value in ensuring that all domains were covered. Positive benefits were identified in terms of its effects in stimulating reflection, strengthening the learner-teacher relationship, providing a sense of personal achievement, and aligning learning with clinical practice.

Acceptability of mini-CEX

No trainees expressed negative attitudes towards the assessment, although one reported that some supervisors were ‘sloppy’ in their approach. Two supervisors had misgivings about its structured approach and focus on a single encounter. Generally, interviewees reported ‘engaged commitment’ and expressed a willingness to ‘try it more’. Four interviewees expressed a neutral stance, feeling the mini-CEX was generally acceptable but that some refinement (such as standardisation) was needed.

Discussion

Attitudes towards the mini-CEX

The present study found a strongly positive attitude towards the mini-CEX among the majority of trainees and supervisors. Trainees valued the mini-CEX more than supervisors, perhaps because of its direct impact on their training, which echoes some earlier studies [Citation20,Citation24,Citation25]. Although some investigators have reported poor engagement with the mini-CEX [Citation21,Citation26], the majority of participants perceived their counterparts as showing a committed attitude. This may depend on who is chosen as the assessor [Citation12,Citation21]. In this context, the clinical supervisor is the default assessor, and the pre-existing relationship may enhance commitment. Traditional Chinese culture values a close bond between learner and teacher [Citation27]. The present study found that participants viewed organising and leading the mini-CEX as the supervisor’s responsibility, with the trainee’s role being to learn and correct mistakes. Such shared expectations create consensus, resulting in mutual commitment to teaching and learning. Conversely, cultures that value doctor autonomy and independence may be less conducive to observation and constructive feedback between colleagues [Citation28,Citation29].

Mediated tool for teaching and learning in the workplace

The majority of trainees and supervisors perceived the mini-CEX as meaningful and helpful in avoiding oversights. Teaching and learning in clinical settings are often unstructured because of clinical demands [Citation30]. Any tool that clarifies the process for both learners and teachers is, therefore, essential for learning [Citation31]. The mini-CEX may be particularly helpful for clinical supervisors because the majority are not professional teachers. Its utility in recognising strengths and underperformance across a number of domains suggests it has value in identifying teaching opportunities across a broad spectrum [Citation32]. The structured approach also helps with assessment, aiding supervisors in recalling their observations and translating their impressions into ratings [Citation33].

The mini-CEX promotes learning through hands-on experience with real-world tasks [Citation25,Citation34]. This is likely to be the greatest advantage perceived by Chinese trainees, as hands-on opportunity is often limited by the strict power distance in social norms as well as the low-trust relationship between Chinese patients and junior doctors [Citation35].

Making sense of assessment through interaction

This study confirmed previous findings that observing performance allows supervisors to identify learning needs [Citation36] and generate feedback that promotes learning [Citation37]. Supervisors were identified as active information processors and goal directors [Citation33,Citation38], sharing their expectations with trainees following observation [Citation39,Citation40].

Trust is pivotal when analysing performance. Observation offers trainees feedback on the emotional climate of the doctor-patient encounter [Citation39,Citation41]. Trainees also know that supervisors will help if needed. For supervisors, direct observation allows them to ensure patient safety while gaining insight into the trainee’s progress [Citation39].

Both trainees and supervisors valued the element of feedback, which is consistent with evidence that feedback is a powerful tool for learning [Citation42,Citation43]. ‘Task-oriented’ feedback dominated, consistent with previous studies showing that for trainees, content-specific feedback is a ‘shared conversation’, essential to making sense of learning goals and aiding understanding [Citation44].

Much of the published literature criticises a ‘polite feedback culture’ for withholding honest and constructive feedback in order to ‘save face’ [Citation45]. However, this study indicated that users valued feedback focused on identifying weaknesses to improve performance. This could reflect cultural norms regarding roles in learning, in which teachers are leaders, guides and masters [Citation27]. This finding also resonates with recent research in East Asian countries that supports the educational value of the mini-CEX [Citation15,Citation16]. Traditional teacher-centred Chinese and medical cultures highlight a power differential [Citation15,Citation27,Citation29,Citation46]. Within this context, a supervisor’s expectations and goals represent part of the curriculum [Citation41] and influence the trainee’s motivation for learning. Consequently, feedback serves as a mediating tool that conveys expectations of acceptable performance [Citation47].

Study participants noted the subjectivity of mini-CEX scoring, which can compromise its utility [Citation17,Citation21]. This highlights a tension between criteria that focus on authenticity and trustworthiness and psychometric models that strive to objectify performance and yield reproducible, generalisable judgements [Citation2,Citation10]. Users appeared to operate from differing assumptions about how ‘standardised’ assessments should be [Citation48]. While many trainees felt confident about the mini-CEX rating standard, fewer supervisors felt the same. This divergence may reflect the implicit perception that ratings are summative [Citation2]. This poses a dilemma for supervisors, who also felt a need to encourage trainees at their formative career stage with lenient scoring. It is possible that supervisors held beliefs that were more consistent with traditional principles of scientific measurement [Citation48]. Historically, medicine and science have reflected a positivist paradigm, in which scientific measurement aims to find objective truth [Citation49]. Consequently, it may be unsurprising that supervisors were concerned about the ‘robustness’ of using narrative text for the final judgement [Citation2]. However, the majority of trainees reported that they were not as concerned about the score because the feedback was more direct and useful. Previous studies have not adequately discussed this, possibly because participants believe that judgement in performance assessments such as the mini-CEX is made ‘in situ’ and can therefore only be understood in context [Citation37]. Trainees may see variation in ratings as a source of information about their performance, interpreted as a reflection of supervisors’ experiences [Citation50]. Thus, trustworthiness and authenticity become central to constructing the meaning and criteria of assessment [Citation2,Citation51]. These are the two most important attributes in Chinese cultural norms for teaching and learning [Citation46], which may mediate users’ concerns about ‘objective ratings’.

Challenges

The mini-CEX is predicated on the notion that assessment drives learning [Citation8,Citation52]. However, the current study indicated that participants, particularly supervisors, did not fully understand the formative purpose of the mini-CEX, a finding acknowledged in numerous studies [Citation8,Citation12,Citation21]. Kogan et al. [Citation53] noted that stakeholders’ beliefs about an assessment shape the way they use it. The current study suggests that participants see the mini-CEX as serving mixed purposes. As the majority of mini-CEX evaluations occurred towards the end of a rotation, both groups were aware that the assessment outcome could affect the trainee’s progression. Despite this concern, both parties used the mini-CEX as a diagnostic tool and a means of identifying underperformance.

The current study identified that time constraints could undermine the perceived educational benefits of the mini-CEX. Trainees may challenge the value of a mini-CEX conducted retrospectively or without direct observation, a repeated finding in global studies of mini-CEX implementation [Citation54].

Implications and future research

The present study indicated that Chinese participants held positive views about the educational value of the mini-CEX, but there were unresolved concerns about its purpose (formative/summative) and the subjectivity of scoring. Further research could examine how users navigate the potentially competing aims of this process. Future studies may consider other healthcare settings in China [Citation14,Citation55] to examine generalisability of these findings in this new cultural context.

Conclusion

This study echoed previous research suggesting that a formative purpose, direct observation and real-time feedback are crucial components for ensuring effective utilisation of the mini-CEX. Overall, the mini-CEX was welcomed and accepted by Chinese users, although they did not completely understand its formative purpose. The educational value of the mini-CEX appears to be strongly associated with the prevalent cultural norm of learner–teacher relationships based on trust. This study showed that direct observation creates a sense of safety for learning; moreover, observation is bidirectional, as it allows trainees to observe supervisors demonstrating corrections.

Some culturally specific feedback characteristics were revealed, such as a preference for highlighting weaknesses and demonstrating corrections.

Adequate training for all participants and sufficient time allocation for the mini-CEX were highlighted as implementation needs.

Practice points

  • The utility of the mini-CEX depends on its acceptability among users and a shared understanding of its purpose.

  • Overall, the mini-CEX is welcomed and generally accepted in China, with trainees somewhat more positive than supervisors.

  • A trustworthy learning-teaching relationship enhanced the educational value of the mini-CEX.

  • Faculty training and adequate time allocation can help realise the full educational potential of the mini-CEX.

Acknowledgments

The authors would like to thank the Affiliated Cancer Hospital and Institute of Guangzhou Medical University and all participants for supporting this project.

Disclosure statement

No potential conflict of interest was reported by the authors.

References

  • National Council of China. Statement on the reform and development of medical education. Beijing; 2017. Available from: http://www.gov.cn/zhengce/content/2017-07/11/content_5209661.htm
  • Burch VC. The changing landscape of workplace-based assessment. JATT. 2019;20(S2):37–8.
  • Norcini JJ, Blank LL, Arnold GK, et al. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med. 1995;123(10):795–799.
  • Norcini JJ, Blank LL, Duffy FD, et al. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138(6):476–481.
  • Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29(9–10):855–871. doi:10.1080/01421590701775453.
  • General Medical Council. Workplace based assessment: a guide for implementation. London; 2010. Available from: www.gmc-uk.org/Workplace_Based_Assessment.pdf_31300577.pdf
  • Mortaz HS, Jalili M, Masoomi R, et al. The utility of mini-clinical evaluation exercise in undergraduate and postgraduate medical education: A BEME review: BEME Guide No. 59. Med Teach. 2020;42(2):125–142.
  • Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE guide no. 31. Med Teach. 2007;29(9–10):855–871.
  • Lörwald AC, Lahner F-M, Nouns ZM, et al. Factors influencing the educational impact of Mini-CEX and DOPS: a qualitative synthesis. Med Teach. 2018;40(4):414–420.
  • de Jonge LP, Timmerman AA, Govaerts MJ, et al. Stakeholder perspectives on workplace-based performance assessment: towards a better understanding of assessor behaviour. Adv Health Sci Educ. 2017;22(5):1213–1243.
  • Cohen L, Manion L, Morrison K. Research methods in education. London: Routledge; 2007.
  • Massie J, Ali JM. Workplace-based assessment: a review of user perceptions and strategies to address the identified shortcomings. Adv Health Sci Educ. 2016;21(2):455–473.
  • Gipps C. Beyond testing (classic edition): towards a theory of educational assessment. London: Routledge; 2011.
  • Lewis CC, Proctor E, Brownson RC. Measurement issues in dissemination and implementation research. In: Dissemination and implementation research in health: translating science to practice. USA: OUP; 2017. p. 261–280. doi:10.1093/oso/9780190683214.003.0014.
  • Suhoyo Y, Van Hell EA, Kerdijk W, et al. Influence of feedback characteristics on perceived learning value of feedback in clerkships: does culture matter? BMC Med Educ. 2017;17(1):69.
  • Sudarso S, Rahayu GR, Suhoyo Y. How does feedback in mini-CEX affect students’ learning response? Int J Med Educ. 2016;7:407.
  • Yanting SL, Sinnathamby A, Wang D, et al. Conceptualizing workplace based assessment in Singapore: undergraduate Mini-Clinical Evaluation Exercise experiences of students and teachers. Tzu Chi Med J. 2016;28(3):113–120.
  • Jiang C, Huang L, Zhu Y. Using Mini-CEX in the assessment of the training for rehabilitation residents. Chin High Health Educ. 2016;1:4–6.
  • Jie G, Shen Y, Jiang Y, et al. Application of mini-CEX in community rotation of general practice residency training. Chin J Gen Pract. 2015;14(11):849–853.
  • Bindal T, Wall D, Goodyear HM. Trainee doctors’ views on workplace-based assessments: are they just a tick box exercise? Med Teach. 2011;33(11):919–927.
  • Weston PS, Smith CA. The use of mini-CEX in UK foundation training six years following its introduction: lessons still to be learned and the benefit of formal teaching regarding its utility. Med Teach. 2014;36(2):155–163.
  • Howitt D, Cramer D. Introduction to research methods in psychology. 2nd ed. Upper Saddle River: Pearson/Prentice Hall; 2007.
  • Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.
  • Weller J, Jolly B, Misur M, et al. Mini-clinical evaluation exercise in anaesthesia training. Br J Anaesth. 2009;102(5):633–641.
  • Billett S. Workplace participatory practices: conceptualising workplaces as learning environments. J Workplace Learn. 2004;16(6):312–324.
  • Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ. 2010;341:c5064.
  • Chan ZC, Tong CW, Henderson S. Power dynamics in the student-teacher relationship in clinical settings. Nurse Educ Today. 2017;49:174–179.
  • Watling C. The uneasy alliance of assessment and feedback. Perspect Med Educ. 2016;5(5):262–264.
  • Watling C, Driessen E, van der Vleuten CP, et al. Beyond individualism: professional culture and its influence on feedback. Med Educ. 2013;47(6):585–594.
  • Eraut M. Informal learning in the workplace. Stud Contin Educ. 2004;26(2):247–273.
  • Vygotsky LS. Mind in society: the development of higher psychological processes. Cambridge: Harvard University Press; 1978.
  • Fokkema JP, Teunissen PW, Westerman M, et al. Exploration of perceived effects of innovations in postgraduate medical education. Med Educ. 2013;47(3):271–281.
  • Govaerts MJ, van der Vleuten CP, Schuwirth LW, et al. Broadening perspectives on clinical performance assessment: rethinking the nature of in-training assessment. Adv Health Sci Educ. 2007;12(2):239–260.
  • Wenger E. Communities of practice: learning, meaning, and identity. Cambridge: Cambridge University Press; 1999.
  • The Lancet. Violence against doctors: why China? Why now? What next? Lancet. 2014;383(9922):1013.
  • Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9):S63–67.
  • Govaerts M, van der Vleuten CP. Validity in work‐based assessment: expanding our horizons. Med Educ. 2013;47(12):1164–1174.
  • Sadler DR. Formative assessment: revisiting the territory. Assess Educ. 1998;5(1):77–84.
  • Rietmeijer CB, Huisman D, Blankenstein AH, et al. Patterns of direct observation and their impact during residency: general practice supervisors’ views. Med Educ. 2018;52(9):981–991.
  • Sheehan D, Wilkinson TJ, Billett S. Interns’ participation and learning in clinical environments in a New Zealand hospital. Acad Med. 2005;80(3):302–308.
  • Scaife J. Supervision in the mental health professions: a practitioner’s guide. New York: Routledge; 2003.
  • Carless D, Salter D, Yang M, et al. Developing sustainable feedback practices. Stud High Educ. 2011;36(4):395–407.
  • Hattie J, Timperley H. The power of feedback. Rev Educ Res. 2007;77(1):81–112.
  • Sargeant JM, Mann KV, van der Vleuten CP, et al. Reflection: a link between receiving and using assessment feedback. Adv Health Sci Educ. 2009;14:399–410.
  • Ramani S, Post SE, Könings K, et al. “It’s just not the culture”: a qualitative study exploring residents’ perceptions of the impact of institutional culture on feedback. Teach Learn Med. 2017;29(2):153–161.
  • Lam TP, Wan XH, Ip MS. Current perspectives on medical education in China. Med Educ. 2006;40(10):940–949.
  • Paul A, Gilbert K, Remedios L. Socio-cultural considerations in feedback. 2013. Available from: http://arrow.monash.edu.au/vital/access/manager/Repository/monas
  • Shepard LA. The role of assessment in a learning culture. Educ Res. 2000;29(7):4–14.
  • Mann KV. Theoretical perspectives in medical education: past experience and future possibilities. Med Educ. 2011;45(1):60–68.
  • Schuwirth LW, van der Vleuten CP. A plea for new psychometric models in educational assessment. Med Educ. 2006;40(4):296–300.
  • Guba EG, Lincoln YS. Fourth generation evaluation. Thousand Oaks: Sage Publications, Inc; 1989.
  • Van der Vleuten CP. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1(1):41–67.
  • Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009;302(12):1316–1326.
  • Wilkinson JR, Crossley JG, Wragg A, et al. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ. 2008;42(4):364–373. doi:10.1111/j.1365-2923.2008.03010.x.
  • Century J, Cassata A. Implementation research: finding common ground on what, how, why, where, and who. Rev Res Educ. 2016;40(1):169–215.