References
- Anonymous. The spirit of apprenticeship in medicine. JAMA. 1924;83:1337–1338.
- Fromme HB, Karani R, Downing SM. Direct observation in medical education: review of the literature and evidence for validity. Mt Sinai J Med. 2009;76(4):365–371. doi:https://doi.org/10.1002/msj.20123.
- Association of American Medical Colleges (AAMC). Core entrustable professional activities for entering residency. https://www.mededportal.org/icollaborative/resource/887. Published 2013. Accessed February 3, 2020.
- Edgar L, Roberts S, Holmboe E. Milestones 2.0: a step forward. J Grad Med Educ. 2018. doi:https://doi.org/10.4300/JGME-D-18-00372.1.
- LaDonna K, Hatala R, Lingard L, et al. Staging a performance: learners’ perceptions about direct observation during residency. Med Educ. 2017;51(5):498–510. doi:https://doi.org/10.1111/medu.13232.
- Watling C, LaDonna K, Lingard L, et al. ‘Sometimes the work just needs to be done’: socio-cultural influences on direct observation in medical training. Med Educ. 2016;50(10):1054–1064. doi:https://doi.org/10.1111/medu.13062.
- Gauthier S, Melvin L, Mylopoulos M, Abdullah N. Resident and attending perceptions of direct observation in internal medicine: a qualitative study. Med Educ. 2018;52(12):1249–1258. doi:https://doi.org/10.1111/medu.13680.
- Rietmeijer CBT, Huisman D, Blankenstein AH, et al. Patterns of direct observation and their impact during residency: general practice supervisors’ views. Med Educ. 2018;52(9):981–991. doi:https://doi.org/10.1111/medu.13631.
- Young J, Sugarman R, Schwartz J, O’Sullivan P. Overcoming the challenges of direct observation and feedback programs: a qualitative exploration of resident and faculty experiences. Teach Learn Med. 2020;32(5):541–551. doi:https://doi.org/10.1080/10401334.2020.1767107.
- Scarff C, Bearman M, Chiavaroli N, Trumble S. Trainees’ perspective of assessment messages: a narrative systematic review. Med Educ. 2019;53(3):221–233. doi:https://doi.org/10.1111/medu.13775.
- Ahmed K, Miskovic D, Darzi A, et al. Observational tools for assessment of procedural skills: a systematic review. Am J Surg. 2011;202(4):469–480. doi:https://doi.org/10.1016/j.amjsurg.2010.10.020.
- Bjorklund KA, Sommer N, Neumeister M, et al. Establishing validity evidence for an operative performance rating system for plastic surgery residents. J Surg Educ. 2019;76(2):529–539. doi:https://doi.org/10.1016/j.jsurg.2018.08.016.
- Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: construct alignment improves the performance of workplace-based assessment scales. Med Educ. 2011;45(6):560–569. doi:https://doi.org/10.1111/j.1365-2923.2010.03913.x.
- Donato AA, Park YS, George DL, et al. Validity and feasibility of the minicard direct observation tool in 1 training program. J Grad Med Educ. doi:https://doi.org/10.4300/JGME-D-14-00532.1.
- Olupeliyawa AM, O’Sullivan AJ, Hughes C, Balasooriya CD. The teamwork mini-clinical evaluation exercise (T-MEX): a workplace-based assessment focusing on collaborative competencies in health care. Acad Med. 2014;89(2):359–365. doi:https://doi.org/10.1097/ACM.0000000000000115.
- Gomez-Garibello C, Young M. Emotions and assessment: considerations for rater-based judgements of entrustment. Med Educ. 2018;52(3):254–262. doi:https://doi.org/10.1111/medu.13476.
- Hodwitz K, Kuper A, Brydges R. Realizing one’s own subjectivity: assessors’ perceptions of the influence of training on their conduct of workplace-based assessments. Acad Med. 2019;94(12):1970–1979. doi:https://doi.org/10.1097/ACM.0000000000002943.
- Lee V, Brain K, Martin J. Factors influencing Mini-CEX rater judgements and their practical implications: a systematic literature review. Acad Med. 2017;92(6):880–887. doi:https://doi.org/10.1097/ACM.0000000000001537.
- Kogan JR, Conforti L, Bernabeo E, Iobst W, Holmboe E. Opening the black box of clinical skills assessment via observation: a conceptual model. Med Educ. 2011;45(10):1048–1060. doi:https://doi.org/10.1111/j.1365-2923.2011.04025.x.
- St-Onge C, Chamberland M, Lévesque A, Varpio L. Expectations, observations, and the cognitive processes that bind them: expert assessment of examinee performance. Adv Health Sci Educ. 2016;21(3):627–642. doi:https://doi.org/10.1007/s10459-015-9656-3.
- Gauthier G, St-Onge C, Tavares W. Rater cognition: review and integration of research findings. Med Educ. 2016;50(5):511–522. doi:https://doi.org/10.1111/medu.12973.
- Howley LD, Wilson WG. Direct observation of students during clerkship rotations: a multiyear descriptive study. Acad Med. 2004;79:276–280.
- Kuo AK, Irby DI, Loeser H. Does direct observation improve medical students’ clerkship experiences? Med Educ. 2005;39(5):518. doi:https://doi.org/10.1111/j.1365-2929.2005.02132.x.
- Hasnain M, Connell K, Downing S. Toward meaningful evaluation of clinical competence: the role of direct observation in clerkship ratings. Acad Med. 2004;79(S1):S21–S24.
- Holmboe E. Faculty and the observation of trainees’ clinical skills: problems and opportunities. Acad Med. 2004;79:16–22.
- Kogan JR, Hatala R, Hauer KE, Holmboe E. Guidelines: the do’s, don’ts, and don’t knows of direct observation of clinical skills in medical education. Perspect Med Educ. 2017;6(5):286–305. doi:https://doi.org/10.1007/s40037-017-0376-7.
- Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–645. doi:https://doi.org/10.3109/0142159X.2010.501190.
- Gingerich A, Regehr G, Eva K. Rater-based assessments as social judgments: rethinking the etiology of rater errors. Acad Med. 2011. doi:https://doi.org/10.1097/ACM.0b013e31822a6cf8.
- Tavares W, Eva K. Exploring the impact of mental workload on rater-based assessments. Adv Health Sci Educ. 2013;18(2):291–303. doi:https://doi.org/10.1007/s10459-012-9370-3.
- Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009;302(12):1316–1326. doi:https://doi.org/10.1001/jama.2009.1365.
- Gingerich A, Kogan J, Yeates P, et al. Seeing the ‘black box’ differently: assessor cognition from three research perspectives. Med Educ. 2014;48(11):1055–1068. doi:https://doi.org/10.1111/medu.12546.
- George S, Manos S, Wong K. Preparing for CBME: how often are faculty observing residents? Paediatr Child Health. 2021;26(2):88–92. doi:https://doi.org/10.1093/pch/pxz169.
- Engeström Y. Activity theory and individual and social transformation. In: Engeström Y, Miettinen R, Punamäki R, eds. Perspectives on Activity Theory. Cambridge, UK: Cambridge University Press; 1999:19–38.
- Hashim NH, Jones ML. Activity theory: a framework for qualitative analysis. http://ro.uow.edu.au/commpapers/408. Published 2007. Accessed July 15, 2020.
- Cook DA, West CP. Conducting systematic reviews in medical education: a stepwise approach. Med Educ. 2012;46(10):943–952. doi:https://doi.org/10.1111/j.1365-2923.2012.04328.x.
- Bilgic E, Watanabe Y, McKendy K, et al. Reliable assessment of operative performance. Am J Surg. 2016;211(2):426–430. doi:https://doi.org/10.1016/j.amjsurg.2015.10.008.
- Chang Y, Lee C, Chen C, et al. Exploring the influence of gender, seniority and specialty on paper and computer-based feedback provision during mini-CEX assessments in a busy emergency department. Adv Health Sci Educ. 2017;22(1):57. doi:https://doi.org/10.1007/s10459-016-9682-9.
- Cheung WJ, Dudek NL, Wood TJ, Frank JR. Supervisor-trainee continuity and the quality of work-based assessments. Med Educ. 2017;51(12):1260–1268. doi:https://doi.org/10.1111/medu.13415.
- Govaerts MJB, Schuwirth LWT, Van der Vleuten CPM, Muijtjens AMM. Workplace-based assessment: effects of rater expertise. Adv Health Sci Educ. 2011;16(2):151–165. doi:https://doi.org/10.1007/s10459-010-9250-7.
- Govaerts MJB, Van de Wiel MWJ, Schuwirth LWT, Van der Vleuten CPM, Muijtjens AMM. Workplace-based assessment: raters’ performance theories and constructs. Adv Health Sci Educ. 2013;18(3):375–396. doi:https://doi.org/10.1007/s10459-012-9376-x.
- Kelleher M, Kinnear B, Sall D. A reliability analysis of entrustment-derived workplace-based assessments. Acad Med. 2020;95:616–622.
- Kreiter CD, Wilson AB, Humbert AJ, Wade PA. Examining rater and occasion influences in observational assessments obtained from within the clinical environment. Med Educ Online. 2016;21(1):29279. doi:https://doi.org/10.3402/meo.v21.29279.
- Lane SM, Young KA, Hayek SA, et al. Meaningful autonomy in general surgery training: exploring for gender bias. Am J Surg. 2020;219(2):240–244. doi:https://doi.org/10.1016/j.amjsurg.2019.11.035.
- Lee V, Brain K, Martin J. From opening the ‘black box’ to looking behind the curtain: cognition and context in assessor-based judgements. Adv Health Sci Educ. 2019;24(1):85–102. doi:https://doi.org/10.1007/s10459-018-9851-0.
- London Z, Schuh L, Gelb D, Schultz L. Education research: unsatisfactory NEX rating correlations, searching for the reasons. Neurology. 2013;80(13):e142–e144. doi:https://doi.org/10.1212/WNL.0b013e318289702a.
- Meresh E, Daniels D, Sharma A, et al. Review of mini-clinical evaluation exercise (mini-CEX) in a psychiatry clerkship. Adv Med Educ Pract. 2018;9:279–283. doi:https://doi.org/10.2147/AMEP.S160997.
- Reinders ME, Blankenstein AH, van Marwijk HW, et al. Reliability of consultation skills assessments using standardized versus real patients. Med Educ. 2011;45(6):578–584. doi:https://doi.org/10.1111/j.1365-2923.2010.03917.x.
- Smith N, Harnett J, Furey A. Evaluating the reliability of surgical assessment methods in an orthopedic residency program. Can J Surg. 2015;58(5):299–304. doi:https://doi.org/10.1503/cjs.010614.
- Weller JM, Misur M, Nicolson S, et al. Can I leave the theatre? A key to more reliable workplace-based assessment. Br J Anaesth. 2014;112(6):1083–1091. doi:https://doi.org/10.1093/bja/aeu052.
- Williams RG, Swanson DB, Fryer JP, et al. How many observations are needed to assess a surgical trainee’s state of operative competency? Ann Surg. 2019;269(2):377–382. doi:https://doi.org/10.1097/SLA.0000000000002554.
- Aubin AS, St-Onge C, Renaud JS. Detecting rater bias using a person-fit statistic: a Monte Carlo simulation study. Perspect Med Educ. 2018;7:83–92.
- Yeates P, O’Neill P, Mann K, Eva K. Seeing the same thing differently. Mechanisms that contribute to assessor differences in directly-observed performance assessments. Adv Health Sci Educ. 2013;18(3):325–341. doi:https://doi.org/10.1007/s10459-012-9372-1.
- Wood TJ. Exploring the role of first impressions in rater-based assessments. Adv Health Sci Educ. 2014;19(3):409–427. doi:https://doi.org/10.1007/s10459-013-9453-9.
- Choo KJ, Arora VM, Barach P, et al. How do supervising physicians decide to entrust residents with unsupervised tasks? A qualitative analysis. J Hosp Med. 2014;9(3):169–175. doi:https://doi.org/10.1002/jhm.2150.
- Gingerich A, Schokking E, Yeates P. Comparatively salient: examining the influence of preceding performances on assessors’ focus and interpretations in written assessment comments. Adv Health Sci Educ. 2018;23(5):937–959. doi:https://doi.org/10.1007/s10459-018-9841-2.
- Grant C, Osanloo A. Understanding, selecting and integrating a theoretical framework in dissertation research: creating the “blueprint” for your house. Adminis Issues J. 2016. doi:https://doi.org/10.5929/2014.4.2.9.