Abstract
Assessment of clinical skills is a critical element of undergraduate medical education. We compare a traditional approach to procedural skills assessment – the Objective Structured Clinical Examination (OSCE) – with the Integrated Performance Procedural Instrument (IPPI). In both approaches, students work through ‘stations’ or ‘scenarios’, undertaking defined tasks. In the IPPI, all tasks are contextualised, requiring students to integrate technical, communication and other professional skills. The aim of this study was to explore students’ responses to these two assessments. Third‐year medical students participated in formative OSCE and IPPI sessions on consecutive days. Although performance data were collected in both assessments, quantitative data are not presented here. Group interviews with students were conducted by independent researchers. Data were analysed thematically. The OSCE and the IPPI were both valued, but for different reasons. Preference for the OSCE reflected the format of the summative assessment. The IPPI was valued for the opportunity to practise patient‐centred care in a simulated setting which integrated technical, communication and other professional skills. We posit that scenario‐based assessments such as the IPPI reflect real‐world issues of patient‐centred care. Although the limitations of this study prevent wide extrapolation, we encourage curriculum developers to consider the influence of assessments on what and how their students learn.
Acknowledgements
We thank Nedra DuBroff who conducted the interviews.
Notes
1. At the time of this project all authors worked in the Department of Biosurgery and Surgical Technology, Imperial College, except for Kash Akhtar who was undertaking postgraduate studies in surgical education in the department.