Abstract
Context: Medical education and training internationally face increasing pressure for external accountability and cost efficiency. We present an illustrative analysis of the value-added of postgraduate medical education.
Method: We analysed historical selection (entry) and licensure (exit) examination results for trainees sitting the UK Membership of the Royal College of General Practitioners (MRCGP) licensing examination (N = 2291). Selection data comprised a clinical problem solving test (CPST), a situational judgement test (SJT), and a selection centre (SC). Exit data comprised the applied knowledge test (AKT) from the MRCGP. Ordinary least squares (OLS) regression analyses were used to model differences in AKT attainment relative to performance at selection (the value-added score). Results were aggregated to the regional level for comparison.
Results: We found significant differences in value-added scores between regional training providers. Three training providers conferred significant value-added, whereas one provider's results were significantly lower than would be predicted from trainees' attainment at selection.
Conclusions: Value-added analysis in postgraduate medical education potentially offers useful information, although the methodology is complex, controversial, and has significant limitations. Developing models further could offer important insights to support continuous improvement in medical education in future.
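The value-added approach described in the Methods (regress exit attainment on selection scores, then aggregate residuals by training provider) can be sketched as follows. This is a minimal illustrative example using synthetic data: the sample size, score scales, number of regions, and effect sizes below are all hypothetical and do not reflect the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, standardised selection scores: CPST, SJT, SC (hypothetical data)
n = 500
X = rng.normal(size=(n, 3))
region = rng.integers(0, 4, size=n)  # four hypothetical regional providers

# Synthetic exit (AKT) scores, loosely related to selection performance
akt = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.8, size=n)

# OLS: predict exit attainment from performance at selection
X_design = np.column_stack([np.ones(n), X])  # add intercept
beta, *_ = np.linalg.lstsq(X_design, akt, rcond=None)

# Residual = observed minus predicted AKT: the trainee-level value-added score
residuals = akt - X_design @ beta

# Aggregate value-added scores to the regional level for comparison
regional_value_added = {r: residuals[region == r].mean() for r in range(4)}
```

In practice, judging whether a provider's mean residual differs significantly from zero would also require standard errors (and, as the Conclusions note, the methodology carries substantial caveats); this sketch shows only the core regress-then-aggregate step.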
Glossary Term: Value-Added
The concept of “value-added” in education relates to the growth in knowledge, skills, abilities, and other attributes that students gain as a result of their experiences in an education system over time (Leckie & Goldstein 2011). Value-added measures are used to estimate or quantify how much of a positive (or negative) effect an individual institution has on student learning during a given period of training (Abbot 2014), accounting for a range of intake differences.
Acknowledgements
We gratefully acknowledge staff at the GP National Recruitment Office and the Royal College of General Practitioners (RCGP) for assisting in data access, particularly Kamilla Hawthorne, chair of the assessment committee at the RCGP.
Disclosure statement
The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.
Notes on contributors
Simon Gregory, MMedEd, FRCGP, FAcadMEd, FRCPE, FHEA, is a Director and Dean of Education and Quality, Health Education England (Midlands and East), a Fellow of Homerton College, Cambridge, and a Visiting Professor at the University of East Anglia and Anglia Ruskin University.
Fiona Patterson, BSc, MSc, PhD, CPsychol, AcSS, FRSA, FCMI, FRCGP (Hon), is a Professor of Organisational Psychology with expertise in selection and assessment. She is a Principal Researcher at the University of Cambridge and founding Director of Work Psychology Group, an international research-led organizational psychology consulting practice.
Helen Baron, CPsychol, CSci, AFBPsS, is a chartered psychologist and associate of Work Psychology Group. As well as developing instruments, she has been involved in the validation and review of both her own and others’ instruments. She has worked in both the public and private sectors, helping organizations to evaluate and refine their selection procedures.
Alec Knight is an Associate of Work Psychology Group and Postdoctoral Fellow in Improvement Science at the Institute of Psychiatry, Psychology & Neuroscience, King’s College London. His main research interest is occupational psychology applied to the healthcare context.
Kieran Walsh is Clinical Director of BMJ Learning and Quality at BMJ. He is responsible for the editorial strategy of medical education and quality improvement at BMJ.
Bill Irish, BSc, MB, MMEd, FRCGP, is a Postgraduate Medical Dean at Health Education East of England, based in Cambridge. He is a visiting professor of medical education at the Universities of Bristol and East Anglia. His interests include high-stakes assessment of doctors and medical recruitment.
Sally M. Thomas, BSc, PhD(CNAA), is Professor of Education at the Graduate School of Education, University of Bristol. She conducts research on “value-added” measures of school and educational effectiveness, and on the application of these measures for purposes including institutional improvement and self-evaluation, international quality indicators, and academic knowledge-based research.