Abstract
This study explores the reliability, validity and acceptability of assessment criteria for reflective portfolios at Peninsula Medical School, UK. The construct validity of the criteria was established by exploring their relationship with two other assessment methods, and their reliability was determined using the generalizability (G) coefficient. Focus groups with assessors and students were convened to explore their views of the portfolios. Two portfolio analyses had a G coefficient of .42. Performance in the portfolio analyses was correlated with performance in personal and professional development judgements (r = −.512, p < .01) and scientific reports (r = .273, p = .002). Themes emerging from the focus groups included students' preference for the structured nature of the portfolios, although assessors felt that this structure reduced the uniqueness of the portfolios. Although students understood the importance of reflective practice, some disliked the process of reflection, particularly reflective writing. Educators should design assessment criteria that maximize reliability, validity and acceptability rather than focusing on reliability alone.
Acknowledgements
We would like to thank Dr Chris Ricketts, Lecturer in Clinical Education and Statistician at Peninsula Medical School, for his invaluable advice on using generalizability theory. We would also like to thank Kay Allen, Administrative Assistant at Peninsula Medical School, for entering the quantitative data in SPSS, and Serena Vellacott, Plymouth Locality Secretary at Peninsula Medical School, for transcribing the focus groups.
A Teaching Development Award from the University of Exeter, awarded to the first author, funded the qualitative study.