Abstract
The ratings that judges or examiners use to determine pass marks and students' performance on OSCEs serve a number of essential functions in medical education assessment, and their validity is a pivotal issue. However, certain types of error commonly occur in ratings and require special effort to minimise. Rater tendencies (e.g. generosity error, severity error, central tendency error or halo error) may introduce performance-irrelevant variance. Prior literature identifies fundamental problems in measuring student performance that stem from judges' or examiners' errors, and indicates that controlling such errors supports a robust, credible pass mark and hence accurate student marks. Therefore, for a standard-setter who identifies the pass mark and an examiner who rates student performance in OSCEs, appropriate, user-friendly feedback on their standard-setting and ratings is essential for reducing bias. Such feedback offers useful avenues for understanding why performance ratings may be irregular and how the quality of ratings can be improved. This AMEE Guide discusses various methods of providing feedback to support examiners' understanding of student performance and the standard-setting process, with the aim of making inferences from assessments fair, valid and reliable.
Disclosure statement
The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.
Additional information
Notes on contributors
Mohsen Tavakol
Mohsen Tavakol, MSc, PhD, MClinEd, is an associate professor in psychometrics. His main interests are medical education assessment, assessment feedback, psychometric analysis (classical test theory, generalisability theory, item response theory models), robust statistical methods, multivariate statistics, structural equation modelling, meta-analysis, quantitative and qualitative research methods, and communication skills.
Brigitte E. Scammell
Brigitte E. Scammell, DM, MClinEd, is Dean and Head of School of Medicine and a professor of Orthopaedic Sciences. Her main interest in medical education is in admissions, equity and student experience.
Angela P. Wetzel
Angela P. Wetzel, PhD, is the director of assessment and a faculty member in the Foundations of Education department at the Virginia Commonwealth University School of Education. Her broad research interests include instrument development, assessment and program evaluation. She provides leadership for the development, implementation and analysis of student assessments, completer outcomes, and program and unit evaluations to guide data-driven decision making and to support program improvement in the School of Education and for partner programs in the College of Humanities and Sciences and the School of the Arts.