Abstract
Context
A programmatic approach to assessment entails gathering and aggregating ‘rich information’ on candidates to inform progress decisions. However, there is little guidance on how such an approach might be implemented in practice.
Objective
We describe an approach to aggregating rich information across assessment formats to inform committee decision-making in a specialist medical college.
Methods
All examination items (n = 272) were blueprinted to 15 curriculum modules and 7 proficiencies. We developed a six-point holistic rating scale with detailed rubrics outlining the expected performance standard for every item. Examiners used this rating scale to judge performance on each item, generating rich performance data for each candidate.
Results
A colour-coded ‘mosaic’ of patterns of performance across modules and proficiencies was generated, along with frequency distributions of ratings. These data allowed examiners to visualise candidate performance easily and informed deliberations on borderline candidates. Committee decision-making was facilitated by maintaining the richness of assessment information throughout the process. Moreover, the data enabled detailed and useful feedback to candidates.
Conclusions
Our study demonstrates that incorporating aspects of programmatic thinking into high-stakes examinations by using a novel approach to aggregating information is a useful first step in reforming an assessment program.
Acknowledgements
We would like to thank Haldor Aamot and the Royal Australasian College of Dental Surgeons for supporting this project.
Author contributions
Authors JP, KR, NC and DH designed and implemented the work described. JP provided the initial conceptualization of this manuscript and wrote the first draft. Authors KR and NC made significant intellectual contributions and edits to subsequent drafts, with KR being chiefly responsible for the final editing of the manuscript. All authors approved the final version.
Disclosure statement
The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.
Glossary
Programmatic thinking: The incorporation of aspects of a programmatic approach into assessments and assessment processes, without requiring a full-scale implementation of the learner-based pedagogy and instructional design elements of the original programmatic model.
Additional information
Notes on contributors
Jacob Pearce
Jacob Pearce, PhD, is a Senior Research Fellow in Tertiary Education (Assessment) at the Australian Council for Educational Research, Australia.
Katharine Reid
Katharine Reid, PhD, is the Director of Evaluation and Quality in the Department of Medical Education at the University of Melbourne, and a Senior Research Fellow in the Educational Monitoring and Research Division at the Australian Council for Educational Research.
Neville Chiavaroli
Neville Chiavaroli, MEd MPhil, is a Principal Research Fellow in Assessment and Selection Research at the Australian Council for Educational Research, Australia.
Dylan Hyam
Dylan Hyam, BDS(Hons), MBBS(Hons), FRACDS(OMS), is the Director of the Oral and Maxillofacial Surgery Unit at the Canberra Hospital and Adjunct Associate Professor in the School of Dentistry at Charles Sturt University.