
Candidates undertaking (invigilated) assessment online show no differences in performance compared to those undertaking assessment offline

Pages 646-650 | Published online: 18 Feb 2021

Abstract

Background

Medical education has historically relied on high-stakes knowledge tests sat in examination centres, with invigilators monitoring for academic malpractice. The COVID-19 pandemic made such examination formats impossible, and medical educators have explored online assessment as a potential replacement. This shift has in turn raised fears that the change in format, or increased opportunity for academic malpractice, might lead to considerably higher attainment scores in online assessment with no underlying improvement in student competence.

Method

Here, we present an analysis of 8092 sittings of the Prescribing Safety Assessment (PSA), an assessment designed to test the prescribing skills of final year medical students in the UK. In-person assessments for the PSA were cancelled partway through the academic year 2020, with 6048 sittings delivered in an offline, traditionally invigilated format, and then 2044 sittings delivered in an online, webcam invigilated format.

Results

A comparison powered to detect very small effects showed no attainment gap between online (M = 0.762, SD = 0.34) and offline (M = 0.761, SD = 0.34) performance.
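The size of this gap can be expressed as a standardised effect. A minimal sketch, using only the summary statistics reported above (group sizes, means, and standard deviations from the abstract; the variable names are illustrative, and this is not the authors' actual analysis code):

```python
import math

# Summary statistics taken from the abstract
online = {"n": 2044, "mean": 0.762, "sd": 0.34}
offline = {"n": 6048, "mean": 0.761, "sd": 0.34}

def cohens_d(a, b):
    """Cohen's d for two independent groups, using a pooled standard deviation."""
    pooled_var = (
        (a["n"] - 1) * a["sd"] ** 2 + (b["n"] - 1) * b["sd"] ** 2
    ) / (a["n"] + b["n"] - 2)
    return (a["mean"] - b["mean"]) / math.sqrt(pooled_var)

d = cohens_d(online, offline)
print(f"Cohen's d = {d:.4f}")  # roughly 0.003, a negligible effect
```

With identical standard deviations and a mean difference of 0.001, the standardised difference is around 0.003 standard deviations, far below any conventional threshold for even a "small" effect.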

Conclusions

These findings suggest that the transition to online assessment does not affect student performance, and should increase confidence in the use of online testing for high-stakes assessment.

Glossary

Anchor Items: A set of identical items deployed across multiple assessments. The use of anchor items enables comparisons between cohorts and can be used to directly compare performance (or progress) even when a majority of the items on each assessment are not shared.

Pibal F, Cesnik HS. 2011. Evaluating the quantity-quality trade-off in the selection of anchor items: a vertical scaling approach. Pract Assess Res Eval. 16(1):6.

Disclosure statement

The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Additional information

Notes on contributors

David Hope

Dr. David Hope, PhD, is a senior lecturer in medical education at the University of Edinburgh medical school. His primary area of interest is assessment – especially ensuring the reliability and validity of high-stakes assessment and minimising differential attainment throughout medical education.

Veronica Davids

Veronica Davids, BSc, is the assistant director for the Medical Schools Council.

Lynne Bollington

Dr. Lynne Bollington, PhD, is a pharmacist and is the lead consultant for the Prescribing Safety Assessment. Her expertise focuses on creating and delivering national high-stakes assessments for doctors and pharmacists.

Simon Maxwell

Professor Simon Maxwell, PhD, has been Medical Director of the UK Prescribing Safety Assessment since 2010 and is Professor of Student Learning at the University of Edinburgh.

