Educational Case Reports

A Simple and Sustainable Exercise to Enhance Student Self-Reflection on Error-Making, Focus Support, and Guide Curricular Design

Pages 65-72 | Received 15 Mar 2021, Accepted 11 Jan 2022, Published online: 23 Feb 2022
 

Abstract

Problem: Self-reflection is a critical component of professional development and clinical practice, but medical students’ ability to self-reflect is typically limited. While inadequate self-reflection impacts future clinical decision-making, it may also adversely impact current learning through an inability to identify learning-behavior deficits. This may be exacerbated by the common use of multiple-choice questions (MCQs), where incorrect responses provide less insight than other measures for students, faculty, or academic support. To address this, an Error Reflection Method (ERM) was developed to help students focus on ‘why’ they got an MCQ wrong rather than ‘what’ they got wrong, thereby promoting self-reflection and a learning focus on assessment. Understanding students’ learning-behavior deficits could also enrich engagement with academic support services and guide curricular design. Intervention: The ERM is a list of 10 common types of exam errors, classified as either ‘test-taking’ (unwitting) errors or ‘learning-behavior’ errors that reflect learning deficits. The ERM is simple, transferable, and sustainable, allowing longitudinal and regular monitoring of individual and collective error-making to focus support and guide curricular development. Context: Undergraduate medical students at the Virginia Tech Carilion School of Medicine, USA, used the ERM in formative assessment review sessions in the pre-clinical years to select the error type that best described the cause of each incorrect response. Impact: Initial findings suggest the ERM is robust and associated with improved student performance and curricular development. Analysis of 3,775 student-identified errors showed that the error types in the ERM described 96% of the errors students made. Learning-behavior errors were more common (76%), but surprisingly, 19% were test-taking errors, allowing academic support to focus on test-taking skills in a population previously thought of as consummate test-takers.
The most common error type reported was ‘the content looked familiar but I couldn’t answer the question’ (32%), which we suggest is consistent with shallow learning. This finding has helped steer recent curricular development toward active and applied learning techniques. Lessons Learned: By formally and regularly identifying learning deficits, students may be better able to address them and improve summative exam performance. In addition to focusing academic support, understanding common student errors has been useful in guiding curricular design and content delivery. Further potential of the ERM may be realized in faculty development and in directing assessment culture toward a learning focus.

Acknowledgements

We would like to acknowledge and thank Dani Backus for keeping the project on track, and numerous faculty for helpful initial discussions, particularly Dr. Kathy Dorey. This project was approved by the Virginia Tech Institutional Review Board, Protocol #17-723.

Disclosure statement

No potential conflict of interest was reported by the authors.

Funding

The author(s) reported there is no funding associated with the work featured in this article.

