Letter

Evaluation drives curriculum

Page 425 | Published online: 28 Mar 2012

Dear Sir

It is almost a mantra in education that evaluation drives curriculum. Students focus not on topics that they are told are important but on those that are formally evaluated. A recent change in the evaluation methods in our internal medicine clerkship demonstrated the truth of this relationship.

In our medicine clerkship, students’ final evaluations were made up of the weighted average of clinical assessments, small group preceptors’ evaluations, and the score on the USMLE subject exam. Although our grading system was fair and reliably differentiated the strongest from the weakest students, it did not adequately assess the primary objective of our clerkship, the acquisition of clinical reasoning skills.

Not surprisingly, because clinical reasoning skills were underrepresented in our assessment, students did not focus on acquiring them (Newble & Jaeger 1983). In an effort to turn our students’ attention to clinical reasoning, we designed an oral examination to evaluate these skills. On the first day of the rotation, students were told the 20 topics on which they could potentially be tested, and we recommended problem-based resources that would be effective preparation.

To quantify the effect of the intervention, we surveyed the students, asking, “What were the three most valuable resources you used during the clerkship?” We divided the answers into three categories: board reviews (such as MKSAP for Medical Students), disease-oriented texts (such as Harrison's Principles of Internal Medicine and UpToDate), and problem-based books, those most focused on teaching clinical reasoning (such as Symptom to Diagnosis: An Evidence-Based Guide and Case Files: Internal Medicine).

We compared survey data from year one, before instituting the oral examination, with data from year two, after instituting it. Board review books were listed 112 times in year one, falling to 58 times in year two. Problem-based books were listed 48 times in year one compared with 80 times in year two. Use of disease-oriented resources was essentially unchanged, listed 83 times in year one and 78 times in year two.

While it is difficult to demonstrate that our oral examination actually tests students’ clinical reasoning, it is clear that its addition altered students’ study habits. A well-designed assessment tool can be used to influence students’ studying in the direction desired by course directors.

Reference

Newble DI, Jaeger K. 1983. The effect of assessments and examinations on the learning of medical students. Med Educ 17:165–171.
