Web Paper

Effects of a supplementary final year curriculum on students’ clinical reasoning skills as assessed by key-feature examination

Pages e438-e442 | Received 09 Aug 2008, Accepted 24 Feb 2009, Published online: 09 Sep 2009

Abstract

Background: The final year of medical education is considered crucial in making students ‘fit for purpose’. Studies have shown that many students leave medical school without having experienced sufficient preparation for their upcoming professional life.

Aim: The aim of this study was to examine the effectiveness of a supplementary internal medicine final year curriculum on clinical reasoning skills.

Method: Final year internal medicine students from two universities participated in the study which was based on a static-group design. The experimental group (n = 49) took part in a final year student curriculum with interactive case-based seminars and skills training sessions. The comparison group (n = 25) did not receive any additional training beyond working on the ward. Clinical reasoning skills were assessed using a key-feature pre-post test.

Results: Prior to their clinical rotation, the two groups did not differ in the key-feature examination (p = 0.924). The experimental group performed significantly better than the comparison group (p = 0.028) in the post-intervention key-feature examination.

Conclusions: Supplementary interactive case-based seminars and skills training sessions are effective and significantly improve the clinical reasoning skills of final year students in internal medicine. Further studies should examine the effectiveness of a final year student curriculum using other performance measures.

Introduction

The final year of medical education is considered to be especially important in making students ‘fit for purpose’ (Wass 2005). It aims to aid the transition from students’ medical college years to their future professional career and to produce competent and reflective physicians by ensuring appropriate learning outcomes. Competencies such as clinical skills, practical clinical procedures, patient investigation, communication, information management, patient management and clinical reasoning form the core of final year medical education (Shumway & Harden 2003). Nevertheless, studies have shown that many students leave medical school without having experienced sufficient preparation for their upcoming professional life (Clack 1994; Fox et al. 2000; Goldacre et al. 2003; Daelmans et al. 2004).

The acquisition of clinical reasoning skills primarily results from dealing with multiple patient examples over a period of time. This facilitates the availability of concepts and conceptual knowledge (Norman 2005). Previous studies exploring clinical reasoning skills have predominantly focused on the problem-based learning approach with its constructive, self-directed, collaborative and contextual learning environment (Dolmans et al. 2005; Dolmans & Schmidt 2006). Compared to traditional, lecture-based sessions and curricula, problem-based learning has been shown to positively affect diagnostic competencies (Schmidt et al. 1996) and clerkship performance ratings in undergraduate education (Richards et al. 1996; Whitfield et al. 2002) as well as clinical reasoning skills in postgraduate education (Doucet et al. 1998). For the acquisition of procedural skills, contextual learning is extremely important (Kneebone et al. 2002; Nikendei et al. 2005; Nikendei et al. 2007b). The effectiveness of skills curricula has been demonstrated using self-reports of skills performed on the ward (Remmen et al. 1999), written skills tests (Remmen et al. 2001) and objective structured clinical examinations in pre-post (Bradley & Bligh 1999) and comparison designs (Jünger et al. 2005).

To our knowledge, no investigation of the effects of a supplementary final year curriculum on clinical reasoning skills has been published. The aim of this study was therefore to examine the effectiveness of such a curriculum within the field of internal medicine on students’ performance in a key-feature examination, based on a pre-post group comparison design. We hypothesised that (1) key-feature examination results at the beginning of the study would be comparable for the experimental and comparison groups, (2) both groups would show significant improvement in key-feature examination results during the final year, and (3) the experimental group would show significantly greater improvement in the post-intervention examination.

Methods

Sample

Two groups of final year medical students belonging to natural cohorts voluntarily participated in the study. The experimental group consisted of 49 final year students from the University of Heidelberg (16 male, 33 female; age: 26.1 ± 1.77 years) and the comparison group consisted of 25 final year students from the University of Tübingen (9 male, 17 female; age: 26.4 ± 1.96 years). Differences in sex and age were not statistically significant (sex: p = 0.445; age: p = 0.510). Both universities demonstrate very good and comparable results in the second national medical licensing examination following the final year of medical education (German NBE part II; IMPP 2007) and are almost identical in size, department structure, and number and type of patients treated. For both universities, the structure of final year medical education is state-defined. All final year students are required to complete a three-month clerkship in internal medicine. Final year students who commenced their internal medicine final year term between March 2006 and August 2006 were included in the study, with a participation rate of 100% at both faculties.

Study design and final year curriculum

The study was performed using a static-group design. Focus groups were conducted with final year students from both universities (n = 65; Schrauth et al. 2009) and medical representatives (n = 7) of the different internal medicine departments of the University of Heidelberg in order to explore the perceived needs of final year students. Learning objectives and educational strategies were defined by the medical representatives of the internal medicine departments, as proposed by Kern et al. (1998). The experimental group took part in an introduction week comprising four 2-hour sessions which focused on familiarising final year students with important working materials and procedures on the ward (Nikendei et al. 2006). The introduction week was followed by a curriculum consisting of four weekly sessions (Monday to Thursday), each lasting 1 hour. All four sessions in any given week were related to a specific topic (Table 1) and adopted an interactive case-based learning approach (Nikendei et al. 2007a, b; Srinivasan et al. 2007). Tuesday sessions encompassed skills training and special training sessions such as ward round training (Nikendei et al. 2007a), communication training (Kraus et al. 2006) and teaching rounds on the ward. Wednesday and Thursday sessions aimed to promote electrocardiogram interpretation and pharmacological skills based on patient cases. Both experimental and comparison groups took part in a key-feature examination before and after their 3-month clerkship in internal medicine.

Table 1.  Final year student curriculum at the University of Heidelberg

Design of key-feature assessment

The range of problems sampled by the key-feature assessment was selected on the basis of the learning objectives for the internal medicine final year rotation, with general internal medicine as the content domain. The Swiss one-dimensional blueprint for internal medicine (Bloch & Burgi 2002) was used to weight sub-domains in internal medicine in order to ensure equivalent learning objectives for both experimental and comparison groups. The main body of the key-feature examination used in this study was derived and adapted from Fischer et al. (2005), whose key-feature items have already proven to be a reliable assessment tool in undergraduate internal medicine education. The development and construction of further key-feature problems followed the steps recommended by Page et al. (1995). The final version of the key-feature examination included 16 patient cases with 75 questions written in ‘short-answer write-in’ and ‘short menu’ formats (Table 2). Pre- and post-intervention key-feature examinations were identical. Each key-feature question was scored using a dichotomous scoring system, so participants could attain assessment scores between 0 and 75. A paper-and-pencil format was used and 90 minutes were allowed for the examination.
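The dichotomous scoring rule described above can be sketched as follows. This is a minimal illustration only: the question identifiers and answer format are hypothetical, and real key-feature scoring typically accepts several alternative correct answers per question.

```python
def score_key_feature_exam(candidate_answers, answer_key):
    """Dichotomous scoring: each question earns 1 point if the candidate's
    answer matches the key, otherwise 0.

    With 75 questions, total scores range from 0 to 75. Both arguments map
    a question id to an answer string (an illustrative layout, not the
    study's actual data format).
    """
    return sum(
        1
        for question, key in answer_key.items()
        if candidate_answers.get(question) == key
    )
```

Under this rule, a candidate matching 60 of the 75 keyed answers would attain a score of 60.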

Table 2.  Topics of key-feature problems used in the examination and their distribution over defined sub-domains of internal medicine (%)

Statistical analysis

In order to test for group differences in sex and age, a chi-square test and a Student's t-test for independent groups were conducted, respectively. Differences in key-feature assessment results were analysed using an ANOVA of attained scores with the between-subjects factor ‘Group’ (experimental vs. comparison) and the within-subjects factor ‘Time of Assessment’ (pre vs. post). Main effects were further investigated using post hoc tests. p values < 0.05 (one-tailed) were considered statistically significant. Data are presented as mean ± SD. Cronbach's alpha was calculated as a measure of key-feature examination reliability.
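The analysis described above is a 2 (Group, between-subjects) × 2 (Time of Assessment, within-subjects) mixed ANOVA. As a sketch of what this computes, the textbook sum-of-squares partition can be written out in plain Python. This is an illustration of the standard method under the stated design, not the software the authors actually used:

```python
def mixed_anova_2x2(pre_a, post_a, pre_b, post_b):
    """F statistics for a 2 (Group, between) x 2 (Time, within) mixed ANOVA.

    pre_a/post_a are paired pre/post scores for group A; pre_b/post_b for
    group B. Returns (F_group, F_time, F_interaction); all three F tests
    have df = (1, N - 2), where N is the total number of subjects.
    """
    groups = [(pre_a, post_a), (pre_b, post_b)]
    all_obs = [x for pre, post in groups for x in pre + post]
    n_subj = sum(len(pre) for pre, _ in groups)
    gm = sum(all_obs) / len(all_obs)                 # grand mean

    ss_total = sum((x - gm) ** 2 for x in all_obs)

    # Between-subjects partition: Group effect + subjects nested in groups.
    subj_means = [(p + q) / 2 for pre, post in groups for p, q in zip(pre, post)]
    ss_between_subj = 2 * sum((m - gm) ** 2 for m in subj_means)
    ss_group = 2 * sum(
        len(pre) * ((sum(pre) + sum(post)) / (2 * len(pre)) - gm) ** 2
        for pre, post in groups
    )
    ss_subj_within = ss_between_subj - ss_group      # error term for Group

    # Within-subjects partition: Time, Group x Time, and residual error.
    pre_all = [x for pre, _ in groups for x in pre]
    post_all = [x for _, post in groups for x in post]
    ss_time = n_subj * sum(
        (sum(t) / len(t) - gm) ** 2 for t in (pre_all, post_all)
    )
    ss_cells = sum(
        len(cell) * (sum(cell) / len(cell) - gm) ** 2
        for pre, post in groups
        for cell in (pre, post)
    )
    ss_interaction = ss_cells - ss_group - ss_time
    ss_error = ss_total - ss_between_subj - ss_time - ss_interaction

    df_err = n_subj - 2
    return (ss_group / (ss_subj_within / df_err),
            ss_time / (ss_error / df_err),
            ss_interaction / (ss_error / df_err))
```

With the study's 74 subjects, each resulting F would be referred to an F(1, 72) distribution, matching the degrees of freedom reported in the Results.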

Results

Key-feature examination performance

Table 3 presents the key-feature examination scores attained by the experimental and comparison groups pre- and post-intervention. An ANOVA was conducted for the attained assessment scores with the between-subjects factor ‘Group’ (experimental vs. comparison) and the within-subjects factor ‘Time of Assessment’ (pre vs. post). No significant effect of ‘Group’ was found (F(1,72) = 1.63; p = 0.206). A significant effect of ‘Time of Assessment’ (F(1,72) = 75.97; p < 0.001) was due to higher assessment scores in the post-evaluation as compared to the pre-evaluation. Furthermore, a significant ‘Group’ × ‘Time of Assessment’ interaction (F(1,72) = 10.21; p = 0.002) was observed (Figure 1). Post hoc tests revealed a higher assessment score in the experimental group as compared to the comparison group in the post-intervention evaluation (p = 0.028), whereas the two groups did not differ prior to their clinical rotation (p = 0.924; Table 3).

Figure 1. Assessment scores for experimental (EG; n = 49) and comparison (CG; n = 25) groups pre- and post-intervention (0–75 points); means and 95% confidence intervals. *Post hoc test revealed higher assessment scores in the experimental group as compared to the comparison group in the post-intervention evaluation (p = 0.028), although the two groups showed no difference in pre-intervention assessment scores (p = 0.924).


Table 3.  Results of key-feature examination for experimental (n = 49) and comparison group (n = 25) pre- and post-intervention (0–75 points)

Reliability of key-feature examination

Reliability was calculated using Cronbach's alpha: 0.71 for the pre-intervention assessment and 0.75 for the post-intervention assessment.
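For a dichotomously scored examination such as this one, Cronbach's alpha can be computed directly from the item-response matrix. A minimal pure-Python sketch, using population (biased) variances consistently; the example data in the usage note are hypothetical:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of examinees, each a list of item scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    where k is the number of items. Population variances are used
    consistently, so the n/(n-1) factors cancel.
    """
    k = len(item_scores[0])  # number of items per examinee

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([row[i] for row in item_scores]) for i in range(k)]
    total_var = variance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

For example, four examinees answering three dichotomous items as `[[1,1,1],[1,1,0],[0,1,0],[0,0,0]]` yield an alpha of 0.75.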

Discussion

This study investigated the effects of a supplementary curriculum on the clinical reasoning skills of final year medical students in internal medicine. Although the number of participants was limited, the students were well matched with respect to age, sex and pre-test outcome measures. Performance was assessed using a pre-post intervention key-feature examination. Both experimental and comparison groups showed improvement in clinical reasoning after completing their internal medicine rotation. However, despite identical assessment scores prior to rotation, the experimental group attained significantly higher examination scores than comparison students, who completed their clinical rotation without the additional final year curriculum.

The results show that both groups profited from final year education in terms of clinical reasoning skills. However, structured, case-based teaching and skills training sessions during final year education resulted in an incremental improvement in clinical reasoning outcomes as measured by the key-feature examination. This result was expected, given the findings of various studies indicating that systematic supervision of students during clinical rotations and feedback in the context of workplace learning are infrequent (Remmen et al. 2000; Van der Vleuten et al. 2000; Van der Hem-Stokroos et al. 2001; Daelmans et al. 2004; Howley & Wilson 2004). A structured final year student curriculum comprising interactive case-based seminars and skills training sessions would seem to guarantee at least a minimum of standardisation in final year medical education, to address important learning objectives, and to foster clinical reasoning skills. We speculate that more sustained facilitation of clinical reasoning skills might reinforce student motivation and engagement on the ward and in turn increase the benefits final year students gain from workplace learning. This assumption is supported by the observation that participation in problem-based learning curricula leads to improved ratings of student clerkship performance in undergraduate training (Richards et al. 1996; Whitfield et al. 2002).

We used the key-feature examination to assess the acquisition of reasoning skills (Bordage et al. 1995; Page & Bordage 1995; Page et al. 1995). Key features are defined as the critical steps in the resolution of a clinical problem, and key-feature questions focus on the steps at which examinees are most likely to make errors. They capture difficult aspects of practical problem identification and management (Page & Bordage 1995). Key-feature examinations assess competencies at the ‘knows how’ level of Miller's learning pyramid (Miller 1990). They have proved to be a reliable and valid approach to assessing clinical reasoning skills (Bordage et al. 1995; Page & Bordage 1995; Fischer et al. 2005) and a feasible tool for evaluating internal medicine clerkships (Hatala & Norman 2002). In this study, we achieved reliabilities of 0.71 and 0.75 (Cronbach's alpha), close to the examination reliability of 0.8 recommended for summative assessment tools (Page & Bordage 1995). It should be kept in mind that although key-feature examinations represent a feasible tool for the assessment of students in internal medicine clerkships, they have been shown to correlate only modestly with other measures of knowledge and clinical performance (Hatala & Norman 2002). Hence, it remains unclear whether a statistically significant increase in key-feature test scores translates into a significant increase in clinical performance. Studies are therefore required that address the effects of a final year curriculum such as the one applied here using performance measures such as the mini-CEX (mini clinical evaluation exercise; Norcini et al. 1995, 2003) and the OSCE (objective structured clinical examination; Harden & Gleeson 1979; Wass et al. 2001).

Although the experimental and comparison groups performed comparably well in the pre-intervention assessment, some inherent differences might exist between the two groups, given that they came from different medical schools. A further potential confounder is that a specific and structured educational intervention was compared with the non-specific and unstructured clinical education provided on the ward; the observed difference might therefore partly reflect the greater level of attention paid to the experimental group. Nevertheless, the final year student curriculum presented here has since been integrated into the standard education programme offered to final year students at our school. Given that it has proven practicable and appears generalisable, we hope to encourage its transfer to other institutions. Follow-up studies are required to assess the confidence and performance of final year students who have participated in such a curriculum upon becoming newly qualified doctors.

Conclusions

Supplementary, interactive, case-based seminars and skills training sessions are effective and significantly improve the clinical reasoning skills of final year students in the field of internal medicine as measured by key-feature written examinations. Further research is needed in order to confirm the effectiveness of such a curriculum using other performance measures.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Additional information

Notes on contributors

C. Nikendei

C. NIKENDEI, MD, the University of Heidelberg Medical Hospital, is responsible for skills-lab training, education of final year students, and the PAL programme at the Medical Hospital.

S. Mennin

S. MENNIN is Professor Emeritus, Department of Cell Biology & Physiology, and Assistant Dean Emeritus for Educational Development and Research, University of New Mexico School of Medicine.

P. Weyrich

P. WEYRICH, MD, the University of Tübingen Medical Hospital, is responsible for the medical education of 3–5th year medical students in Internal Medicine and the skills-lab training programme.

B. Kraus

B. KRAUS, MD, the University of Heidelberg Medical Hospital, is responsible for the medical education of final year students.

S. Zipfel

S. ZIPFEL, MD, is the professor and chairman, University of Tübingen Medical Hospital, Department of Psychosomatic Medicine and Psychotherapy.

M. Schrauth

M. SCHRAUTH, MD, the University of Tübingen Medical Hospital, is responsible for the standardized patient programme at the Faculty of Medicine and for the medical education of final year students.

J. Jünger

J. JÜNGER, MD, the University of Heidelberg Medical Hospital, is responsible for the medical education programme at the Medical Hospital.

References

  • Bloch R, Burgi H. The Swiss catalogue of learning objectives. Med Teach 2002; 24: 144–150
  • Bordage G, Brailovsky C, Carretier H, Page G. Content validation of key features on a national examination of clinical decision-making skills. Acad Med 1995; 70: 276–281
  • Bradley P, Bligh J. One year's experience with a clinical skills resource centre. Med Educ 1999; 33: 114–120
  • Clack GB. Medical graduates evaluate the effectiveness of their education. Med Educ 1994; 28: 418–431
  • Daelmans HE, Hoogenboom RJ, Donker AJ, Scherpbier AJ, Stehouwer CD, van der Vleuten CP. Effectiveness of clinical rotations as a learning environment for achieving competences. Med Teach 2004; 26: 305–312
  • Dolmans DH, De Grave W, Wolfhagen IH, van der Vleuten CP. Problem-based learning: Future challenges for educational practice and research. Med Educ 2005; 39: 732–741
  • Dolmans DH, Schmidt HG. What do we know about cognitive and motivational effects of small group tutorials in problem-based learning?. Adv Health Sci Educ Theory Pract 2006; 11: 321–336
  • Doucet MD, Purdy RA, Kaufman DM, Langille DB. Comparison of problem-based learning and lecture format in continuing medical education on headache diagnosis and management. Med Educ 1998; 32: 590–596
  • Fischer MR, Kopp V, Holzer M, Ruderich F, Junger J. A modified electronic key feature examination for undergraduate medical students: Validation threats and opportunities. Med Teach 2005; 27: 450–455
  • Fox RA, Ingham Clark CL, Scotland AD, Dacre JE. A study of pre-registration house officers’ clinical skills. Med Educ 2000; 34: 1007–1012
  • Goldacre MJ, Lambert T, Evans J, Turner G. Preregistration house officers' views on whether their experience at medical school prepared them well for their jobs: National questionnaire survey. BMJ 2003; 326: 1011–1012
  • Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979; 13: 41–54
  • Hatala R, Norman GR. Adapting the key features examination for a clinical clerkship. Med Educ 2002; 36: 160–165
  • Howley LD, Wilson WG. Direct observation of students during clerkship rotations: A multiyear descriptive study. Acad Med 2004; 79: 276–280
  • IMPP. Prüfungsergebnisse im Überblick. 2007. http://www.impp.de/index.php?id=9
  • Jünger J, Schafer S, Roth C, Schellberg D, Friedman Ben-David M, Nikendei C. Effects of basic clinical skills training on objective structured clinical examination performance. Med Educ 2005; 39: 1015–1020
  • Kern DE, Thomas PA, Bass EB, Howard DM. Curriculum development for medical education: A six-step approach. Johns Hopkins University Press, London 1998
  • Kneebone R, Kidd J, Nestel D, Asvall S, Paraskeva P, Darzi A. An innovative model for teaching and learning clinical procedures. Med Educ 2002; 36: 628–634
  • Kraus B, Briem S, Jünger J, Schrauth M, Weyrich P, Herzog W, Zipfel S, Nikendei C. Development and evaluation of a training scheme for final year students in internal medicine. GMS Zeitschrift für Medizinische Ausbildung 2006; 23: Doc 70
  • Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65: S63–S67
  • Nikendei C, Kraus B, Lauber H, Schrauth M, Weyrich P, Zipfel S, Jünger J, Briem S. An innovative model for teaching complex clinical procedures: Integration of standardised patients into ward round training for final year students. Med Teach 2007a; 29: 246–252
  • Nikendei C, Kraus B, Schrauth M, Weyrich P, Zipfel S, Herzog W, Jünger J. Integration of role-playing into technical skills training: A randomized controlled trial. Med Teach 2007b; 29: 956–960
  • Nikendei C, Kraus B, Schrauth M, Weyrich P, Zipfel S, Jünger J. An innovative model for final-year students’ skills training course in internal medicine: Essentials from admission to discharge. Med Teach 2006; 28: 648–651
  • Nikendei C, Zeuch A, Dieckmann P, Roth C, Schafer S, Volkl M, Schellberg D, Herzog W, Jünger J. Role-playing for more realistic technical skills training. Med Teach 2005; 27: 122–126
  • Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): A preliminary investigation. Ann Intern Med 1995; 123: 795–799
  • Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: A method for assessing clinical skills. Ann Intern Med 2003; 138: 476–481
  • Norman G. Research in clinical reasoning: Past history and current trends. Med Educ 2005; 39: 418–427
  • Page G, Bordage G. The medical council of Canada's key feature project: A more valid written examination of clinical decision-making skills. Acad Med 1995; 70: 104–110
  • Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad Med 1995; 70: 194–201
  • Remmen R, Denekens J, Scherpbier A, Hermann I, van der Vleuten C, Royen PV, et al. An evaluation study of the didactic quality of clerkships. Med Educ 2000; 34: 460–464
  • Remmen R, Derese A, Scherpbier A, Denekens J, Hermann I, van der Vleuten C, Royen PV, Bossqert L. Can medical schools rely on clerkships to train students in basic clinical skills?. Med Educ 1999; 33: 600–605
  • Remmen R, Scherpbier A, van der Vleuten C, Denekens J, Derese A, Hermann I, et al. Effectiveness of basic clinical skills training programmes: A cross-sectional comparison of four medical schools. Med Educ 2001; 35: 121–128
  • Richards BF, Ober KP, Cariaga-Lo L, Camp MG, Philp J, McFarlane M, Rupp R, Zaccaro DJ. Ratings of students' performances in a third-year internal medicine clerkship: A comparison between problem-based and lecture-based curricula. Acad Med 1996; 71: 187–189
  • Schmidt HG, Machiels-Bongaerts M, Hermans H, ten Cate TJ, Venekamp R, Boshuizen HP. The development of diagnostic competence: Comparison of a problem-based, an integrated, and a conventional medical curriculum. Acad Med 1996; 71: 658–664
  • Schrauth M, Weyrich P, Kraus B, Jünger J, Zipfel S, Nikendei C. Workplace learning in final year medical education: A comprehensive analysis of students’ expectancies and experiences. Z Evid Qual Gesundhwes 2009; 103: 169–174
  • Shumway JM, Harden RM. AMEE Guide No. 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach 2003; 25: 569–584
  • Srinivasan M, Wilkes M, Stevenson F, Nguyen T, Slavin S. Comparing problem-based learning with case-based learning: Effects of a major curricular shift at two institutions. Acad Med 2007; 82: 74–82
  • Van Der Hem-Stokroos HH, Scherpbier AJ, van Der Vleuten CP, De Vries H, Haarman HJ. How effective is a clerkship as a learning environment?. Med Teach 2001; 23: 599–604
  • Van der Vleuten CP, Scherpbier AJ, Dolmans DH, Schuwirth LW, Verwijnen GM, Wolfhagen HA. Clerkship assessment assessed. Med Teach 2000; 22: 592–600
  • Wass V. Ensuring medical students are ‘fit for purpose’. BMJ 2005; 331: 791–792
  • Wass V, van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet 2001; 357: 945–949
  • Whitfield CF, Mauger EA, Zwicker J, Lehman EB. Differences between students in problem-based and lecture-based curricula measured by clerkship performance ratings at the beginning of the third year. Teach Learn Med 2002; 14: 211–217
