Research Article

E-learning and deliberate practice for oral case presentation skills: A randomized trial

Pages e820-e826 | Published online: 30 Aug 2012

Abstract

Background: Oral case presentations are critical for patient care and student assessment. The best method to prepare early medical students for oral presentations is unknown.

Aim: We aimed to develop and evaluate a curriculum of on-line learning and deliberate practice to improve pre-clinical students’ case presentation skills.

Methods: We developed a web-based, interactive curriculum emphasizing conciseness and clinical reasoning. Using a waitlist control design, we randomly assigned groups of second-year students to receive the curriculum in December 2010 or in April 2011. We evaluated their presentations at three time points. We also examined the performance of an untrained class of students as a historical comparison.

Results: We evaluated 132 second-year medical students at three time points. After the curriculum, mean scores of the intervention students improved from 60.2% to 70.1%, while scores of the waitlist control students improved less, from 61.8% to 64.5% (p < 0.01 for between-group difference in improvement). Once all students had received the curriculum, mean scores for the intervention and waitlist control students rose to 77.8% and 78.4%, respectively, compared to 68.1% for the untrained comparison students (p < 0.0001 compared to all curriculum students).

Conclusion: An on-line curriculum followed by deliberate practice improved students’ oral presentation skills.

Introduction

The oral case presentation is an essential clinical skill. A competent presentation allows for efficient transfer of information between health care professionals (Donnelly 1997; Association of American Medical Colleges 2008). Such communication is increasingly important as providers perform more hand-offs and team-based care (Arora et al. 2005; Horwitz et al. 2008). Case presentations are also a principal tool for student assessment, and they serve as a proxy for clinical reasoning (Lingard et al. 2002; Wiese et al. 2002; Ottolini et al. 2007).

Medical students seek a rule-based structure for oral presentations. They are often confused or frustrated by what they perceive as capricious faculty expectations (Haber & Lingard 2001). Further, evaluators may use superficial criteria, such as a student's personality, to rate presentation skills (Kihm et al. 1991; Elliot & Hickam 1997).

Surveys show that evaluators have consistent expectations for the competent oral presentation (Green et al. 2009, 2011), but the medical education literature describes few effective curricula that reliably train students to meet these expectations. In one study, a clinical reasoning curriculum improved the oral presentation skills of third-year students on a medicine clerkship (Wiese et al. 2002). Other studies of interventions to improve presentation skills have shown mixed results for the outcome of summative end-of-clerkship evaluations; however, summative ratings may have poor accuracy and reliability for assessing specific skills (Green et al. 2005; Kim et al. 2005). No study has described a curriculum designed for pre-clinical students, although early introduction of clinical reasoning has been advocated (Kassirer 2010).

On-line learning, or e-learning, has a growing role in medical education because it is durable and re-usable, allows for standardized content delivery, requires interaction between each learner and the content, and permits asynchronous learning, such that learners may access the material at different times and places (Ruiz et al. 2006).

Once learners acquire basic knowledge, deliberate practice of skills has been shown to help them achieve mastery (Ericsson 2006). Key features of deliberate practice are: (1) engaging motivated learners in well-defined tasks via focused, repetitive practice; (2) rigorously measuring performance to provide feedback and encourage learners to correct errors through repeated practice; and (3) evaluating learners to certify mastery and promote them to the next task (McGaghie et al. 2011). In clinical medicine, deliberate practice has previously been applied mainly to procedural and physical exam skills, such as advanced cardiac life support (Wayne et al. 2008), central line insertion (Barsuk et al. 2009), and cardiac auscultation (Butter et al. 2010).

In this report, we describe the development of a curriculum of on-line learning and deliberate practice for oral case presentation skills. We also present outcomes data from a randomized trial demonstrating the efficacy of this curriculum in a cohort of second-year medical students.

Method

Phase 1: Curriculum design

Setting

The M2 Clinical Skills course is a year-long class on clinical diagnosis for second-year students at Northwestern University Feinberg School of Medicine. It consists of two main components: a bimonthly didactic portion in the standardized patient center, and a bimonthly clinical preceptorship with either an outpatient preceptor or a hospitalist. Prior to implementing this curriculum, oral presentations were introduced in readings and a lecture. Students were then required to do one oral presentation of a real patient to their clinical preceptor and encouraged to do more than one; additionally, they brought real patient presentations to didactic sessions to present to a fourth-year tutor on one or two occasions.

Case development

To isolate the task of case presentation from the task of data gathering, we created a set of cases for students to use as the basis for their presentations. We filmed 10 physicians interviewing standardized patients with a range of chief concerns, and we created written physical examination reports to accompany the videos. Chief concerns for all cases were common primary care symptoms which might relate to more than one organ system. We intended that six of these cases would be used as assessment cases and four would be used as teaching cases. An example of one of the video case histories can be found on-line at http://simulation.northwestern.edu/videos/RobertMorris.html.

Checklist development

After reviewing oral presentation evaluation tools and key components described in the published literature (Wiese et al. 2002; Green et al. 2005; Kim et al. 2005) and on-line (McGee 2012), the first and second authors (HH and TU) conducted a systematic checklist development process (Stufflebeam 2000) to create a general checklist for the key components of an oral case presentation (Table 1). This checklist was reviewed for content by the other co-authors and by several medical educators not involved in the study. Using this general checklist, we then developed content-specific checklists for each assessment case and circulated each of them to two or three clinician-educators across the country for feedback. The content-specific checklists emphasized clinical reasoning: students were required to include pertinent information and exclude irrelevant information to promote conciseness. For example, where the generic checklist states that the physical exam must describe all pertinent organ systems and exclude irrelevant ones, the pelvic pain checklist requires that particular abdominal and pelvic exam findings be included and that musculoskeletal and neurological findings be excluded. All items were scored dichotomously, as either “done” or “not done.”

Table 1  Northwestern oral presentations checklist.a

E-learning module

We created an interactive on-line module using e-learning software (Articulate Global Inc., New York, NY). The module introduces each section of an oral presentation: the opening statement, history of present illness, additional medical history, physical exam, and assessment and plan. For each section of the oral presentation, students learn general information by clicking on headers on the screen. The module also includes video interviews of two patients – one with vertigo and one with chest pain – along with their full physical exams. After learning general information about each section of a presentation, students watch a poor presentation of the vertigo case, then answer questions about what made that presentation poor. The correct answers are provided automatically with explanations, and students then see an ideal presentation. At the end of the module, using the chest pain case, students repeat the process of critiquing a poor presentation through quiz questions and then watching an ideal presentation. The module, which was designed to be completed in about two hours, required students to engage with all the material on a slide before progressing and to submit answers to all quiz questions before moving forward.

Deliberate practice

We selected two cases, one on shoulder pain and one on palpitations, as practice cases. The module directed students to spend no more than two hours constructing oral presentations for these cases and to bring these presentations to a didactic session. As part of their teaching requirement, 17 fourth-year students were oriented to the checklists for these two cases during a one-hour session. They were instructed to serve as “coaches” for the second-year students, giving feedback on specific content and style points on the checklists which the students had missed or completed. Second-year students presented a case to their coach for 5–10 min and then received 5–10 min of specific checklist-based feedback. After this first presentation, they were allowed one hour to watch their presentation on video and edit their second case before presenting that case to the same fourth-year coach for additional feedback.

Phase 2: Curriculum evaluation

Study design

We conducted a randomized trial from October 2010 to May 2011 to determine whether the e-learning and deliberate practice curriculum would improve second-year medical students’ oral case presentations. To provide all students the potential benefit of the intervention, we used a waitlist control group with crossover (Shadish et al. 2002) (Figure 1). For the primary outcome, we evaluated oral presentations on three occasions: at baseline, at the midpoint of the second year, and, after group crossover, on completion of the second year (final evaluation). We also assessed the performance of a group of untrained students on completion of their second year (May 2010) to use as a secondary, historical comparison.

Figure 1. Flow diagram from waitlist control study.

Participants

Upon matriculation, students are assigned randomly to one of four colleges for their Patient, Physician and Society curriculum. We randomly assigned two colleges (the intervention group) to learn oral presentation skills in winter and two colleges (the waitlist control group) to learn them in the spring (Figure 1). The Northwestern University Institutional Review Board approved this study, and all participants provided informed consent.

Measurement

For the implementation year of the curriculum, all second-year students were randomly assigned a case from the bank of six assessment cases at baseline, at midpoint (after the intervention group had received the curriculum), and finally for their end-of-second-year observed structured clinical examination (Figure 1). Chief concerns for the six cases were abdominal pain, edema, low back pain, pelvic pain, shortness of breath, and headache. Students could receive any of the six assessment cases at any time point; however, no student was permitted to present the same case twice.
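A minimal sketch of this assignment constraint follows (hypothetical code, not the randomization procedure actually used in the study): drawing three distinct cases per student from the six-case bank guarantees that no case is ever repeated for the same student.

```python
# Hypothetical sketch of the case-assignment constraint described above:
# each student receives three distinct cases (baseline, midpoint, final)
# drawn from the six-case bank, so no student presents the same case twice.
import random

CASES = ["abdominal pain", "edema", "low back pain",
         "pelvic pain", "shortness of breath", "headache"]

def assign_cases(student_ids, seed=2010):
    rng = random.Random(seed)
    # sample() draws without replacement, enforcing the no-repeat rule
    return {sid: rng.sample(CASES, 3) for sid in student_ids}

assignments = assign_cases(range(1, 167))  # 166 students in the implementation class
print(assignments[1])                      # three distinct chief concerns
```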

All students in both groups could use the generic checklist (Table 1) at all assessment time points.

The primary outcome was the improvement in performance on a checklist-based assessment between the baseline and midpoint evaluations. Checklists included all content-specific items derived from the general checklist (Table 1), as well as presentation timing and order. Depending on the content of the case, some content-specific items were not applicable, so the length of the checklists varied from 19 to 22 items. Evaluation scores were calculated as the percentage of checklist items completed correctly.
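As a minimal illustration of this scoring rule (a hedged sketch with hypothetical item names, not the study's actual checklist or scoring software), a dichotomously scored checklist reduces to a simple percentage:

```python
# Minimal sketch of the dichotomous checklist scoring described above.
# Item names are illustrative, not the study's actual checklist items.
checklist = {
    "opening_statement_concise": True,        # "done"
    "hpi_in_chronological_order": True,
    "pertinent_negatives_included": False,    # "not done"
    "irrelevant_exam_findings_excluded": True,
    "assessment_offers_differential": False,
}

def checklist_score(items):
    """Return the percentage of checklist items rated 'done'."""
    return 100.0 * sum(items.values()) / len(items)

print(f"Score: {checklist_score(checklist):.1f}%")  # 60.0% for this toy example
```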

Student presentations were video-recorded, and videos were rated by a group of eight trained fourth-year students who had completed an orientation to the checklist and calibrated their ratings of at least two cases with those of the first author. Raters were paid per case completed. They were blinded to the training status of students but not to the timing of the evaluation.

As part of a general review of the M2 Clinical Skills course, we surveyed intervention students after the module and after the deliberate practice using an on-line questionnaire. The survey included Likert-type questions with 1 = strongly disagree or poor and 5 = strongly agree or excellent.

We also compared the final evaluation scores of the intervention and waitlist control students against a historical comparison group, the second-year students from the year preceding the curriculum implementation. Using a random number generator, these students were assigned a single random assessment case in the spring of their second year. For each case, a sample of 50–100% of the historical control checklists was re-scored by a second rater to assess inter-rater reliability.

Data analysis

Inter-rater reliability was estimated by calculating Cohen's kappa averaged across all content-specific items for each of the six assessment case checklists. Using SAS Enterprise Guide 4.3 software (SAS Institute, Inc., Cary, NC), we compared the improvement from baseline to post-test between the intervention and control groups with a t-test. We also used a t-test to compare the average final-assessment scores of the historical comparison group and the implementation class. We used a two-sided alpha of 0.05 to indicate significance. Sample size was limited by class size, but the group sizes provided 80% power to detect a 0.5 standard deviation (SD) difference between groups.
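For readers who want to reproduce this style of analysis, the sketch below translates the same steps into Python with synthetic placeholder data (an assumption on our part; the authors used SAS Enterprise Guide 4.3): item-wise Cohen's kappa averaged across a checklist, an independent-samples t-test on baseline-to-midpoint improvement, and the power available to detect a 0.5 SD difference with the reported group sizes.

```python
# Hedged sketch of the analysis steps, translated to Python; the study itself
# used SAS Enterprise Guide 4.3. All data below are synthetic placeholders.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.power import TTestIndPower

rng = np.random.default_rng(0)

# Inter-rater reliability: average Cohen's kappa across checklist items.
# rater_a / rater_b hold 0/1 ("not done"/"done") ratings, shape (students, items).
rater_a = rng.integers(0, 2, size=(50, 20))
rater_b = rater_a.copy()
flip = rng.random(rater_a.shape) < 0.1          # simulate ~10% disagreement
rater_b[flip] = 1 - rater_b[flip]
kappas = [cohen_kappa_score(rater_a[:, j], rater_b[:, j])
          for j in range(rater_a.shape[1])]
print(f"Mean kappa across items: {np.mean(kappas):.2f}")

# Primary outcome: between-group comparison of baseline-to-midpoint improvement.
improvement_intervention = rng.normal(9.9, 15.9, size=67)   # percentage points
improvement_control = rng.normal(2.7, 15.7, size=65)
t, p = ttest_ind(improvement_intervention, improvement_control)
print(f"t = {t:.2f}, two-sided p = {p:.4f}")

# Power to detect a 0.5 SD difference with the available group sizes.
power = TTestIndPower().power(effect_size=0.5, nobs1=67, ratio=65/67, alpha=0.05)
print(f"Approximate power: {power:.2f}")
```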

Results

In the implementation class of 166 students, seven students did not provide consent for data collection, 19 did not complete one of the cases to which they were randomized, three had at least one video unavailable for technical reasons, and five took a leave of absence before the last case, leaving 65 students in the waitlist control group and 67 students in the intervention group for assessment of the primary outcome (Figure 1). Seventy-seven students in the intervention group provided anonymous feedback about the curriculum.

There were 167 students in the historical comparison group. Two students took a leave of absence, 17 students did not provide consent for their data to be used in a study, four students had unavailable videos for technical reasons, and one student did not present the case assigned, leaving 143 students available to assess.

Inter-rater reliability across the content-specific portions of the checklists was substantial, ranging from 0.61 for the pelvic pain case to 0.82 for the shortness of breath case.

At the midpoint evaluation, when only the intervention group had received the curriculum, mean scores improved from 60.2% to 70.1% (9.9% improvement, SD 15.9%) in the intervention group and from 61.8% to 64.5% (2.7% improvement, SD 15.7%) in the control group (Figure 2). This 7.2% difference in improvement between the intervention and control groups was statistically significant (p < 0.01).

Figure 2. Overall scores for intervention, waitlist control, and historical comparison groups at three time points. Notes: *p<0.01 for difference in improvement between waitlist control and intervention from baseline to midpoint evaluation. **p<0.0001 for difference between historical comparison and implementation class at final evaluation.

Students who received the curriculum at any point scored significantly higher on their final evaluation than did students in the historical comparison group. At the final evaluation in May, the intervention group had an overall mean of 77.8% (SD 11.9%) and the waitlist control group a mean of 78.4% (SD 10.7%; p = NS for the between-group difference). Both scores were significantly higher than the historical comparison mean of 68.1% (SD 11.2%, p < 0.0001 for comparison to both implementation groups) (Figure 2).

Seventy-five students responded to questions about the curriculum following the module and the deliberate practice (Table 2). After the coaching, students agreed more strongly that the curriculum made them feel prepared to give oral presentations (mean 4.3, SD 0.49). With 1 = poor and 5 = excellent, students rated the overall curriculum a mean of 4.0 (SD 0.81, very good), preferring the deliberate practice (mean 4.1, SD 0.79) to the module (mean 3.5, SD 1.2).

Table 2  Intervention students’ ratings of e-learning module and deliberate practice

Discussion

Using the contemporary educational techniques of on-line learning and deliberate practice, we developed a comprehensive, standardized curriculum that significantly improved students’ oral case presentation skills. This finding appears robust because we demonstrated the curriculum's benefit at two points: at the midpoint of the second year, through the waitlist control design, and at the final evaluation, through the use of a historical comparison. In fact, by the middle of the second year, our intervention students exhibited presentation skills superior to those of end-of-second-year historical controls.

This study expands on prior research about improving oral presentations (Wiese et al. 2002; Green et al. 2005; Kim et al. 2005). It is unique in its pedagogy, including an on-line component; its assessment methodology, using reliable checklist-based evaluation tools; and its audience, students who have not yet begun clinical rotations.

The curriculum's significant on-line component can reduce faculty lecture time, ensure the content will be standardized and reusable, and allow students in diverse locations to complete this portion. The assessment of students was done using rigorously developed and reliable checklists, which helps to address students’ worries that their oral presentation evaluations are exclusively subjective. The checklists may further save faculty time by allowing non-physician “coaches,” such as our fourth-year students, to provide standard feedback based on the clinical reasoning of the faculty case writers.

The success of the curriculum supports the idea that early medical students are ready to learn clinical reasoning. We required students to use clinical reasoning to select the pertinent case content to present and to offer a differential diagnosis and argument. Because the history and physical examination were provided, we were able to separate the skill of presenting from that of data gathering. Additionally, our students demonstrated that they could generalize their presentation skills, using practice on two topics (shoulder pain and palpitations) to improve their performance on entirely different topics (e.g., leg edema). This helps demonstrate that success at the case presentation relies on process skills, not only content knowledge (Eva et al. 1998). We attribute the continued improvement of the intervention group between the midpoint and final assessments to further gains in content knowledge as a result of completing their basic second-year pathophysiology course and clinical preceptorship.

This study lends support to the use of the “flipped” or “inverted” classroom in medical education, in which the on-line module and deliberate practice are part of an integrated pedagogical approach that reserves classroom time for active learning (Prober & Heath 2012). Indeed, a meta-analysis of studies of on-line learning found that blended learning, in which on-line activities are combined with face-to-face ones, is significantly better than face-to-face learning alone, whereas on-line courses alone are not (US Department of Education 2010).

The study has several limitations. It was performed at only one institution. The curriculum also requires one-on-one coaching for deliberate practice, although, as noted, non-physicians with medical experience can serve successfully as coaches. Due to time constraints, only a single case could be used for each student at each evaluation point; the known limitation of single-case testing, however, applies more to an individual student's score than to the scores of a whole population. The end-of-year evaluation was summative for the implementation class but formative for the historical comparison group; however, the M2 Clinical Skills course is pass–fail, and we believe that students put their best effort into all evaluations. Lastly, contamination of the waitlist control group could have occurred if its members were exposed to the curriculum content by peers or by accessing the on-line module. Contamination would have biased the study toward the null hypothesis, so if it occurred, the true effect of the curriculum might be even stronger than we observed.

Next steps for this curriculum include setting mastery performance standards for each case and determining whether students who do not reach the standard will be able to do so with additional deliberate practice. Second, we would like to show sustained improvement in students’ oral presentation skills as they move to the clinical environment and present actual patients. Finally, we need to conduct faculty development so that our teaching attending physicians are well poised to reinforce the lessons of the curriculum.

Conclusion

In summary, this study demonstrates that a curriculum of on-line learning followed by deliberate practice improved performance on a reliable assessment of oral case presentation skills. The curriculum is sustainable, potentially exportable, and can be used to ensure that early medical students demonstrate competence in the case presentation before moving to the clinical environment.

Acknowledgments

The authors wish to thank Allison Hammer and Marsha Kaye from the Augusta Webster Office of Medical Education at Northwestern University Feinberg School of Medicine for their work in the implementation of this curriculum. Drs Diane Wayne and Joe Feinglass of the Department of Medicine at Northwestern provided essential guidance in the study design and analysis. The faculty and staff of Northwestern University's Searle Center for Teaching Excellence assisted in the initial development of the project. Drs Katie Kinner, Jillian Havey Swary, Jill Larson, Jennifer Kaplan, Robert Eilers, Manali Bhave, Meghan Bhave, and Daniel Fuchs were invaluable as our fourth-year student raters.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

This study was supported by a grant from the Barney Family Foundation. Dr McGaghie's contribution was supported in part by the Jacob R. Suker, MD, Professorship in Medical Education at Northwestern University and by grant UL 1 RR025741 from the National Center for Research Resources, National Institutes of Health. The National Institutes of Health had no role in the preparation, review, or approval of the manuscript.

References

  • Arora V, Johnson J, Lovinger D, Humphrey HJ, Meltzer DO. Communication failures in patient sign-out and suggestions for improvement: A critical incident analysis. Qual Saf Health Care 2005; 14: 401–407
  • Association of American Medical Colleges. 2008. Recommendations for clinical skills curricula for undergraduate medical education. [Accessed 13 September 2011] Available from https://www.aamc.org/download/130608/data/clinicalskills_oct09.qxd.pdf.pdf
  • Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med 2009; 169: 1420–1423
  • Butter J, McGaghie WC, Cohen ER, Kaye M, Wayne DB. Simulation-based mastery learning improves cardiac auscultation skills in medical students. J Gen Intern Med 2010; 25: 780–785
  • Donnelly WJ. The language of medical case histories. Ann Intern Med 1997; 127: 1045–1048
  • Elliot DL, Hickam DH. How do faculty evaluate students’ case presentations? Teach Learn Med 1997; 9: 261–263
  • Ericsson KA. The influence of experience and deliberate practice on the development of superior expert performance. The Cambridge handbook of expertise and expert performance, KA Ericsson, N Charness, PJ Feltovich, RR Hoffman. Cambridge University Press, New York, NY 2006; 683–703
  • Eva KW, Neville AJ, Norman GR. Exploring the etiology of content specificity: Factors influencing analogic transfer and problem solving. Acad Med 1998; 73: S1–S5
  • Green EH, Hershman W, DeCherrie L, Greenwald J, Torres-Finnerty N, Wahi-Gururaj S. Developing and implementing universal guidelines for oral patient presentation skills. Teach Learn Med 2005; 17: 263–267
  • Green EH, DeCherrie L, Fagan MJ, Sharpe BA, Hershman W. The oral case presentation: What internal medicine clinician-teachers expect from clinical clerks. Teach Learn Med 2011; 23: 58–61
  • Green EH, Durning SJ, DeCherrie L, Fagan MJ, Sharpe B, Hershman W. Expectations for oral case presentations for clinical clerks: Opinions of internal medicine clerkship directors. J Gen Intern Med 2009; 24: 370–373
  • Haber RJ, Lingard L. Learning oral presentation skills: A rhetorical analysis with pedagogical and professional implications. J Gen Intern Med 2001; 16: 308–314
  • Horwitz L, Moin T, Krumholz HM, Wang L, Bradley EH. Consequences of inadequate sign-out for patient care. Arch Intern Med 2008; 168: 1755–1760
  • Kassirer JP. Teaching clinical reasoning: Case-based and coached. Acad Med 2010; 85: 1118–1124
  • Kihm JT, Brown JT, Divine GW, Linzer M. Quantitative analysis of the outpatient oral case presentation: Piloting a method. J Gen Intern Med 1991; 6: 233–236
  • Kim S, Kogan JR, Bellini LM, Shea JA. A randomized-controlled study of encounter cards to improve oral case presentation skills. J Gen Intern Med 2005; 20: 743–747
  • Lingard L, Garwood K, Schryer CF, Spafford MM. A certain art of uncertainty: Case presentation and the development of professional identity. Soc Sci Med 2002; 56: 603–616
  • McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011; 86: 706–711
  • McGee S, 2012. Oral presentation guidelines. [Accessed 18 May 2012] Available from https://depts.washington.edu/medclerk/drupal/pages/Oral-Presentation-Guidelines
  • Ottolini MC, Cuzzi S, Tender J, Coddington DA, Focht C, Patel KM, Greenberg L. Decreasing variability in faculty ratings of student case presentations: A faculty development intervention focusing on reflective practice. Teach Learn Med 2007; 19: 238–243
  • Prober CG, Heath C. Lecture halls without lectures – A proposal for medical education. New Engl J Med 2012; 366: 1658–1659
  • Ruiz J, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med 2006; 81: 207–212
  • Shadish WR, Cook TD, Campbell DT. Experimental and quasi-experimental designs for generalized causal inference. Houghton Mifflin, Boston, MA 2002
  • Stufflebeam DL, 2000. Guidelines for developing evaluation checklists: The Checklists Development Checklist (CDC). [Accessed 11 July 2011] Available from http://www.wmich.edu/evalctr/archive_checklists/guidelines_cdc.pdf
  • US Department of Education, 2010. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. [Accessed 25 May 2012] Available from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
  • Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case-control study. Chest 2008; 133: 56–61
  • Wiese J, Varosy P, Tierney L. Improving oral presentation skills with a clinical reasoning curriculum: A prospective controlled study. Am J Med 2002; 112: 212–218
