Web Paper

Large group high-fidelity simulation enhances medical student learning

Corey Heitz, Ashley Brown, James E. Johnson & Michael T. Fitch
Pages e206-e210 | Published online: 21 Jul 2009

Abstract

Background: Previous work shows feasibility for large group high-fidelity simulation with correlation to basic science in the preclinical curriculum.

Aims: This project studies whether large group simulation leads to enhanced basic science learning.

Methods: This was an educational performance study before and after high-fidelity simulation for first-year medical students. Basic neuroscience concepts were reinforced with simulation, and pretesting and posttesting were analysed along with summative exam results. The number correct was compared on a contingency table using the Mantel–Haenszel chi-squared test, and same-student correlation was accounted for with a Generalized Estimating Equations model.

Results: The study included 112 students; three were excluded for missing data. Students showed statistically significant improvement on two of the four questions, and a nonsignificant improvement or equivalent performance on two questions. Students were significantly more likely to get all four responses correct on the posttest than on the pretest. Summative testing 11 days later had >80% correct responses for three factual recall questions and 58% correct responses for a single knowledge application question.

Conclusions: Simulation is an effective teaching method for preclinical basic science education. Students demonstrated significant improvements after participating in a live interactive simulation scenario.

Introduction

A major challenge faced by undergraduate medical students is the integration of basic science knowledge with patient evaluation and management skills. The use of high-fidelity simulation as a clinical teaching method in medical education has become an important part of post-graduate physician learning at many institutions (Bond & Spillane 2002; Morgan & Cleave-Hogg 2002; Vozenilek et al. 2006). Medical student participation in such programs has been reported primarily during the clinical portions of the curriculum and in small-group formats (Gordon 2000; Weller 2004; McMahon et al. 2005; MacDowall 2006; Morgan et al. 2006). A simulation environment allows medical students to experience the stress and responsibility of acute care without risk to patients or themselves. Medical students who have participated in such simulations generally consider the experience very valuable (Gordon 2000; Weller 2004).

Some institutions have begun to examine the use of simulation to aid basic science teaching among preclinical medical students (Euliano 2001; Tan et al. 2002; Koniaris et al. 2004; Gordon et al. 2004, 2006; Fitch 2007). Gordon et al. (2006) found that brief exposure to a simulation of myocardial infarction with congestive heart failure enhanced first-year medical students’ basic science knowledge of cardiovascular physiology. Other researchers have focused on topics such as shock physiology (Koniaris et al. 2004) and cardiovascular physiology (Euliano 2001; Tan et al. 2002). Many of these studies, however, have been conducted with small groups of participants, and these faculty-intensive exercises can be time-consuming and expensive. This study uses a live, large-group interactive simulation scenario to enhance learning of basic neuroscience topics (Fitch 2007). The primary goal of this study was to determine whether the use of high-fidelity simulation can enhance medical student learning.

Methods

Study design

Study participants for this before-and-after study were first-year medical students taking part in the neuroscience portion of the basic science curriculum at Wake Forest University School of Medicine. The 112 participants were divided into two groups of approximately 56 students each for a live, interactive emergency medicine simulation presentation lasting 90 min. Dividing the class served to enhance the potential for interactivity among individuals, and the event was coordinated with the students’ schedule at a time when the class was already split into two groups for another component of the curriculum. Each of the two presentations was identical, with the same faculty and physician actors presenting a scripted scenario with predetermined end-points. This study was reviewed by our Institutional Review Board and approved as an exempt study with waiver of informed consent.

The live simulation presentation used a clinical scenario to highlight basic neuroscience concepts, as previously described in a pilot feasibility study (Fitch 2007). These autonomic nervous system topics had already been presented to participants several days earlier in a traditional lecture setting by a PhD faculty member from the Department of Physiology and Pharmacology who was unaware of the ongoing study. A Laerdal SimMan™ was transported from the simulation center to the medical school lecture hall, where a prerecorded EMS radio call announced the arrival of the fully clothed simulation mannequin. Resident physician actors portrayed EMS providers, nurses, and family members. Student volunteers ran the case as emergency physicians, and patient management decisions were guided by class input. The clinical scenario was enhanced with group discussion of the relevant basic science mechanisms underlying the autonomic nervous system, neurotransmitters, receptors, and neuropharmacology. Video highlights at http://www.EmergencySimulation.com demonstrate the teaching intervention from these 90-minute interactive simulation sessions. The study author leading the simulation session and group discussions for both sessions is a practicing emergency physician and a Diplomate of the American Board of Emergency Medicine with a background and graduate-level training in basic neuroscience research.

Measurements

The primary outcome of this study was performance on a four-question multiple-choice pretest compared to a posttest completed by students after participating in a simulation session. During the class session preceding the simulation session, participants were given the four-question multiple-choice pretest by one of the faculty study authors, addressing basic neuroscience concepts that had been covered in class 2 days earlier by a faculty member not involved in this study. Answers to these questions were not provided to the students, and they were not aware that a posttest would later be administered. Three days later (upon return from a scheduled holiday weekend), the simulation teaching session was presented to the students. Immediately following this session, the same four-question test was administered as a posttest by a second faculty study author, and a verbal discussion of correct answers was conducted at that time. Pre- and posttesting was conducted through a real-time, wireless electronic response system on a web server accessed via laptop computer by each participant, and responses were analyzed anonymously. Participants also filled out anonymous feedback forms. This feedback was measured by participants’ level of agreement with each of five statements using a five-point Likert scale, with one equal to “Disagree” and five equal to “Agree”, and an accompanying free-text comments section. These questions collected student feedback regarding the correlation of the simulation to basic science concepts, the use of simulation in the presentation, facilitator effectiveness, enhanced understanding of basic neuropharmacology, and the value of simulation in the basic science curriculum.

Data analysis

The number correct for pre- and posttesting was compared on a contingency table using the Mantel–Haenszel chi-squared test. Correlation between pre- and posttest scores from the same groups of students was accounted for using a Generalized Estimating Equations (GEE) model in the SAS GENMOD procedure (Hanley et al. 2003). This model was used to compare whether or not all four questions, as well as each individual question, were answered correctly. Odds ratios were calculated to determine the strength of the relationship.
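For readers who wish to reproduce this style of analysis, the following is a minimal illustrative sketch in Python using statsmodels, approximating the binomial GEE with an exchangeable working correlation that the SAS GENMOD procedure fits; it is not the authors' actual code, and the data layout and column names (student_id, phase, correct) are hypothetical, with placeholder data standing in for the real responses.

    # Sketch only: approximates the paper's SAS GENMOD GEE analysis in Python.
    # Column names (student_id, phase, correct) and the data are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # One row per student per test phase: correct = 1 if the response was
    # right, phase = 0 for the pretest and 1 for the posttest.
    np.random.seed(0)
    df = pd.DataFrame({
        "student_id": np.repeat(np.arange(109), 2),
        "phase":      np.tile([0, 1], 109),
        "correct":    np.random.binomial(1, 0.7, size=218),  # placeholder
    })

    # Binomial GEE with an exchangeable working correlation accounts for
    # the correlation between pre- and posttest answers from one student.
    model = smf.gee(
        "correct ~ phase",
        groups="student_id",
        data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    result = model.fit()

    # Exponentiating the phase coefficient gives the posttest-vs-pretest
    # odds ratio and its 95% confidence interval.
    print(np.exp(result.params["phase"]))
    print(np.exp(result.conf_int().loc["phase"]))

Fitting the same model once per question, and once for the all-four-correct indicator, would yield the per-question and overall odds ratios reported below.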

Results

A total of 112 participants took the summative examination and were therefore included in the anonymous performance data evaluated in this study. Data analysis of the pre- and posttest data was performed using data from 109 participants; three participants were excluded from the analysis due to missing pretest or posttest data.

Participants showed improvement from the pretest to the posttest, and each of the four questions had >90% correct responses on the posttest for all 109 students. Statistically significant improvements were found on two of the four questions between pre- and posttesting (Figure 1). Question 1 improved from 84 correct responses (77.1%) to 108 correct (99.1%), a statistically significant difference (Odds Ratio [OR] 32.9; 95% confidence interval [CI] 4.8–226.2). Question 4 improved from 85 correct responses (78.1%) to 104 correct (95.4%), also a statistically significant difference (OR 5.7; CI 2.3–14.3). Question 2 had more correct responses on the posttest (104 correct, 95.4%) than on the pretest (102 correct, 93%), but these findings did not reach statistical significance (OR 1.4; CI 0.6–3.4). Question 3 had the same number of correct answers (99 correct, 90.8%) on the pretest and posttest. Students were significantly more likely to get all four responses correct on the posttest than on the pretest (OR 4.03; CI 2.31–7.03), with 83% of participants answering all four questions correctly on the posttest, as compared to only 55% of participants on the pretest (Figure 2).
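As a rough consistency check (a crude, unadjusted calculation rather than the GEE-adjusted estimates the authors report), the Question 1 odds ratio can be reconstructed directly from the counts above:

$$\mathrm{OR}_{\text{crude}} = \frac{108/(109-108)}{84/(109-84)} = \frac{108 \times 25}{1 \times 84} \approx 32.1,$$

close to the GEE-adjusted OR of 32.9; the same calculation for Question 4 gives $(104 \times 24)/(5 \times 85) \approx 5.9$, close to the reported 5.7.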

Figure 1. Number of correct answers on pretest before simulation and immediately after the event for the posttest. Statistically significant differences (*) were found for Question 1 (OR 32.9; 95% CI 4.8–226.2) and Question 4 (OR 5.7; CI 2.3–14.3). Questions 2 and 3 did not demonstrate statistically significant improvements.


Figure 2. Increases in total number of correct answers. Students were significantly more likely to get all four questions correct on the posttest than on the pretest (OR 4.03; CI 2.31–7.03).


The anonymous feedback collected immediately following the posttest revealed that all participants gave either a 4 or 5 out of 5 in agreement that the concepts presented in the case correlated well with the basic science concepts they had learned in class, and 97% similarly agreed that the simulation presentation enhanced their understanding of the basic science concepts presented (Figure 3).

Figure 3. Feedback summary of participants’ self-perception of learning. (1 = Disagree; 5 = Agree).


Discussion

High-fidelity simulation in a large-group setting has been demonstrated to be a feasible way to teach basic science concepts (Fitch 2007), and incorporates practices based on both hierarchical and contextual theories of learning (Pring 1970; Charlin et al. 2000; Harden 2000). This method avoids “inert knowledge” (disconnected information or concepts) by integrating the basic science within a clinical scenario, an approach supported by trans-disciplinary approaches to teaching that avoid artificial disconnects between subjects within medical education (Whitehead 1929; Harden 2000). As has been reported in other studies of simulation, feedback from participants in this study was overwhelmingly positive, and 97% of students reported a 4 or 5 out of 5 in agreement that our experience enhanced their learning of basic science concepts. This perception by the students was supported by the data analyzed in pre- and posttesting and by student performance on a summative examination.

The four pre- and posttest questions were drafted by two of the faculty study authors to evaluate basic factual material that participants were expected to have learned as part of the traditional lecture-based curriculum. Thus, it was anticipated that participants would likely perform well as a group on these questions. This was, in fact, the case, as just over half of the participants correctly answered all four questions on the pretest.

Statistically significant improvement from pre- to posttest was seen on questions 1 and 4. Both of these questions required participants to understand the difference between nicotinic and muscarinic acetylcholine receptors, a possible source of confusion for students that may have made these two questions similarly challenging. The significant improvement seen on these questions demonstrates the effectiveness of the live, large group simulation exercise in enhancing participant learning of these basic concepts in neurobiology.

No significant improvement was seen for overall group performance on questions 2 and 3, although a trend toward improvement was seen for question 2. This lack of statistical significance may have been because correct responses to these questions were above 90% on pretesting, limiting our ability to discriminate pre- and posttesting differences. In contrast, correct responses to questions 1 and 4 were less than 80% on pretesting, allowing room for improvement to be demonstrated on the posttest.

The primary study outcome was the comparison of student performance on the pretest with performance on the posttest administered immediately after the simulation session, and participants were significantly more likely to get all four posttest questions correct after experiencing the simulation. Additional data on student educational performance were available from a summative examination given 11 days after the posttest and simulation session, which included topics in basic psychiatry (e.g., personality disorders, antidepressant medications) and basic neuroscience (e.g., neurotransmitters, dopaminergic pathways, beta-adrenergic receptors). Our study design does not allow direct comparison of student performance on the posttest to the summative examination, but students did perform well on each of the three factual recall questions on the examination that related to the topic of our simulation scenario but were distinct from the pre- and posttest questions (83, 86, and 90% correct responses for these three questions). This compares favorably with overall student performance on this summative examination of 119 questions, where the average score on all of the other questions was 84% correct.

However, this level of performance did not extend to a single knowledge application question written by one of the faculty study authors, as only 58% of participants correctly answered this question. This question was unique in that participants had to recall information learned during the simulation session and then apply it to a completely different clinical scenario involving a patient exposed to a nerve gas agent. This difference in performance between the two question types suggests that our large-group simulation exercise may have been most effective at reinforcing basic factual information and less effective at teaching problem-solving skills applicable to a completely novel patient situation.

While simulation is often used to apply theoretical knowledge to real-life problems, its use in assisting learners to recall the factual information associated with basic science learning remains an area of investigation. The current study was therefore designed using fact-based multiple-choice questions as one measure of student performance in a basic science course, and results suggest enhanced recall of such material following simulation teaching. Future studies may be designed to evaluate problem-solving skills that apply such basic science information.

Limitations

The multiple-choice questions chosen for pre- and posttesting were intentionally designed to test basic information that students had already learned in the regular curriculum and that was reinforced using emergency simulation. Therefore, the number of correct responses on pretesting was already very high, making a statistically significant improvement more difficult to demonstrate. Greater score improvement might have been seen had we tested more advanced knowledge to which students had not been exposed prior to the session. In addition, content specificity could have been addressed by using a larger number of test questions, which would have limited construct underrepresentation, although this was constrained by the time available for administering the pre- and posttest in this study.

A control group was not used in this before-and-after study design, because this simulation presentation is an established part of our curriculum and completely withholding this highly rated experience from one group of students was not an available option. Therefore, we are limited to evaluating student performance on pretesting before the intervention and posttesting immediately afterwards to determine learning effectiveness. It is possible that students may have discussed the pretest and/or studied independently during the short time (3 days) between the pretest and the simulation event. Efforts were taken to minimize the limitations of the testing format: students were not given the answers to the pretest questions and were not informed that a posttest would be administered 3 days later, after a holiday weekend. These factors decrease the possibility that improved performance was simply a result of being tested a second time. While this study design does not allow us to completely exclude the possibility that a portion of the student improvement was due to repeated content exposure and unrelated to the simulation format itself, we observed that the enthusiastic response to the simulation aspect of this teaching experience led to greater student engagement in the learning process. While it is possible that a large-group interactive lecture without simulation would lead to the same kind of improvements demonstrated here, the participant feedback suggests that this type of interactive simulation can reinforce key concepts in a way that is novel and highly rated by students. We feel that this innovative use of simulation provides educational value even if it is equivalent to other traditional lecture or large-group discussion-based review sessions.

Our interpretation of student performance on the summative examination is limited, as the three factual questions were not specifically designed to test the information presented in the scenario but instead were basic knowledge recall questions related to the topic we presented. Whether the students’ performance on those questions was a direct result of participation in the simulation event itself, or whether it was influenced by their own motivation to perform well on an examination, was not the primary aim of this study and cannot be determined from the available data. The reasons for the difference in performance between the three factual recall questions on that test and the one knowledge application question written by the study author are also unclear in the current study format. Further investigation will be needed to determine whether this difference was due to question design, inadequate presentation of the material, or confusion about a specific concept.

Conclusions

As technology advances, it will be increasingly utilized in the classroom setting. This article describes a simulation-based learning exercise in which students were tested before and after a live simulation scenario in which they were active participants, as well as on a scheduled summative examination. The students not only felt that the simulation experience correlated well with basic science concepts, but also showed statistically significant improvement from pretest to posttest. Our results show that this type of learning exercise may provide an alternative to “typical” lecture-style education. The concepts presented during our simulation session improved student test performance immediately and may have facilitated performance on an examination 11 days later. Future studies are planned to identify whether students with different learning styles benefit differently from this type of live, interactive simulation experience.

Acknowledgements

M.T. Fitch received faculty funding support from the Brooks Scholars Program in Academic Medicine at the Wake Forest University School of Medicine and A. Brown received funding from NIH 2T35 DK07400. We thank Rebecca Neiberg, Biostatistician, Division of Public Health Sciences at Wake Forest University Health Sciences for assistance with statistical analyses.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.

Additional information

Notes on contributors

Corey Heitz

COREY HEITZ, MD, MS, was a resident physician at Wake Forest University Baptist Medical Center when this study was completed. He is currently a Faculty Development Fellow in Emergency Medicine at Wright State University where he works on academic projects and educational research.

Ashley Brown

ASHLEY BROWN, BS, is a medical student at Wake Forest University School of Medicine. She worked on this simulation project as part of a summer research program funded by NIH 2T35 DK07400.

James E. Johnson

JAMES E. JOHNSON, PhD, is an Associate Professor of Neurobiology and Anatomy at Wake Forest University School of Medicine. He serves as the Director of the Anatomical Resource Clinical Training Center and the Anatomical Bequeathal Program.

Michael T. Fitch

MICHAEL T. FITCH, MD, PhD, is an Assistant Professor of Emergency Medicine and Director of the Emergency Department Simulation Program at Wake Forest University School of Medicine. He combines his research background in neuroscience with emergency medicine and simulation based teaching by creating and coordinating interactive educational activities.

References

  • Bond WF, Spillane L. The use of simulation for emergency medicine resident assessment. Acad Emerg Med 2002; 9: 1295–1299
  • Charlin B, Tardif J, Boshuizen HP. Scripts and medical diagnostic knowledge: Theory and applications for clinical reasoning instruction and research. Acad Med 2000; 75: 182–190
  • Euliano TY. Small group teaching: Clinical correlation with a human patient simulator. Adv Physiol Educ 2001; 25: 36–43
  • Fitch MT. Using high-fidelity emergency simulation with large groups of preclinical medical students in a basic science course. Med Teach 2007; 29: 261–263
  • Gordon JA. The human patient simulator: Acceptance and efficacy as a teaching tool for students. The Medical Readiness Trainer Team. Acad Med 2000; 75: 522
  • Gordon JA, Brown DM, Armstrong EG. Can a simulated critical care encounter accelerate basic science learning among preclinical medical students?. Simulation Healthcare 2006; 1: 13–17
  • Gordon JA, Oriol NE, Cooper JB. Bringing good teaching cases “to life”: A simulator-based medical education service. Acad Med 2004; 79: 23–27
  • Hanley JA, Negassa A, Edwardes MD, Forrester JE. Statistical analysis of correlated data using generalized estimating equations: An orientation. Am J Epidemiol 2003; 157: 364–375
  • Harden RM. The integration ladder: A tool for curriculum planning and evaluation. Med Educ 2000; 34: 551–557
  • Koniaris LG, Kaufman D, Zimmers TA, Wang N, Spitalnik PF, Henson L, Miller-Graziano C, Sitzmann JV. Two third-year medical student-level laboratory shock exercises without large animals. Surg Infect (Larchmt) 2004; 5: 343–348
  • MacDowall J. The assessment and treatment of the acutely ill patient – The role of the patient simulator as a teaching tool in the undergraduate programme. Med Teach 2006; 28: 326–329
  • McMahon GT, Monaghan C, Falchuk K, Gordon JA, Alexander EK. A simulator-based curriculum to promote comparative and reflective analysis in an internal medicine clerkship. Acad Med 2005; 80: 84–89
  • Morgan PJ, Cleave-Hogg D. A worldwide survey of the use of simulation in anesthesia. Can J Anaesth 2002; 49: 659–662
  • Morgan PJ, Cleave-Hogg D, DeSousa S, Lam-McCulloch J. Applying theory to practice in undergraduate education using high fidelity simulation. Med Teach 2006; 28: e10–e15
  • Pring R. Curriculum integration. The curriculum: Context, design and development, R Hooper (ed). Oliver and Boyd, Edinburgh 1970; 265–272
  • Tan GM, Ti LK, Suresh S, Ho BS, Lee TL. Teaching first-year medical students physiology: Does the human patient simulator allow for more effective teaching?. Singapore Med J 2002; 43: 238–242
  • Vozenilek J, Wang E, Kharasch M, Anderson B, Kalaria A. Simulation-based morbidity and mortality conference: New technologies augmenting traditional case-based presentations. Acad Emerg Med 2006; 13: 48–53
  • Weller JM. Simulation in undergraduate medical education: Bridging the gap between theory and practice. Med Educ 2004; 38: 32–38
  • Whitehead AN. The aims of education. The Free Press, New York 1929
