
Pre-training evaluation and feedback improve medical students’ skills in basic life support

Pages e549-e555 | Published online: 26 Sep 2011

Abstract

Background: Evaluation and feedback are two factors that can influence simulation-based medical education, and the timing of their delivery contributes to their differing effects.

Aim: To investigate the impact of pre-training evaluation and feedback on medical students’ performance in basic life support (BLS).

Methods: Forty third-year undergraduate medical students were randomly divided into two groups of 20: a control group (C group) and a pre-training evaluation and feedback group (E&F group). After a BLS theoretical lecture, the C group received 45 min of BLS training, whereas the E&F group was first evaluated individually (video-taped) in a mock cardiac arrest (pre-training evaluation). The E&F group then received 15 min of group feedback on the students’ BLS performance in the pre-training evaluation, followed by 30 min of BLS training. After BLS training, both groups were evaluated on one-rescuer BLS skills in a 3-min mock cardiac arrest scenario (post-training evaluation). The scores from the post-training evaluation were converted to percentages and compared between the two groups.

Results: The score from the post-training evaluation was higher in the E&F group (82.9 ± 3.2% vs. 63.9 ± 13.4% in the C group, p < 0.01).

Conclusions: In undergraduate medical students without previous BLS training, pre-training evaluation and feedback improve their performance in the subsequent BLS training.

Introduction

Basic life support (BLS) is a very important clinical skill for medical students. Because cardiac arrest is infrequent and unpredictable, and because of ethical considerations, it is not practical for every medical student to practice BLS skills on actual victims of cardiac arrest. Simulation-based BLS training provides a solution for effective training in these rare events.

Evaluation and feedback are two of the factors that can influence the quality of simulation-based medical education (Issenberg et al. Citation2005). The time at which evaluation and feedback are given may contribute to their different effects on learning. Usually, feedback is delivered during BLS training (concurrent feedback), and trainees are then evaluated after the BLS training is finished (post-training evaluation). Concurrent feedback has been demonstrated to promote clinical skill acquisition (Ende Citation1983; Schmidt et al. Citation1999). However, concurrent feedback during BLS training may impose a high extraneous cognitive load on trainees and contribute to their dependence on it, which may impair the quality of the feedback (Walsh et al. Citation2009). Evaluation is not only a tool to assess technical skill competency; its results can also serve as a source from which feedback is derived. However, because the training has already ended, few of the results of a post-training evaluation can be delivered as feedback to the current trainees or help them improve their acquired skills unless further training is provided. We supposed that evaluation and feedback conducted before BLS training (pre-training evaluation and feedback) might have a beneficial effect on medical students’ BLS skill acquisition. This study investigated the impact of pre-training evaluation and feedback on medical students’ performance in BLS.

Materials and methods

This prospective, randomized, single-blind, controlled cohort study was approved by the Institutional Review Board of the West China Hospital of Sichuan University. Written informed consent was obtained from all participants. A summary of the study profile is shown in Figure 1.

Figure 1. A summary of study profile. C group, the control group; E&F group, pre-training evaluation and feedback group. BLS, basic life support.


Forty third-year undergraduate medical students without previous BLS training experience were randomly (sealed-envelope method) divided into two groups of 20: the control group (C group) and the pre-training evaluation and feedback group (E&F group).

Following a 45-min BLS theoretical lecture, the C group received 45 min of BLS training, whereas the E&F group was evaluated one by one (video-taped) in a mock cardiac arrest prior to BLS training (pre-training evaluation). In the C group, individualized concurrent feedback (both verbal and demonstration) was given during BLS training. In the E&F group, 15 min of group feedback (both verbal and the videos recorded in the pre-training evaluation) was delivered by the BLS instructors, followed by 30 min of BLS training. In the E&F group, the two instructors only organized the BLS training following the guidance of the American Heart Association (AHA) certified BLS CD-ROM for Students (2006) and did not give any individualized concurrent feedback during training. After BLS training, both groups completed a 12-item multiple-choice written examination (Appendix 1) and were evaluated one by one on one-rescuer BLS skills in a 3-min mock cardiac arrest scenario (video-taped) by one blinded reviewer (post-training evaluation). In the mock cardiac arrest scenario, the student was asked to perform one-rescuer BLS on a patient who had suddenly collapsed (Resusci® Anne Basic, Laerdal Medical Corporation, Stavanger, Norway) in a ward.

The BLS theoretical lecture introduced the mechanisms of BLS and demonstrated its skills using a slide presentation and videos that followed the 2005 AHA guidelines (ECC Committee, Subcommittees and Task Forces of the American Heart Association 2005). BLS training was conducted following the AHA certified BLS CD-ROM for Students (2006), and 10 manikins (Resusci® Anne Basic and SkillGuide™, Laerdal Medical Corporation, Stavanger, Norway) were used for BLS training in both groups. One Resusci® Anne Basic with SkillGuide™ was used for BLS evaluation in both groups. Two video cameras (Sony, Japan) were used to record the details of each trainee's BLS performance during skill evaluation. Four teachers, including one lecturer, two BLS training instructors, and one blinded reviewer, were AHA certified and played the same roles in both groups.

After all videos (from both the pre-training and post-training evaluations) had been collected, they were reviewed in random order and the BLS skill score was evaluated by the blinded reviewer using a 23-item checklist. The checklist was designed in cooperation with the two BLS training instructors to match the curriculum of the course and the expected skill level. Each item performed correctly was scored as 1, for a maximum of 23 points; poor performance or failure to perform an item was scored as 0. The checklist for BLS skill evaluation is shown in Appendix 2. The accuracy (yes or no) of five discrete BLS skills was also evaluated: effective mask ventilation, correct chest compression site, acceptable chest compression rate (90–110 compressions per minute), correct chest compression depth (1.5–2 inches), and limited cardiopulmonary resuscitation (CPR)-free intervals (<15 s). Before the study, the blinded reviewer was trained to evaluate BLS skills from recorded video (not used for BLS training) according to the checklist described above.
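
To make the scoring arithmetic concrete, the sketch below shows, in Python, how one recorded performance could be converted to a percentage score and judged against the five discrete-skill thresholds given above. It is only an illustration under assumed data structures and function names; in the study itself the videos were scored manually by the blinded reviewer, not by software.

```python
# Illustrative sketch only. In the study, performances were scored manually from
# video by a blinded reviewer; this code merely demonstrates the arithmetic of
# converting a 23-item checklist to a percentage and judging the five discrete
# skills against the thresholds stated in the text. All names are hypothetical.

def checklist_percentage(item_scores):
    """item_scores: list of 0/1 values, one per checklist item (23 items)."""
    return 100.0 * sum(item_scores) / len(item_scores)

def discrete_skill_accuracy(performance):
    """performance: dict of measured values for one trainee (hypothetical keys)."""
    return {
        "effective_mask_ventilation": performance["effective_ventilation"],
        "correct_compression_site": performance["correct_site"],
        "acceptable_compression_rate": 90 <= performance["compression_rate"] <= 110,
        "correct_compression_depth": 1.5 <= performance["compression_depth_in"] <= 2.0,
        "limited_cpr_free_interval": performance["max_cpr_free_interval_s"] < 15,
    }

# Example with made-up values
scores = [1] * 19 + [0] * 4                    # 19 of 23 items performed correctly
print(round(checklist_percentage(scores), 1))  # -> 82.6

example = {
    "effective_ventilation": True,
    "correct_site": True,
    "compression_rate": 104,         # compressions per minute
    "compression_depth_in": 1.8,     # inches
    "max_cpr_free_interval_s": 12,   # seconds
}
print(discrete_skill_accuracy(example))
```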

Primary outcome measure and statistical analysis

The primary outcome was the improvement in the BLS skill score in the post-training evaluation. A sample size of 30 was determined to detect a 15% improvement in the overall BLS skill score in the post-training evaluation with a type I error of 0.05 and a power of 0.9. The scores from the multiple-choice written examination and the BLS skill evaluation were converted to percentages, compared between the two groups using Student's t-test, and reported as mean ± SD. A chi-squared test was used to analyze the difference in the accuracy rates of the five discrete BLS skills between the two groups in the post-training evaluation. A p < 0.05 was considered statistically significant.
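
As a rough illustration of the reported analysis, the following Python sketch shows how the between-group comparisons (Student's t-test for percentage scores, chi-squared test for discrete-skill accuracy) and a sample-size check could be run with SciPy and statsmodels. The score arrays, the contingency counts, and the SD assumed in the power calculation are placeholders, not the study data; the paper does not report the inputs used for its own sample-size calculation.

```python
# Illustrative sketch, not the authors' analysis code. The arrays below are
# hypothetical placeholders for per-student percentage scores.
import numpy as np
from scipy import stats
from statsmodels.stats.power import TTestIndPower

# Post-training BLS skill scores (percent), one value per student (made up)
c_group  = np.array([62.0, 55.0, 70.0, 48.0, 74.0, 65.0, 60.0, 68.0, 52.0, 66.0])
ef_group = np.array([83.0, 80.0, 85.0, 79.0, 86.0, 84.0, 81.0, 88.0, 82.0, 80.0])

# Student's t-test for the primary outcome (reported as mean +/- SD)
t_stat, p_value = stats.ttest_ind(ef_group, c_group)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Chi-squared test for the accuracy (yes/no) of one discrete BLS skill,
# e.g. acceptable compression rate; rows = groups, columns = accurate / not
contingency = np.array([[18, 2],     # E&F group (hypothetical counts)
                        [10, 10]])   # C group (hypothetical counts)
chi2, p_chi, dof, expected = stats.chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p_chi:.4f}")

# Rough per-group sample-size check for a 15-point difference at alpha = 0.05
# and power = 0.9, under an assumed common SD of 13 (an assumption; the paper
# does not state the SD used in its calculation)
n_per_group = TTestIndPower().solve_power(effect_size=15 / 13, alpha=0.05, power=0.9)
print(f"n per group = {n_per_group:.1f}")
```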

Results

None of the 40 enrolled medical students had previous personal or professional experience of BLS, and all of them completed the study protocol. The demographic data are shown in Table 1. The BLS skill score in the pre-training evaluation in the E&F group was 40.2 ± 12.6% (95% CI, 22.2–60.0%). There was no significant difference in the multiple-choice examination scores between the two groups (79.8 ± 9.2% in the C group vs. 83.8 ± 6.3% in the E&F group, p = 0.12). A higher score was observed in the E&F group (82.9 ± 3.2%, 95% CI 80.0–88.9%) than in the C group (63.9 ± 13.4%, 95% CI 41.1–88.9%) in the post-training evaluation, p < 0.01. Except for correct compression site, higher accuracy rates were observed in the E&F group for the other four discrete BLS skills, p < 0.05 ().

Table 1.  Demographic data.

Table 2.  Scores from pre-training evaluation in E&F group, and the multiple-choice examination and post-training evaluation in two groups.

Discussion

Feedback and evaluation are two important characteristics of simulation-based medical education, and they have variable influences on learning through their format, content, and programming. In this study, although individualized concurrent feedback was delivered to trainees during training in the C group, a better outcome was observed in the E&F group, in which group feedback was given prior to BLS training. We presume that multiple factors contributed to this result.

The timing of feedback and evaluation is thought to have contributed to the different outcomes in the two groups. The time at which feedback is delivered has been shown to have different effects on motor learning (Xeroulis et al. Citation2007). Concurrent feedback is feedback that people receive while performing a skill. It has been demonstrated that concurrent feedback can promote clinical skill acquisition (Ende Citation1983; Schmidt et al. Citation1999). However, according to cognitive load theory, humans have a limited attentional capacity and can focus on only a limited amount of information simultaneously (Sweller et al. Citation1998; Schmidt et al. Citation1999; Xeroulis et al. Citation2007). Concurrent feedback may impose a high extraneous cognitive load on both the trainee and the instructor. For the trainee, the increased cognitive processing required to perform a task while receiving, interpreting, and responding to concurrent feedback may distract attention from learning the moves required to solve the problem at hand (Sweller et al. Citation1998). Giving concurrent feedback may also place excessive cognitive demands on instructors, since they must observe the trainee's performance while simultaneously providing feedback (Sweller et al. Citation1998). This extra workload may disrupt instructors’ concentration and impair feedback quality. Terminal feedback is feedback given after the skill has been performed. In this study, although feedback was given prior to BLS training (pre-training feedback) in the E&F group, it was per se a kind of terminal feedback, as it was given after the pre-training evaluation was completed (Ende Citation1983; Schmidt et al. Citation1999). Terminal feedback is considered superior in promoting learning during simulation-based education (Xeroulis et al. Citation2007). A benefit of pre-training feedback is that errors during the pre-training evaluation are allowed to progress, so both instructors and trainees can learn from those mistakes. Pre-training feedback may highlight the key points of BLS for trainees and guide the focus and attention of both instructors and trainees in the coming training. Consequently, the quality and efficacy of training are improved. Thus, pre-training feedback served as a more effective learning tool for BLS training in the E&F group than did the concurrent feedback given in the C group.

In addition, the better results in the E&F group might be partially attributable to the pre-training evaluation. It is a general assumption that “assessment drives learning” through the so-called testing effect, a robust and independent phenomenon that has been demonstrated across a variety of test formats and levels of knowledge (Brazeau et al. Citation2002; McLachlan Citation2006; Roediger & Karpicke Citation2006; Schoonheim-Klein et al. Citation2006; Walsh et al. Citation2009; van der Vleuten Citation1996). Evaluation is not only a tool to assess technical skill competency; it also provides an opportunity (feedback) for instructors to learn about the flaws of their own instruction and the students’ shortcomings in training. This feedback can help instructors focus more on helping trainees with difficult points and improve the quality of their instruction in the future. However, the traditional programming of a BLS course (evaluation is usually arranged after training is finished) limits the beneficial effects of feedback derived from the post-training evaluation on the current trainees. Little of that feedback can be delivered to the current trainees, since the training has already ended, and trainees’ acquired skills are unlikely to improve without repetitive practice unless further training is provided. In contrast to the traditional post-training evaluation in the C group, the pre-training evaluation in the E&F group was conducted before BLS training. Pre-training evaluation is an effective way not only to identify trainees’ baseline skill level, but also to reveal their mistakes, which can usefully be delivered as pre-training feedback (terminal feedback per se) from which both instructors and trainees can learn. Compared with the limited effect that feedback from the post-training evaluation could have on trainees’ acquired skills in the C group without further practice, the feedback originating from the pre-training evaluation helped both instructors and trainees concentrate on the key points of BLS during the subsequent training in the E&F group. As a result, the quality of BLS training was improved and better performance was observed in the E&F group.

Furthermore, the recipients of the feedback could be another contributor to the different results in the two groups. In the C group, the recipient of the individualized concurrent feedback was each individual trainee. Individualized concurrent feedback during BLS training can offer each trainee instructive information specific to his or her performance and is beneficial for improving performance individually. However, unnecessary time might be spent giving similar feedback to different individuals, since different learners may make the same mistake, and more interruptions might occur during BLS training. Given the time limits of the BLS training course, the quantity, quality, and efficacy of individualized feedback could consequently be impaired. In the E&F group, the recipient of the feedback was the whole group. The group feedback derived from the pre-training evaluation highlighted the key points of BLS for trainees and minimized the unnecessary time spent correcting similar individual mistakes during BLS training. Therefore, fewer interruptions occurred during BLS training in the E&F group, and both instructors and trainees could focus more on training and practice. Moreover, group feedback also gave the trainees in the E&F group an opportunity to learn from each other, and even competition might arise among students, which might stimulate their enthusiasm to learn and improve the quality and efficacy of training. These factors might explain why less training time was spent in the E&F group yet a better outcome was observed compared with the C group. This result suggests that somewhat more time might appropriately be spent on improving the quality and efficacy of evaluation and feedback rather than on training itself (Issenberg et al. Citation1999; Walsh et al. Citation2009).

Feedback and evaluation play important roles in simulation-based medical education, and they affect learning through their format, content, and programming. To maximize learning, feedback and evaluation should be appropriately integrated into simulation-based training; how best to integrate them should be investigated in future work.

Several design limitations of this study should be noted. The better outcome in the E&F group could also be partially attributed to the students becoming familiar, through the pre-training evaluation, with how the evaluation would be conducted; this preparation and familiarity may have helped them adapt to the post-training evaluation protocol. We could have exposed the control group to the pre-training evaluation protocol without group feedback in order to investigate only the impact of pre-training group feedback on students’ BLS skill acquisition. However, pre-training group feedback cannot be given without an evaluation, so our study investigated the combined impact of these two features of simulation-based education, evaluation and feedback. Another limitation is that the two instructors could not be blinded, because the duration of BLS training differed between the two groups; therefore, personal bias could not be avoided. The time spent on the pre-training evaluation is another potential source of bias when interpreting the better outcome in the E&F group. In the E&F group, higher accuracy rates were not observed for all five discrete BLS skills (correct compression site was comparable between the two groups). This result shows that, although the 23-item checklist was a good tool for assessing trainees’ proficiency in the whole BLS algorithm, it does not always consistently reflect competency in each discrete BLS skill. That is why we added the comparison of accuracy rates for the five discrete BLS skills emphasized by the AHA, because those discrete skills can influence the outcome of BLS. Furthermore, the validity and reliability of the checklist were not tested in this study and should be established in the future.

Conclusions

In undergraduate medical students without previous BLS training, pre-training evaluation and feedback improve their performance in the BLS training that immediately follows. A future study with a longer follow-up period is needed to estimate the effect on long-term retention of the learning outcomes.

Acknowledgments

This study was funded by a Medical Education Research Grant (grant number 2010-15-06) from the Medical Education Committee of the Chinese Medical Association and the Medical Education Association of High Education Society of China in 2010. The authors thank the medical students who participated in the study and the Clinical Skills Training Center, West China Hospital of Sichuan University, for its support. The authors gratefully acknowledge the assistance of Dr Douglas Hester of Vanderbilt Medical Center, TN, USA, in critically reviewing this article.

Declaration of interest: The authors report no commercial or non-commercial affiliations that are or may be perceived to be a conflict of interest with this work. The authors alone are responsible for the content and writing of this article.

References

  • Brazeau C, Boyd L, Crosson J. Changing an existing OSCE to a teaching tool: The making of a teaching OSCE. Acad Med 2002; 77: 932–936
  • ECC Committee, Subcommittees and Task Forces of the American Heart Association. 2005 American Heart Association Guidelines for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care. Circulation 2005; 112(24 Suppl)IV1–203
  • Ende J. Feedback in clinical medical education. JAMA 1983; 250: 777–781
  • Issenberg SB, McGaghie WC, Hart IR, Mayer JW, Felner JM, Petrusa ER, Waugh RA, Brown DD, Safford RR, Gessner IH, et al. Simulation technology for health care professional skills training and assessment. JAMA 1999; 282: 861–866
  • Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005; 27: 10–28
  • McLachlan JC. The relationship between assessment and learning. Med Educ 2006; 40: 716–717
  • Roediger HL, Karpicke JD. The power of testing memory: Basic research and implications for educational practice. Perspect Psychol Sci 2006; 1: 181–276
  • Schmidt RA, Lee TD, Editors. Motor control and learning: A behavioural emphasis, 3rd ed. Human Kinetics, Champaign, IL 1999
  • Schoonheim-Klein ME, Habets L, Aartman IHA, van der Vleuten CP, Hoogstraten J, van der Velden U. Implementing an objective structured clinical examination (OSCE) in dental education: Effects on students’ learning strategies. Eur J Dent Educ 2006; 10: 226–235
  • Sweller J, van Merriënboer JJG, Paas F. Cognitive architecture and instructional design. Educ Psychol Rev 1998; 10: 251–296
  • van der Vleuten CPM. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract 1996; 1: 41–67
  • Walsh CM, Ling SC, Wang CS, Carnahan H. Concurrent versus terminal feedback: It may be better to wait. Acad Med 2009; 84(10 Suppl)S54–S57
  • Xeroulis GJ, Park J, Moulton CA, Reznick RK, Leblanc V, Dubrowski A. Teaching suturing and knot-tying skills to medical students: A randomized controlled study comparing computer-based video instruction and (concurrent and summary) expert feedback. Surgery 2007; 141: 442–449

Appendix 1

Questionnaire of BLS

Test of BLS

  • 1 A 45-year-old man fell unconscious while exercising in a fitness center. Where should you check his pulse to determine whether he has had a cardiac arrest?

  • A To check the pulse of the radial artery.

  • B To check the pulse of the brachial artery.

  • C To check the pulse of the carotid artery.

  • D To check the heartbeat.

  • 2 According to the 2005 AHA guideline of BLS, please describe the right sequence of BLS________________________________.

  • A Call for help.

  • B Press the nasolabial fold.

  • C Wait for the arrival of the rescuers.

  • D Perform the chest compression at once.

  • E Perform the mouth-to-mouth breath at once.

  • F Check the pulse.

  • G Check the breath.

  • H Check consciousness.

  • 3 What is the correct frequency of the chest compression?

  • A Faster and better.

  • B 70–120 bpm for the adult.

  • C 100 bpm both for adult and children.

  • D 100 bpm for adult and 150 bpm for children.

  • 4 The best frequency of artificial ventilation is ().

  • A Faster and better.

  • B Three ventilations per minute for adult.

  • C Two artificial ventilations first both for adult and children, and then followed by other treatment.

  • D 10–15 ventilations per minute for adult and 20 ventilations per minute for children.

  • 5 How to check the patient's breathing?

  • A Put your finger close to the patient's nostril and feel the air-flow of breathing.

  • B Watch the movement of the chest wall.

  • C Put your ear close to the patient's nose, watch the chest movement, listen to breathing sound, feel the air flow of breathing.

  • D Head tilt and chin lift to open the airway.

  • 6 How to check the patient's consciousness?

  • A To tap the patient's face.

  • B To tap the patient's forehead.

  • C To tap the patient's chest.

  • D To tap the patient's shoulder.

  • 7 How long should breath check last?________

  • 8 How long should pulse check last?________

  • 9 How to open the airway for a patient with suspected neck injury?

  • A Head tilt and chin lift.

  • B To perform a jaw thrust and keep his neck still; if that is impossible, to perform head tilt and chin lift.

  • C To use your finger to clean foreign body in his mouth and pull his tongue out of his mouth.

  • D Since neck injury was suspected, leave him alone.

  • 10 What's the correct ratio of chest compression to artificial ventilation during BLS? ______:_____

  • 11 Where should you put your hands on to perform the chest compression?________

  • A The upper half of the sternum

  • B The left chest, nipple level and above the heart

  • C The bottom of the sternum, above the xiphoid

  • D The cross-site between the sternum and the nipple line

  • 12 Which of the following descriptions are correct for BLS?

  • A A hard plate should be placed under the patient's body

  • B A frequency of 100 bpm for chest compression

  • C Hard push

  • D Make sure the chest wall is allowed to recoil fully

  • E Make sure your palm is away from the patient's chest during ventilation

  • Note: bpm, beats per minute.

Appendix 2

Item checklist for basic life support skill evaluation
