
Impact of tailored feedback in assessment of communication skills for medical students

Article: 28453 | Received 06 May 2015, Accepted 02 Jun 2015, Published online: 06 Jul 2015

Abstract

Background

Identifying effective ways to teach and assess communication skills remains a challenging part of medical education. This study explores the usefulness and effectiveness of providing additional feedback, based on qualitative analysis, in the assessment of communication skills during undergraduate medical training. We also examined the potential of qualitative analysis for developing tailored strategies to improve communication skills training.

Methods

This study was carried out on final-year medical students (n=87) undergoing their clinical performance examination on communication skills with standardized patients; their performances were video-recorded and transcribed. Video-recordings of 26 students were randomly selected for qualitative analysis, and additional feedback was provided to these students. We assessed the level of acceptance of communication skills scores between the study and non-study groups, and within the study group before and after receiving feedback based on qualitative analysis.

Results

There was a statistically significant increase in the level of acceptance of feedback after additional feedback based on qualitative analysis was delivered: the percentage of students agreeing with their feedback increased from 15.4% to 80.8% (p<0.001).

Conclusions

Incorporating feedback based on qualitative analysis into communication skills assessment gives medical students essential information for learning and self-reflection, which could potentially lead to improved communication skills. As our study shows, feedback becomes more meaningful and effective when supplemented with qualitative analysis.

Good communication skills are central to a doctor's delivery of high-quality medical care and have been shown to affect patient satisfaction and a variety of other biological, psychological, and social outcomes of patients (1–5).

Communication skills training has become an integral part of the undergraduate medical curriculum, where medical students are taught how to optimize the clinical effectiveness of communication skills for future clinical practice (6, 7). Therefore, communication skills must be assessed rigorously and objectively. Standardized patient (SP) encounters are a widely employed method for assessing communication and clinical skills (8). However, the assessment of communication skills can be subjective and has inherent limitations (9). Furthermore, effective communication skills and their mechanisms are difficult to investigate and measure (10).

Although there is ample evidence that incorporating communication skills training into the medical curriculum is effective, little is known about effective strategies for improving such training (11, 12). Among educators, there is variation in the ability to provide detailed and constructive feedback on students’ communication skills (13, 14).

South Korea was the first country in Asia to introduce a clinical skills examination as part of its medical licensing examination, where communication skills assessment was integrated into the Clinical Performance Examination (CPX) using SPs (15). However, during the initial introductory phase of the CPX, students widely criticized the lack of adequate communication skills training prior to the CPX and frequently disagreed with the scores given by SPs (16).

To enhance communication skills training in our institution, various feedback resources were offered to students after the CPX, aimed at fostering deliberate, practice-based learning. These included comments from SPs and the opportunity to self-review video-clips of their own performance (5, 17–21). However, despite these measures, students had difficulty identifying their strengths and limitations in communication skills. Under the existing setup, medical students received insufficient guidance and feedback on their communication skills, as has previously been reported at other institutions (22, 23).

The aim of this study is to assess the role of an additional feedback tool based on qualitative analysis in communication skills assessment, providing a more detailed analysis of an individual student's verbal and nonverbal communication skills. Furthermore, we aim to develop more effective communication skills training by using qualitative analysis to study the behaviors students frequently exhibit during simulated consultations.

Methods

Clinical performance examination and assessment by standardized patient

In 2010, the final-year students in our medical school underwent the CPX, consisting of six case scenarios, under the auspices of the Seoul-Gyeonggi CPX consortium.

Each student was given 12 min to assess one SP. After each encounter, the SP rated the student's performance using a checklist developed by the consortium. The checklist encompassed three areas: history taking, physical examination, and communication skills. The criteria for assessing communication skills consisted of the following seven sub-items: 1) the doctor made me feel comfortable/friendly; 2) the doctor listened to me sufficiently; 3) the doctor showed interest in my quality of life in addition to the health condition; 4) the doctor maintained a comfortable environment throughout the consultation; 5) the doctor treated me with respect and good manners; 6) the doctor's explanation was easy to understand; and 7) I felt confident with the doctor. The SP scored the above seven sub-items with a Likert scale of 0 to 5 points. Once this was completed, students were informed of their scores without any further reflective sessions.
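
To make the checklist concrete, the sketch below records the seven sub-item ratings for one encounter and averages them. This is a minimal illustration: the averaging is an assumed aggregation, as the paper does not state how sub-item scores were combined into a final communication score.

```python
# Hypothetical representation of the seven-item SP checklist (ratings 0-5).
# The averaging below is an assumed aggregation for illustration only.
SUB_ITEMS = [
    "made me feel comfortable/friendly",
    "listened to me sufficiently",
    "showed interest in my quality of life",
    "maintained a comfortable environment",
    "treated me with respect and good manners",
    "explanation was easy to understand",
    "I felt confident with the doctor",
]

def score_encounter(ratings: list[int]) -> float:
    """Return the mean communication score for one SP encounter."""
    assert len(ratings) == len(SUB_ITEMS), "one rating per sub-item"
    assert all(0 <= r <= 5 for r in ratings), "ratings are on a 0-5 scale"
    return sum(ratings) / len(ratings)

print(round(score_encounter([4, 5, 3, 4, 5, 4, 4]), 2))  # 4.14
```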

Assessments were all video-recorded as part of our regular assessment, and informed consent was obtained prior to the assessment.

Qualitative approach to understand students’ behavior in assessment

We built a team consisting of two linguists and two medical educators. First, the team established the qualities required for good communication skills based on the existing sub-items. We randomly selected 30 of the 87 video-clips for assessment; 4 were excluded for technical reasons, leaving 26 for analysis.

The video-recordings were transcribed and analyzed under two main domains: verbal and nonverbal communication. We analyzed the verbal elements of communication using the transcripts, while the nonverbal elements were analyzed using the video-clips.

Once the linguists completed the assessment on communication skills, medical educators discussed the results with them in a case conference. A detailed feedback form was completed for each student highlighting the positive and negative points. In addition, we looked into the patterns of behaviors in students to fully evaluate their communication skills.

Feedback to students

Two months after the examination, we delivered our routine feedback to all students (n=87), which included the scores and SP comments. All students had the opportunity to review the video-clips of their performance. We then provided additional feedback based on qualitative analysis to the randomly selected students in the study group (n=26). At the end of the session, we assessed students' level of agreement with their communication skills scores in both groups. Students also provided feedback on the effectiveness and usefulness of the additional feedback.

Content analysis/thematic analysis

The contents of the personal feedback forms provided to the study group were divided into the categories of the assessment tool according to the intentions observed by the researchers. The researchers then coded the categorized contents into the specific skills students used to convey meaning. The coded skills were tallied by frequency of occurrence and further classified as positive or negative points based on the transcript and overall context.
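
As a rough illustration of this tallying step, the sketch below counts how often each coded skill occurred and how often it was judged positive or negative. The skill labels and valences are invented placeholders, not the study's actual coding scheme.

```python
# Illustrative tally of coded skills; labels and valences are placeholders only.
from collections import Counter

coded_skills = [
    ("eye contact", "positive"),
    ("paraphrasing", "negative"),
    ("eye contact", "positive"),
    ("pause", "negative"),
]

frequency = Counter(skill for skill, _ in coded_skills)
valence = Counter(coded_skills)

for skill, count in frequency.most_common():
    positives = valence[(skill, "positive")]
    negatives = valence[(skill, "negative")]
    print(f"{skill}: {count} occurrences ({positives} positive, {negatives} negative)")
```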

Assessing the level of acceptance of scores after additional feedback based on qualitative analysis

The level of acceptance of scores was obtained using a five-point Likert scale ranging from 1 (totally disagree) to 5 (totally agree). We further collapsed the five-point scale into three categories: disagree, neutral, and agree.

First, we analyzed the differences in the level of acceptance of scores between the control and study groups after the initial routine feedback session, without any additional feedback. We then determined the differences in the level of acceptance of scores between the control and study groups after the study group received additional feedback based on qualitative analysis. Differences in the level of acceptance were analyzed with the χ² test using SPSS Version 14.
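
For illustration only, the sketch below re-creates this analysis step in Python (with scipy standing in for SPSS): it collapses five-point ratings into the three categories and runs a χ² test on a contingency table. The collapse cut-offs and the cell counts are assumptions for demonstration; only the 15.4% and 80.8% agreement figures reported in the Results are taken from the study.

```python
# Illustrative re-creation of the analysis, using scipy in place of SPSS.
# The cut-offs and cell counts below are assumptions for demonstration only.
from scipy.stats import chi2_contingency

def collapse_likert(score: int) -> str:
    """Collapse a 1-5 acceptance rating into disagree / neutral / agree (assumed cut-offs)."""
    if score <= 2:
        return "disagree"
    if score == 3:
        return "neutral"
    return "agree"

print([collapse_likert(r) for r in [1, 3, 5]])  # ['disagree', 'neutral', 'agree']

# Hypothetical counts for the study group (n=26): rows = before/after the
# additional feedback; columns = disagree / neutral / agree.
# 4/26 ≈ 15.4% and 21/26 ≈ 80.8% agreement match the reported percentages.
observed = [
    [15, 7, 4],   # before additional feedback
    [0, 5, 21],   # after additional feedback
]

chi2, p_value, dof, _expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4g}")
```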

Ethics approval was obtained from the Institutional Review Board (IRB), College of Medicine at Chung Ang University.

Results

The level of acceptance of scores between the control and study groups showed no significant difference after the initial routine feedback session. However, there was a significant increase in the level of acceptance of scores in the study group after receiving additional feedback, rising from 15.4% to 80.8% (see Table 1).

Table 1 Change in acceptance of feedback by adding information based on qualitative analysis

Students’ response after routine feedback

After our routine feedback following the CPX, students were generally dissatisfied with the feedback provided: 45 of the 87 students disagreed with their communication skills scores. Disagreement was largely due to the lack of an appropriate explanation for their scores. Although we trained SPs to provide detailed comments to every student, time constraints during the examination did not always allow detailed feedback. As a result, some students received no feedback on their strengths and weaknesses, a concern that was raised afterwards.

Students’ response after additional feedback using qualitative analysis

Students from the study group who received additional feedback with detailed analysis of their verbal and nonverbal communication were overall satisfied with the intervention. They commented that it was easier to identify the specific areas of strengths and weaknesses in communication skills. Students were able to replay the video-clips with guided commentaries on their performance. Nobody in the study group disagreed with their communication skills scores after intervention.

Benefits for students with good scores

We found that the additional feedback was also helpful for students with good scores. These students had received high communication skills scores in the past but could not identify exactly how and why they had performed well. The additional feedback provided insight into the qualities that made their communication effective and enabled them to build on and maintain their strengths in communication skills.

Frequently observed behaviors

Students showed similar behaviors which were frequently observed (see Table 2):

  1. ‘Smiling and eye contact’: Smiles and good eye contact softened the mood of the consultation and made SPs feel more comfortable. However, in some instances, students smiled while looking elsewhere (e.g., at the desk in front of them).

  2. ‘Was it easy to find the hospital?’: When this comment was used at the beginning of the consultation, it made the SP comfortable and set the scene for the examination. It was perceived as a courteous and caring gesture by the student towards the SP. However, in some cases, this comment was made in the middle of the consultation.

  3. ‘How does it affect your daily activities?’: Several students successfully asked the effect of disease on the daily activities of the SP. This made the SP feel that the student was not just interested in the disease or health condition, but also in SP's quality of life. However, some students asked this question out of context, making SPs feel that the student was intruding on their private matters.

  4. ‘It must be very difficult for you’: This expression was frequently used as a response to SP's clinical history and difficulties. When such a comment was used appropriately, SP felt that the student was empathic to their symptoms. However, in Korean language, this sentence could be misinterpreted as ‘you appear to be physically suffering’. Therefore, when this comment was made abruptly out of context, the SP felt patronized, especially when it was used inappropriately (e.g., one SP thought the student was commenting on her physical appearance).

  5. ‘Everything will be fine’: This expression was commonly used when students attempted to reassure SPs showing signs of anxiety. There were instances where SPs felt reassured by this comment, but in some cases, students’ verbal and facial expressions did not correspond.

  6. ‘Paraphrasing and summarizing’: Several students paraphrased and summarized, which gave them a good opportunity to check their understanding and confirm the history with SPs. However, when this was done too frequently or inappropriately, SPs felt that the students had not been paying attention and were therefore asking them to repeat themselves.

  7. ‘Respectful posture’ (sitting upright): During consultation, most students sat upright. Maintaining such posture throughout consultation is perceived as being respectful in Korean culture. We advise our students to avoid certain disrespectful behaviors such as crossing their legs, playing with their pens, and not keeping still. However, some students appeared too rigid and the SPs felt uncomfortable during consultation.

  8. ‘Pause’: A period of silence during the consultation gave the SP appropriate time to speak and recuperate. However, prolonged silence made the SP feel uncomfortable and even anxious, wondering whether they had said something wrong.

Table 2 Frequently observed behaviors of students during interview and corresponding feedback

Discussion

This study highlighted the need to improve how feedback on the communication skills examination is delivered, for both students and educators, in the following areas.

Enhance self-reflection through practice-based learning

Improving communication skills in terms of ‘practice-based learning’ involves multi-dimensional qualities: students should have personal insight into their performance (11), they should be able to learn from their encounters (24), and should be able to make changes in behavior based on self-reflection (25). Therefore, the ability to reflect on their behavior is essential in improving their communication skills. However, it is questionable whether students’ self-assessment would be reliable or accurate (26–28). The importance of providing additional feedback or educational intervention to improve self-assessment and reflection has been previously highlighted (26, 29).

Srinivasan et al. (26) studied students’ self-assessment of their own CPX video-clips and reported significant improvement when additional information was provided. Martin et al. (29) argued that doctors’ self-evaluation skills improved when they watched and compared video-clips of their peers performing at different levels. Similarly, we observed that when students compared each other's video-clips, their understanding of effective communication skills improved, but only to some extent. For students lacking the ability to self-reflect, this method was insufficient to provide adequate feedback or to motivate improvement in communication skills. More concrete accounts addressing their own verbal or nonverbal cues were required.

Therefore, providing detailed feedback on students’ own behavior with straightforward references (such as time stamps in the video-clips or line/page numbers in the transcripts) could help them understand their strengths and weaknesses more effectively. For example, one student described, ‘Previously, my results were just numbers so I had to guess my weaknesses. But with the detailed feedback, I can understand what my weaknesses are because they were clearly explained’ (Student 1).

Understanding the overall ‘context’

We observed that students came to understand the overall ‘context’ of doctor–patient communication as they received additional feedback and self-reflected. From analyzing the frequently observed behaviors and student feedback, we identified the following elements, which resulted in poor communication despite students’ good intentions: 1) nonverbal language, 2) eye contact, and 3) timing.

When nonverbal language did not correspond to verbal language, it conveyed a completely different social meaning to patients. ‘I thought I listened well, maintaining good manners and was courteous. However, from the feedback, I realized that my facial expression was different to what I said and realized that it could give a completely different impression’ (Student 2).

Another common realization from the students was the importance of eye contact. Reassuring words without appropriate eye contact could leave the patient feeling that the student did not care much, but was just ‘saying’ it. ‘I thought telling the SP that “everything was all right” would be reassuring. However, I realized that I could give a wrong impression despite saying the right thing, which was not my intention’ (Student 3).

Timing is also an important cue in understanding the context of communication. Teaching manuals often highlight ‘key’ words or phrases that help create a comfortable consultation. However, there is a tendency for students to memorize these phrases mechanically and use them inappropriately, leading to a breakdown in patient rapport and communication. In our study, we commonly observed students using such ‘key’ phrases at the wrong moment in the consultation. For example, a student thought that he/she was being polite by saying, ‘Was it easy to find the hospital?’ However, this question was asked in the middle of the consultation rather than during the initial stage, and the SP felt confused and found it out of context.

The feedback based on qualitative analysis suggested that a natural smile, appropriate eye contact and posture, interim summarization, and active listening were important cues in making SPs feel more comfortable. Our students initially felt that they had adhered to these suggested behaviors. However, after receiving detailed feedback, they realized that their good intentions were not conveyed effectively: they smiled, but it was unnatural; they tried to maintain eye contact, but it was inappropriate; they remembered to summarize, but did so too frequently; and they tried to listen to the patients, but the gaps in the conversation were too long.

Implication to medical educators

Common mistakes demonstrated by students could be associated with the existing method of communication skills training, which emphasizes didactic teaching of the theoretical principles of communication. As a result, students memorized the list of good communication principles and attempted to apply all of them in the consultation, mostly out of context. Effective communication requires the ability to adapt, to be responsive, and to maintain self-awareness while talking and listening. Applying a random list of good communication cues did not work for some students, especially when they used the cues mechanically. Such undesirable patterns could reflect ineffective communication skills training, particularly in the Far East, where the importance of communication skills can often be underestimated.

Suggested areas of improvement

Despite our study demonstrating the usefulness of providing additional feedback using qualitative analysis, it would be unrealistic to adopt this method in every CPX. It is time consuming and expensive. However, it is potentially useful in identifying the strengths and weaknesses in communication skills in a small cohort of students and in investigating the potential causes.

We identified limitations in the current setup of providing routine feedback using SPs. SPs must be adequately trained and supported to provide an appropriate level of feedback. Our study demonstrated that students benefited from detailed feedback that was objective and consistent. For students who received no remarks because of time constraints on SPs, there was no educational benefit. Therefore, enough time should be allocated for SPs to give formative feedback to students. Additionally, there was a tendency for SPs to become subjective depending on students’ remarks (e.g., a student who complimented an SP by saying she looked young for her age received good scores). This could be resolved by providing further training for SPs using examples from our study. Currently, SPs alone provide the scores and feedback on communication skills. Therefore, another way to prevent subjectivity would be to obtain a second opinion, preferably from a clinician. However, recruiting and training clinicians to be involved in the CPX and to provide adequate feedback on communication skills remains a challenge.

Most Korean medical schools follow traditional curricula, in which there is limited patient contact during the 6 years of undergraduate training. Students have their first patient contact during the clerkship period in their final year, and theory-based teaching is most commonly adopted. However, good communication skills cannot be effectively taught in lecture theatres; students require exposure to more practical sessions in both simulated and clinical settings. Introducing patient contact at an earlier stage of undergraduate medical training would benefit students by providing sufficient opportunity to communicate with real patients. However, students should also have adequate prior training in a simulated environment with SPs (30).

Limitations

This study has inherent limitations. The small sample size and the lack of randomization and blinding could all potentially contribute to bias in the results. Our study was also conducted in an examination setting using SPs, which may not be representative of a real-life clinical setting. There is also the potential influence of having clinicians and communication specialists provide input to the study group, especially in a culture that values obedience and respect for educators.

Conclusions

Our study demonstrated that providing students with additional feedback based on qualitative analysis is beneficial for both students and educators. More students were able to agree with their scores and, most importantly, received an adequate tool to improve their communication skills through self-reflection. For educators, the qualitative analysis highlighted the common mistakes made by students in the communication skills assessment. This study also discussed the need for increased participation and more detailed feedback from SPs and medical educators.

Authors’ contributions

SU wrote the first draft and edited the final draft. GHL reviewed and analyzed data and wrote and edited the final draft. JKJ conducted data analysis. YIB designed the research and provided the theoretical frameworks for the qualitative analysis. YOJ conducted data analysis. CWK reviewed the final draft as the principal investigator.

Conflict of interest and funding

Seilin Uhm has been supported since 2011 by the Preterm Birth Programme, which presents independent research funded by the National Institute for Health Research (NIHR) under its Programme Grants for Applied Research funding scheme (RP-PG-0609-10107). The authors have no conflicts of interest.

Acknowledgements

The views expressed are those of the authors and not necessarily those of the NHS, the NIHR, or the Department of Health.

References

  • Simpson M , Buckman R , Stewart M , Maguire P , Lipkin M , Novack D , et al. Doctor–patient communication: the Toronto consensus statement. BMJ. 1991; 303: 1385–7.
  • World Federation for Medical Education. World Summit on Medical Education. Proceedings. Edinburgh, 8–12 August 1993. Med Educ. 1994; 28(Suppl 1): 1–171.
  • Kurtz S , Silverman J , Benson J , Draper J . Marrying content and process in clinical method teaching: enhancing the Calgary–Cambridge guides. Acad Med. 2003; 78: 802–9.
  • Ruiz-Moral R , Perez Rodriguez E , Perula de Torres LA , de la Torre J . Physician–patient communication: a study on the observed behaviours of specialty physicians and the ways their patients perceive them. Patient Educ Couns. 2006; 64: 242–8.
  • Zick A , Granieri M , Makoul G . First-year medical students’ assessment of their own communication skills: a video-based, open-ended approach. Patient Educ Couns. 2007; 68: 161–6.
  • Hauer KE , Boscardin C , Gesundheit N , Nevins A , Srinivasan M , Fernandez A . Impact of student ethnicity and patient-centredness on communication skills performance. Med Educ. 2010; 44: 653–61.
  • Lumma-Sellenthin A . Talking with patients and peers: medical students’ difficulties with learning communication skills. Med Teach. 2009; 31: 528–34.
  • Hauer KE , Hodgson CS , Kerr KM , Teherani A , Irby DM . A national study of medical student clinical skills assessment. Acad Med. 2005; 80(Suppl): S25–9.
  • Huntley CD , Salmon P , Fisher PL , Fletcher I , Young B . LUCAS: a theoretically informed instrument to assess clinical communication in objective structured clinical examinations. Med Educ. 2012; 46: 267–76.
  • Drew P , Chatwin J , Collins S . Conversation analysis: a method for research into interactions between patients and health-care professionals. Health Expect. 2001; 4: 58–70.
  • Noble LM , Kubacki A , Martin J , Lloyd M . The effect of professional skills training on patient-centredness and confidence in communicating with patients. Med Educ. 2007; 41: 432–40.
  • Martin J , Lloyd M , Singh S . Professional attitudes: can they be taught and assessed in medical education? Clin Med. 2002; 2: 217–23.
  • Chang A , Boscardin C , Chou CL , Loeser H , Hauer KE . Predicting failing performance on a standardized patient clinical performance examination: the importance of communication and professionalism skills deficits. Acad Med. 2009; 84(Suppl): S101–4.
  • Dudek NL , Marks MB , Regehr G . Failure to fail: the perspectives of clinical supervisors. Acad Med. 2005; 80(Suppl): S84–7.
  • Kim KS . Introduction and administration of the clinical skill test of the medical licensing examination, republic of Korea (2009). J Educ Eval Health Prof. 2010; 7: 4.
  • Huh S . Failed examinees’ legal challenge over the clinical skill test in the Korean Medical Licensing Examination. J Educ Eval Health Prof. 2010; 7: 5.
  • Ward M , MacRae H , Schlachta C , Mamazza J , Poulin E , Reznick R , et al. Resident self-assessment of operative performance. Am J Surg. 2003; 185: 521–4.
  • Epstein RM . Mindful practice. JAMA. 1999; 282: 833–9.
  • Hays RB , Jolly BC , Caldon LJ , McCrorie P , McAvoy PA , McManus IC , et al. Is insight important? Measuring capacity to change performance. Med Educ. 2002; 36: 965–71.
  • Gordon MJ . Self-assessment programs and their implications for health professions training. Acad Med. 1992; 67: 672–9.
  • Langendyk V . Not knowing that they do not know: self-assessment accuracy of third-year medical students. Med Educ. 2006; 40: 173–9.
  • Howley LD , Wilson WG . Direct observation of students during clerkship rotations: a multiyear descriptive study. Acad Med. 2004; 79: 276–80.
  • Colletti LM . Difficulty with negative feedback: face-to-face evaluation of junior medical student clinical performance results in grade inflation. J Surg Res. 2000; 90: 82–7.
  • Borrell-Carrio F , Epstein RM . Preventing errors in clinical practice: a call for self-awareness. Ann Fam Med. 2004; 2: 310–16.
  • Ziegelstein RC , Fiebach NH . ‘The mirror’ and ‘the village’: a new method for teaching practice-based learning and improvement and systems-based practice. Acad Med. 2004; 79: 83–8.
  • Srinivasan M , Hauer KE , Der-Martirosian C , Wilkes M , Gesundheit N . Does feedback matter? Practice-based learning for medical students after a multi-institutional clinical performance examination. Med Educ. 2007; 41: 857–65.
  • Davis DA , Mazmanian PE , Fordis M , Van Harrison R , Thorpe KE , Perrier L . Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006; 296: 1094–102.
  • Wood J , Collins J , Burnside ES , Albanese MA , Propeck PA , Kelcz F , et al. Patient, faculty, and self-assessment of radiology resident performance: a 360-degree method of measuring professionalism and interpersonal/communication skills. Acad Radiol. 2004; 11: 931–9.
  • Martin D , Regehr G , Hodges B , McNaughton N . Using video-taped benchmarks to improve the self-assessment ability of family practice residents. Acad Med. 1998; 73: 1201–6.
  • Schaufelberger M , Frey P , Woermann U , Schnabel K , Barth J . Benefits of communication skills training after real patient exposure. Clin Teach. 2012; 9: 85–8.