Research Article

A structured multimodal teaching approach enhancing musculoskeletal physical examination skills among undergraduate medical students

Article: 2114134 | Received 17 Jan 2022, Accepted 12 Aug 2022, Published online: 22 Aug 2022

ABSTRACT

Current evidence indicates that undergraduate medical students display deficits in musculoskeletal physical examination skills (MPES). While various instructional methods are recommended for teaching clinical skills, effective methods for teaching MPES have not been established. This study compared the effectiveness of a multimodal teaching approach incorporating video-based learning, interactive small-group teaching, hands-on practicing, peer-assisted learning, formative assessment, and constructive feedback with traditional bedside teaching in developing undergraduate orthopedic MPES. Participants were 151 fifth-year medical students divided into two groups. One group received multimodal teaching, and the other received traditional bedside teaching. In both groups, the participants learned how to physically examine the knee and shoulder. The primary outcome was objective structured clinical examination (OSCE) scores, while the secondary outcomes included teaching sessions’ total durations, facilitator’s demonstration time, participants’ practice time, and proportion of students with passing checklist scores and global ratings-based assessments for the two teaching approaches. The multimodal teaching group had significantly higher OSCE scores (checklist scores, global ratings, and passing rates; p = 0.02, 0.02, 0.01, respectively) than the comparison group. Individual OSCE component assessments showed significant improvements in the special musculoskeletal physical examination test. The overall duration and amount of participants’ hands-on time were significantly longer for the multimodal than for the traditional bedside teaching group (p = 0.01 and 0.01, respectively), and the facilitator’s demonstration time was significantly shorter (p = 0.01). The multimodal learner-centered teaching approach evaluated in this study was effective for teaching MPES. It appeared to maximize learner engagement through enhancing interactions and providing increased time to engage in hands-on practice. This teaching approach improved MPES levels, maximized teaching efficiency for scenarios with limited instruction time and resources, and enhanced competency of undergraduate medical students in performing special musculoskeletal physical examinations compared to traditional bedside teaching.

Introduction

Undergraduate clinical training is critical for developing physical examination skills, which are essential for medical graduates to become primary healthcare providers. The development of adequate physical examination skills can allow primary healthcare providers to manage patients with musculoskeletal disorders who do not need specialized care, reducing the burden on hospitals [Citation1,Citation2]. Unfortunately, research on the development of musculoskeletal physical examination skills (MPES) among medical students is limited despite the curriculum-based implementation of clinical teaching [Citation3–11]. Furthermore, undergraduate clinical teaching programs have focused primarily on general screening examination skills rather than on specific physical examination skills [Citation12], an approach that could contribute to undergraduates’ lack of skills development. Additionally, musculoskeletal screening examinations may be insufficient for diagnosing and managing common orthopedic injuries and disorders without a specialized musculoskeletal examination [Citation13].

Medical graduates must be competent in performing MPES, but these skills require a focused approach and include unique special physical examination tests that are not part of general screening musculoskeletal examinations [Citation14]. In addition, these specialized tests require complex psychomotor skills that integrate multiple stimuli (i.e., tactile, visual, and auditory) as well as practical and theoretical knowledge that cannot be entirely learned through reading or observation [Citation15].

Research suggests using lectures, bedside teaching, peer-assisted learning (PAL), case-based discussions, and simulation-based exercises to teach medical students how to conduct MPES [Citation16–21]. However, the continuing deficits in medical students’ examination skills suggest that no single instruction method is ideal [Citation16–19]. Moreover, several studies have proposed novel education methods to help students gain the necessary skills, such as patient educators, multimedia computer-assisted learning, video-based learning (VBL), role-playing, and simulated patients (SPs) [Citation22–25]. However, available data on the effectiveness of these methods for teaching MPES are limited, and any teaching method for undergraduate medical students is likely to fail if it does not include hands-on practice [Citation23].

Prior research has shown that improving students’ clinical skills requires the integrated and structured implementation of teaching methods that involve prior knowledge activation and briefing, bridging existing and new knowledge, elements of interaction and discussion, and opportunity for reflection and feedback [Citation26–28]. Integrated and structured models can be effective means of teaching clinical skills if they maximize learner engagement, provide meaningful learning contexts, demonstrate the importance of imparting relevant knowledge before the teaching sessions (analogous to the flipped classroom) [Citation29], deconstruct complex skills into small steps, and allow for interactive discussions, hands-on practice, performance assessment, and feedback mechanisms [Citation16]. Yu et al. [Citation30] deconstructed regional joint examinations into an introductory session covering the relevant musculoskeletal anatomy, joint range of motion, and palpation of the basic structures in a region, followed by a second session involving the special tests in MPES for each region. The authors reported significantly improved self-confidence related to MPES in the experimental group. Additionally, the objective structured clinical examination (OSCE) scores significantly improved in musculoskeletal stations with medium to large effect sizes across the different stations [Citation30].

A multisensory approach to teaching clinical skills has been suggested for developing optimal learning [Citation31]. This teaching approach integrates different stimulus mechanisms, such as visual and auditory methods, to increase the recognition and retention of information [Citation31]. Diaz et al. [Citation32] observed that teaching anatomy using extracurricular body painting was successful in engaging, motivating, and inspiring participants and first-year anatomy students to learn surface anatomy and develop their physical examination skills. Recently, Modica et al. [Citation27] described a structured teaching approach for MPES to undergraduate medical students using a web-based musculoskeletal audio-visual tutorial, pathophysiology-focused cases, and facilitator preparation for conducting peer practice sessions. Although no significant differences in the OSCE-based assessment scores were observed, satisfaction and the additional benefit of a persistent resource were perceived by students learning through this teaching method [Citation27].

To address undergraduate medical students’ need to learn MPES, we devised a structured multimodal teaching approach incorporating the benefits of the aforementioned methods. In this approach, students engaged in pre-session instructional VBL and handouts (i.e., flipped classroom), interactive small-group teaching, in-session live demonstrations, hands-on practice, PAL followed by formative assessment, and constructive feedback sessions. These methods were utilized sequentially.

Flipped classrooms, including VBL, can address time constraints since essential knowledge is imparted to the students before the physical session [Citation29,Citation33] and allow medical students to learn at their own pace and utilize class time to work through complex concepts [Citation29]. PAL can effectively address the limited availability of SPs by having students practice physical examinations on their peers [Citation34]. Small-group discussions provide a more individualized approach to active learning than large-group sessions [Citation35]. A hands-on approach is appropriate for students practicing MPES to ensure they master the required skills [Citation19]. In addition, a constructive feedback mechanism is required to check the effectiveness of the teaching session; such a mechanism helps strengthen students’ skills while simultaneously addressing their weaknesses [Citation36]. A formative assessment is a reliable method to assess students’ skills and provide feedback after concluding a teaching session [Citation24,Citation37].

Students often experience challenges mastering MPES due to the lack of consistent and standardized teaching methods [Citation38]. In addition, the complexity of MPES may not be addressed using a single instructional method. The total time allotted for classroom instruction is often limited in medical curricula; thus, approaches that prepare students before classroom instruction (i.e., flipped classrooms) can save time [Citation29]. Further, student preferences for learning MPES vary considerably, with most students preferring hands-on practice on real or simulated patients or peers, followed by multimedia and audio-visual tools [Citation23]. Using multiple teaching modalities can potentially address the weaknesses of each individual method, providing flexibility for different learning preferences and a consistent, standardized plan for teaching MPES. The structured multimodal teaching approach proposed above is one such step toward improving MPES. This study aimed to evaluate the effectiveness of the proposed multimodal approach compared with traditional bedside approaches for teaching MPES, based on improvements in MPES competencies, OSCE assessment scores, and the proportion of competent students passing final clinical exams.

Methods

Study design and setting

After receiving approval from the institutional ethical committee (Approval No. 20/0002/IRB), this prospective comparative study was conducted at a medical school in Saudi Arabia, where the undergraduate medical curriculum is taught over seven years and includes a four-week orthopedics course in the fifth year. During those four weeks, the students exclusively receive instruction on orthopedics topics across the knowledge, skills, and attitudes domains. For the orthopedics course, the participants were divided into four groups of 36–38 students each. Each group was further divided into three smaller subgroups of 11–13 participants.

The needs assessment was performed by the medical education and orthopedic departments of the author’s institution through surveys of students’ perceptions of different methods of teaching MPES, with additional feedback from the orthopedic faculty. A pilot study with 30 participants was conducted to ensure the study’s feasibility, appropriate interpretation, and administrative support.

Participants

Fifth-year male undergraduate students (n = 155) were recruited from the 2016–2017 academic cohort (Figure 1). Only one gender was examined because the institution holds separate clinical rotations for male and female students, and the study was conducted during the male students’ rotation. Participants who missed the teaching sessions or did not agree to participate were excluded from the study.

Figure 1. Flow of participants among the groups receiving multimodal and traditional teaching approaches. (OSCE: objective structured clinical examination, PE: physical examination).

Intervention

The MPES teaching was conducted separately for each smaller subgroup. The standard framework of each MPES session consisted of a general examination (i.e., alignment, gait, inspection, and palpation), range of motion assessment, and special tests of individual joints, with additional focus on communication skills, interpretation of physical findings, and the ability to reach a diagnosis.

The duration allotted to physical examination instruction was the same for both groups (i.e., 2 hours), the only difference being the teaching methodology of the two joint MPES sessions. One clinical session was conducted per day for every subgroup, with different subgroups undergoing different joint examination sessions at any given time. For the purpose of this study and to maintain uniformity, the same facilitator – an orthopedic consultant specializing in shoulder and knee surgery – taught all the subgroups the correct knee and shoulder physical examination. Six small-group sessions were conducted for each of the knee and shoulder examinations.

Control group

In the traditional bedside teaching group, the facilitator provided a bedside demonstration of MPES using the standard framework of general examination (alignment/gait/inspection/palpation), range of motion assessment, and special tests with an SP. This was followed by interactive discussions covering communication skills and diagnosis formulation and resolving the participants’ queries. Finally, the participants examined the SP.

Intervention group

The structured multimodal teaching method sequentially combined several learning and assessment methods (Figure 2) and comprised five stages, each involving one or more teaching modalities:

Figure 2. The structured multimodal approach for teaching musculoskeletal physical examination skills. (PE: physical examination, SP: simulated patient).

  1. The Preparatory Stage: This phase used the flipped classroom method, in which students read the study materials and watched instructional videos combining auditory and visual stimuli to build their understanding of MPES before the in-person teaching session. The students received a handout detailing the objectives and the required competencies (i.e., the ability to perform individual components of MPES correctly by the end of the session). It also described the required physical examination tests step-by-step according to the previously described standard framework. The facilitator created two separate instructional online YouTube videos of the shoulder and knee examinations lasting 8 and 11 minutes, respectively. These videos covered the steps of the general examination, range of motion, and special tests. Three expert orthopedic surgeons validated the video content (Supplementary Digital Files 1 and 2). Students were asked to watch the designated video one day prior to the teaching session and then again at the beginning of the session.

  2. The Demonstration Stage: This stage combined bedside teaching with interactive small-group discussions. The facilitator demonstrated the physical examination step-by-step as described in the video using an SP and answered students’ questions. In addition, the facilitator stressed appropriate patient communication during the examination and diagnosis formulation based upon examination findings.

  3. The Practice Stage: This stage focused on the students’ psychomotor domain while learning to perform MPES individually. The students were paired (PAL) to practice all physical examination steps on each other under the facilitator’s supervision.

  4. The Formative Assessment: This step clarified MPES instructions among the students and ensured every student mastered the MPES special tests. The students were asked to perform special tests on the SP or their peer who acted as an SP. Each student in the subgroup was called by name and asked to perform special tests in front of their peers. If any student failed to perform the tests properly, then another student was called to perform the tests. The failing student was then asked to repeat the steps until they were performed correctly.

  5. Constructive Feedback: In this last stage, each student received immediate constructive feedback on their performance from the facilitator and their peers. The feedback was based on Pendleton’s model, a structured approach to improving students’ learning [Citation36]. Pendleton’s model comprises four steps: first, the facilitator asks the student what went well and then tells the student what went well based on the facilitator’s observations; next, the facilitator asks the student what could be improved and then tells the student what could be improved based on the facilitator’s observations [Citation36].

Outcomes

The total teaching time per session, the facilitator’s demonstration time, and the participants’ clinical examination practice time were measured for each group. At the end of the four-week clinical orthopedics course, an OSCE was conducted for each group as part of a summative assessment (main outcome measure). Each OSCE had two stations comprising physical examinations of the knee and shoulder, with time slots of up to six minutes per station. Each participant performed the knee and shoulder physical examinations at the two stations, and each station had one clinical scenario and a well-trained SP with positive clinical findings.

A total of eight different blinded assessors, one per station, evaluated the participants performing the OSCE steps. The assessors were members of the teaching faculty in the orthopedic department (three assistant professors, three associate professors, and two professors selected for the OSCE only). The facilitator who conducted the study was not one of the assessors, and the assessors were not otherwise involved in this study.

The OSCE assessment scores were presented in two ways: pre-validated checklist scores and global rating scales. Each station was assessed using a 10-point pre-validated checklist (Supplementary Tables 1 and 2), with individual scores (minimum = 0, maximum = 1) for each component of the physical examination and overall scores ranging from 0 to 10 across the 10 components of each station. The checklist graded the participants’ performance on each OSCE component as 0 for ‘not performed/incorrectly performed,’ 0.5 for ‘partially correct/partially performed,’ and 1 for ‘correctly performed.’ The assessors also provided global ratings that served as an overall assessment of the OSCE skills, graded from 1 to 5 (1 = fail, 2 = borderline fail, 3 = borderline pass, 4 = clearly pass, and 5 = excellent). The modified borderline group method, a criterion-based standard-setting approach that is reliable for OSCE scoring of large groups, was used to label the participants as having ‘passed’ or ‘failed.’ The overall scores and pass-fail outcomes were compared between the two teaching methods. Additionally, the pass-fail results of the modified borderline approach were compared with those based on the norm-referenced standard-setting scores routinely used by the medical school, for which the cutoff percentage score was fixed at 60%. The available evidence supports the use of criterion-based standard setting, including all borderline grouping methods, for OSCE assessment [Citation39–41]. These methods are more objective, reproducible, and transparent and have good interobserver reliability, supporting their use in the skills assessment of undergraduates [Citation39–41].
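To illustrate how a borderline group method can translate checklist scores and global ratings into a pass mark, the following Python sketch (not the authors’ code) computes a station cut-off as the mean checklist score of borderline-rated candidates and applies it to the cohort; the data, the function name, and the assumption that global ratings of 2–3 define the borderline group are illustrative only.

```python
# Illustrative sketch only (not the authors' implementation): deriving an OSCE
# pass mark with a borderline group method, assuming global ratings of 2
# (borderline fail) and 3 (borderline pass) define the borderline group and the
# cut-off is the mean checklist score of that group.
import statistics

def borderline_group_cutoff(results):
    """results: list of (checklist_score_out_of_10, global_rating_1_to_5) tuples."""
    borderline = [score for score, rating in results if rating in (2, 3)]
    if not borderline:
        raise ValueError("No borderline candidates; the cut-off cannot be derived.")
    return statistics.mean(borderline)

# Hypothetical station results: (checklist score, assessor global rating).
station = [(8.5, 4), (6.5, 3), (9.0, 5), (5.5, 2), (7.0, 3), (4.0, 1), (7.5, 4)]
cutoff = borderline_group_cutoff(station)          # mean of 6.5, 5.5, 7.0
passed = sum(score >= cutoff for score, _ in station)
print(f"Cut-off: {cutoff:.2f}/10; passed: {passed}/{len(station)} candidates")
```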

Statistical analysis

IBM SPSS Statistics for Windows, Version 23.0 (IBM Corp., Armonk, NY, USA) was used for the data analysis. The total teaching time per session, the time the facilitator spent demonstrating the clinical examination, the time the participants spent practicing the clinical examination, and the checklist- and global-rating-based OSCE scores for both groups were expressed as means (standard deviations [SD]). The internal consistency of the OSCE scores was measured using Cronbach’s alpha. Since the scores were not normally distributed according to the Kolmogorov-Smirnov test, the Mann–Whitney U-test for independent samples was used to compare the quantifiable parameters between the teaching groups. An inter-rater reliability analysis using the Kappa statistic was performed to determine consistency among the eight raters; the raters showed very good agreement (κ = 0.86, p = 0.10, 95% CI [0.81–0.89]). The proportions of passing participants in each group based on the checklist scores, and the proportions with ‘clearly pass’ and ‘excellent’ global ratings, were compared between groups using the chi-square test. Finally, the mean scores at the individual knee and shoulder OSCE stations were compared using the Mann–Whitney U-test. Statistical significance was set at p < .05.
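For readers who prefer open-source tools, the sketch below reproduces the main types of comparison described above in Python with NumPy/SciPy. The group sizes, score values, and pass/fail counts are hypothetical, not the study data, and the function shown is an assumed helper rather than part of any published analysis script.

```python
# Illustrative sketch only (the study used SPSS): analogous analyses with
# NumPy/SciPy on made-up data.
import numpy as np
from scipy import stats

def cronbach_alpha(item_scores):
    """Internal consistency; item_scores: rows = students, columns = checklist items."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical overall OSCE checklist scores (out of 10) per group.
multimodal = np.array([8.0, 7.5, 9.0, 8.5, 7.0, 9.5])
traditional = np.array([6.5, 7.0, 6.0, 7.5, 6.5, 7.0])

# Non-parametric comparison of score distributions (scores were not normal).
u_stat, p_scores = stats.mannwhitneyu(multimodal, traditional, alternative="two-sided")

# Chi-square test on pass/fail counts (rows = groups, columns = pass/fail).
chi2, p_pass, dof, expected = stats.chi2_contingency([[70, 5], [55, 21]])

print(f"Mann-Whitney U = {u_stat:.1f} (p = {p_scores:.3f}); "
      f"chi-square = {chi2:.2f} (p = {p_pass:.3f})")
```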

Ethical approval and consent to participate

This study was approved by the King Saud University institutional review board (Approval Number: 20/0002/IRB). In addition, verbal and written informed consent was obtained from all participants.

Results

A total of 155 students were enrolled in this study. Three students (one in the bedside teaching group, two in the multimodal teaching group) were excluded because of their absence from the teaching sessions, and one student in the multimodal teaching group did not agree to participate in the study. Among the remaining 151 students, 76 were assigned to the traditional bedside teaching group and 75 to the multimodal teaching group. The OSCE checklist items showed high internal consistency (overall Cronbach’s alpha = 0.89).

Teaching time analysis

The overall teaching time per session was significantly longer in the multimodal group than in the traditional group. However, the time the facilitator spent demonstrating the examinations was significantly longer in the traditional group. In contrast, the time the participants spent practicing clinical examinations was significantly greater in the multimodal group (Table 1).

Table 1. Comparison of time division among the two teaching methods-based groups.

Assessment scores

The overall checklist-based and global rating scores were significantly higher in the multimodal group than in the traditional group. A subgroup analysis of the individual checklist scores on the general physical examination portion (inspection, palpation, and range of motion), special tests, communication skills, and the ability to reach a diagnosis revealed significant differences between the groups in the combined assessment of the special tests. These differences remained significant for separate assessments of the knee and shoulder physical examinations (Table 2).

Table 2. Comparison of OSCE results between the two study groups.

In the norm-referenced approach (cutoff score of 60%), no significant differences were observed in the overall pass rates in the combined and separate assessments of the knee and shoulder physical examinations. However, in the modified borderline approach (cutoff score of 70%), a significantly higher proportion of participants passed overall in the multimodal group than in the traditional group, and significant differences were observed in both the combined and separate assessments of the knee and shoulder physical examinations. Regarding global-ratings-based assessments, a significantly higher proportion of students had ‘clearly pass’ or ‘excellent’ global ratings in the multimodal group than in the traditional group in both the combined and separate assessments of the knee and shoulder OSCEs. Most students in the traditional bedside teaching group had ‘borderline pass’ global ratings for the knee and shoulder assessments separately as well as overall, whereas most students in the multimodal teaching group had ‘clearly pass’ global ratings for the knee and shoulder assessments separately and overall. Figure 3 and Table 3 present these results.

Figure 3. (a) Proportion of passing students in the multimodal and traditional teaching groups at the institutional cutoff score of 60% and modified borderline grouping-based cutoff score of 70% for knee and shoulder OSCEs, combined and separately. (b) Global-ratings-based assessment of students among the multimodal and traditional bedside teaching approaches.

Table 3. Significance of proportion-based differences between structured multimodal and traditional bedside teaching groups.

Analysis of OSCE scores for individual joints

Compared with the traditional group, the multimodal group showed significantly higher checklist-based scores for the individual OSCE items in five of the six special tests of the knee joint examination (Supplementary Table S1) and four of the five special tests of the shoulder examination (Supplementary Table S2). However, no statistically significant differences were found between the groups regarding the general examination, range of motion, communication skills, and diagnosis formulation for either joint.

Discussion

The current findings highlight the advantages of a multimodal teaching method for teaching and learning MPES. Comparisons of the two teaching models showed a notable change in special musculoskeletal physical examination test skills.

The current study findings suggest a shifting paradigm in teaching MPES that moves from traditional bedside teaching to a multimodal approach, similar to how medical education evolved from the traditional Flexnerian model to the competency-based curriculum [Citation42]. Traditional teaching models in medical education include more observation and less participation [Citation43,Citation44]. Additionally, the domains related to communication, the doctor-patient relationship, ethics, and professionalism have not been addressed effectively in traditional curricula [Citation45]. MPES teaching methods are likely similar, considering the ample evidence demonstrating that important special tests in musculoskeletal examination are often not performed and remain undocumented by doctors [Citation46–48]. Teaching-learning methods in competency-based curricula should be learner-centered [Citation42]. Furthermore, to make learning more effective, such methods should use multiple modalities to address all three learning domains (i.e., knowledge, skills, and attitudes) [Citation49,Citation50]. Fleming [Citation51] suggested that students learn by visual, auditory, reading/writing, and kinesthetic sensory modalities. The fact that most students benefit from more than one learning style has been well established, suggesting the need for a multimodal teaching approach [Citation52,Citation53]. Previous researchers have advocated using multimodal methods to teach clinical competencies [Citation52–54].

In this study, the performance on the general part of the physical examination was satisfactory in both groups, which may be explained by the fact that the psychomotor skills needed for this part are relatively simple. However, there were significant differences between the two groups in special test performance because these tests assess complex psychomotor skills spanning all three learning domains. Therefore, the use of multiple modalities can help achieve MPES-related competencies, although the evidence for a multimodal approach to MPES teaching is limited. There are several methods for medical teaching, but active learning methods that include student participation are preferable for competency-based teaching [Citation42], and combining different learning methods may strengthen students’ learning and address differences in their learning styles [Citation52,Citation55,Citation56]. While facilitators can include multiple teaching methods for different competencies, those involving MPES should be tailored to methods that allow students to perform the desired skills correctly. Practicing on peers/instructors/SPs/real patients has been identified as the preferred learning modality for musculoskeletal learning among medical students; thus, such methods should be incorporated into MPES instruction [Citation23]. Our findings suggest that proficiency in MPES performance may not be sufficiently addressed by traditional bedside teaching alone. In a traditional curriculum, a teacher simply demonstrating the MPES, with or without individual student participation, cannot ensure that all students will perform the MPES in the demonstrated manner and have the chance to perform all required skills correctly. Therefore, traditional bedside teaching is more teacher-centered than learner-centered: the learner may acquire knowledge of the examination but may not be able to perform the clinical examination skills as taught. The multimodal approach described in this study is learner-centered and provides a standardized framework for teaching and learning MPES to ensure that all students can perform predefined MPES competencies successfully and efficiently. The different teaching modalities incorporated in the different phases of the multimodal approach were carefully planned to ensure that all students could achieve the desired outcome. Since traditional bedside teaching is a widely used method for MPES teaching worldwide, there is a need to shift to a structured multimodal approach that can potentially help improve MPES performance among medical students.

Our study also suggests that the conventionally used norm-referenced standard settings may not be suitable for MPES assessment. While no significant differences were found between the two teaching groups using the norm-referenced standard (i.e., the institutional cut-off), significant differences were observed with the modified borderline standard setting. The different cut-offs (~60% for the norm-referenced method vs. ~70% for the modified borderline method) could explain the discrepancy between the two standard-setting methods. The modified borderline approach is a criterion-referenced standard setting based on the minimally acceptable competency level required to pass each step in the exam. Comparing the individual components of the checklist-based score, the special test scores were significantly lower in the bedside teaching group than in the multimodal group. Therefore, candidates should be divided into borderline pass and fail groups by experts’ global ratings to better assess MPES [Citation57]. We used the modified borderline group method, which is reliable for OSCE scoring of a large group, and the mean borderline scores were used to calculate the cut-off scores [Citation39]. A criterion-referenced standard setting compares performance against a set standard or threshold and is preferable for competency-based learning since the outcomes should be measurable. In contrast, norm-based assessments compare students’ performance with that of their peer group; such an assessment may not reflect what a student can or cannot do and cannot determine whether the desired competency outcomes were met. Our study supports the use of criterion-based assessment for MPES since a norm-based standard setting is not accurate when assessing a competency. Regarding this study’s findings, we feel that students’ better special test performance in the multimodal teaching group arose because each student had the opportunity to perform all required tests under supervision, unlike in the traditional bedside teaching group, where participation was voluntary.

Few previous studies have investigated the role of multimodal teaching in physical examination skills among medical students. Allen et al. [Citation58] analyzed OSCE scores after structured history and clinical examination exercises for back pain evaluation and suggested that the lack of modalities stimulating audiovisual learning, hands-on practice, and feedback mechanisms resulted in suboptimal OSCE performance among medical students. Hands-on practice added to structured teaching, especially with supervision, has been found to be effective for various skills [Citation59]. Modica et al. [Citation27] used a similar approach for musculoskeletal diagnosis- and pathology-focused cases; however, no OSCE score improvements were observed. Although the OSCE assessment may have been partly biased, the students were satisfied with the structured teaching approach.

Although teaching sessions were longer in the multimodal group than in the traditional group, the facilitator in the multimodal group spent less than one-fourth as much time demonstrating skills as in the traditional group. Most of the additional time in the multimodal group was devoted to participants’ hands-on practice, indicating better opportunities for students’ MPES learning. Thus, bedside teaching may be considered a teacher-centered approach, while multimodal teaching is learner-centered. Traditional methods are predominantly didactic [Citation44]; the learner-centered approach focuses on methods that provide better learning opportunities to students through their active participation [Citation44].

The flipped classroom in the preparatory phase (i.e., VBL and instructional handouts) imparted basic knowledge to participants and outlined the expectations for them during the examinations [Citation29]. This prior knowledge may have made the demonstration easier for the facilitator since students may have been able to adapt most of what they had learned from the VBL [Citation26].

It has been stressed that students’ learning methods vary widely. The well-known VARK model of adult learning styles distinguishes visual, auditory, reading/writing, and kinesthetic learners [Citation50]. The multimodal approach provides an integrated approach that caters to each student’s learning style, with different components promoting visual, auditory, and text-based learning. Furthermore, the hands-on segment adds the element of learning by doing, which can potentially boost students’ confidence [Citation19].

The multiple sensory stimulations used in the current study provide multiple opportunities for students to grasp MPES in their preferred ways of learning. Information from one modality can influence information processing in another modality, and information from different sensory modalities can be combined into a single multisensory event [Citation60]. As far as human learning behavior is concerned, a single-sensory teaching methodology may not facilitate optimal learning; multisensory stimulation is important for optimally developing skills and learning [Citation31]. Some studies have suggested improving clinical skills through multisensory stimulation [Citation60,Citation61].

In contrast, students in the traditional group spent approximately one-fifth as much time conducting physical examinations as those in the multimodal group. One reason for this might be that the participants in the traditional group were less confident about performing the physical examination and feared confrontation with peers or making mistakes while being observed by them. The multimodal method, with its multifaceted approach, may help address these participation-related issues.

Specifically, the research showed that providing participants with a preview of the lessons through VBL could strengthen their knowledge and reduce the time required to understand concepts. Strengthened knowledge develops participants’ interest in performing examinations and resolves their doubts [Citation62]. Moreover, studies suggest that VBL can be an effective method for teaching physical examination and clinical skills to large student groups and is perceived well by the students [Citation63,Citation64]. It has proven to be a cost-effective and scalable resource for teaching clinical skills [Citation28,Citation65,Citation66]. Unlike a lecture/demonstration, which progressively moves to new topics, students can view each step repeatedly until they are confident of doing it independently [Citation66]. Easy access to instructional videos can help students retain knowledge, develop self-confidence, practice independently, and reduce errors [Citation28,Citation65]. However, despite its obvious advantages, VBL alone is unlikely to be enough for students to develop MPES skills; they also need hands-on practice. Accordingly, a blended method is proposed rather than a single method based solely on VBL.

Interactive small-group teaching is a learning method that uses a student-centered approach [Citation67]. Learning in smaller groups enhances individual attention, student-teacher interactions, and hands-on experiences [Citation35]. Small-group teaching has been shown to be advantageous for MPES teaching, leading to improved student satisfaction and scores [Citation68,Citation69]. In the present study, each small group had a maximum of 12 participants, which improved each participant’s chance of receiving appropriate attention and getting involved in group activities. For both groups, teaching sessions were conducted using only small-group teaching, which might explain why comparable scores in general physical examination skills were observed between the two groups. The teaching-related modifications introduced by the multimodal approach might also have contributed to the enhanced special test skills observed in the multimodal group.

In PAL, participants assisted each other in teaching and learning in a supportive manner. PAL addresses the limitations arising from the unavailability of SPs for all sessions, as students practice clinical examination steps with their peers [Citation34]. It adds the element of practice and allows each student to build confidence in conducting physical examinations in front of their peers. Several studies have confirmed the effectiveness of PAL in teaching clinical skills [Citation70–72]. Burke et al. [Citation73] and Graham et al. [Citation74] suggested that PAL could effectively improve MPES in medical students and that students could act as effective trainers. The current study’s findings support this: the scores for special test skills, which generally require more precise understanding and practice, were better in the multimodal group, in which PAL was an integral component. PAL supports learners’ cognitive, psychomotor, and affective development [Citation75]. Students taught with a multimodal approach might be more comfortable practicing and performing in front of their peers, which should translate to greater confidence when examining patients.

To assess whether the participants achieved each session’s main objectives, formative assessments were used at the end of each session. These formative assessments also provided facilitators and participants with immediate objective feedback. Timely feedback can help teachers and students identify corrective actions to improve learning [Citation76]. Pendleton’s feedback model used in this study ensured a bidirectional flow of information, which helps improve student learning outcomes [Citation36]. Such a feedback process is in line with the concept of feedback literacy, which has a framework of four interrelated features: appreciating feedback, making judgments, managing affect, and taking action [Citation77]. The combination of facilitators’ and students’ observations helps generate constructive feedback. As the feedback was immediate, the students had the advantage of resolving their doubts and correcting any deficits immediately. However, further evidence is required to understand how such feedback strengthening affects students’ MPES performance.

Unlike specialty residents or trainees, undergraduate medical students’ training and practice are not regular and are bound to change with every specialty-specific training period. The multimodal teaching approach increases students’ practice time and likely helps improve OSCE performance. However, such an approach may not have the same outcomes for specialty trainees, who learn skills through regular practice and have much longer practice exposure. Similarly, fields that involve more hands-on practice, such as physiotherapy, may not benefit equally from a new approach, considering that hands-on training is an integral part of the specialty. For example, in a randomized trial, Hossain et al. [Citation78] found no significant increase in knowledge or confidence among physiotherapy students regarding the physiotherapy management of spinal cord injuries after an integrated online module course, nor were the students more satisfied with the learning experience.

Limitations

This study had some limitations. First, the design was not randomized and had a planned sequence of teaching group allocations. However, this nonrandomized sequence did not entail preferential group allocation; instead, group allocation was decided by the administrative plan, and students were unaware of their group before participating. Second, the effectiveness of the structured multimodal MPES teaching method on knee and shoulder joint skills was evaluated after only four weeks of implementation; future research should conduct more extensive analyses of the feasibility of curricular implementation and its long-term implications for student performance. Third, the OSCE was not conducted concomitantly in the two groups; the OSCE’s different timings and questions might have impacted the participants’ performance. Fourth, the study could not determine which step of the multimodal teaching approach was specifically responsible for improving students’ OSCE performance; a combination and sequence of all the involved methods may have contributed to the outcomes.

Fifth, the study evaluated only male students’ skills, and a mixed-gender sample could have yielded different results. Gender differences in the acquisition of MPES have not been investigated previously. The available evidence regarding clinical examination skills among male and female students suggests differences in their approaches; these differences could be influenced by the gender profile of facilitators, role models, patients’ profiles, the type of clinical examination, and other factors yet to be established [Citation79,Citation80]. While gender differences cannot be denied, involving students of one gender could potentially standardize these factors among the participants and reflect the effects of the teaching approach in general.

Lastly, the students in the multimodal teaching group might have watched the examination videos multiple times and practiced the clinical examinations, which could potentially have resulted in higher assessment scores in the intervention group. However, increased video watching and practicing would ultimately serve the multimodal approach’s purpose and benefit the students.

Despite these limitations, this research shows that a structured multimodal teaching approach could be more effective than the traditional bedside method in improving medical students’ substandard performance in MPES. Further studies are required to bolster and expand upon the current evidence.

Conclusions

The described learner-centered, multimodal teaching approach can improve the competency of undergraduate medical students in MPES performance. This structured teaching approach can help improve teaching efficiency given the limited instruction time and build interest among students, since they spend more time practicing than observing demonstrations. Although bedside teaching-trained students may remain competent in general physical examination, diagnosis formulation, and communication skills, students trained with the structured approach perform better in musculoskeletal physical examination special tests. A curriculum-based implementation of such an approach can potentially improve MPES among undergraduate students.

Lessons for practice

  • The structured multimodal teaching method was more effective than a single-method teaching approach in improving medical students’ MPES.

  • Incorporating VBL, PAL, and hands-on practice in physical examination teaching could maximize learner engagement, provide meaningful learning contexts, improve MPES performance and students’ practice time, and maximize teaching efficiency.

  • Formative assessments followed by constructive feedback are appropriate tools that can be used at the end of a teaching session to assess whether students achieved the required competencies and can help teachers and students identify corrective actions to improve learning.

Ethical approval and consent to participate

This study was approved by the institutional review board of King Saud University, Kingdom of Saudi Arabia (Approval No. 20/0002/IRB). In addition, verbal informed consent was obtained from the participants.

Author contributions

AZA created and implemented the multimodal approach, designed the VBL, was the main facilitator for the teaching sessions, collected the data, and wrote the manuscript.

Acknowledgments

The author would like to thank the College of Medicine Research Center, Deanship of Scientific Research, King Saud University, for supporting this project.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Supplementary Material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/10872981.2022.2114134

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

References

  • Altschuler J, Margolius D, Bodenheimer T, et al. Estimating a reasonable patient panel size for primary care physicians with team-based task delegation. Ann Fam Med. 2012;10(5):396–14.
  • Houston TK, Connors RL, Cutler N, et al. A primary care musculoskeletal clinic for residents: success and sustainability. J Gen Intern Med. 2004;19:524–529.
  • Hauer KE, Teherani A, Kerr KM, et al. Student performance problems in medical school clinical skills assessments. Acad Med. 2007 Oct;82(10 Suppl):S69–72. PMID: 17895695
  • Faustinella F, Jacobs RJ. The decline of clinical skills: a challenge for medical schools. Int J Med Educ. 2018 Jul;9:195–197. PMID: 30007951; PMCID: PMC6129153
  • van der Vleuten CP. Competency-based education is beneficial for professional development. Perspect Med Educ. 2015 Dec;4(6):323–325. PMID: 26553242; PMCID: PMC4673061
  • Alman BA, Ferguson P, Kraemer W, et al. Competency-based education: a new model for teaching orthopaedics. Instr Course Lect. 2013 Jan;62:565–569. PMID: 23395058.
  • Freedman KB, Bernstein J. Educational deficiencies in musculoskeletal medicine. J Bone Joint Surg Am. 2002 Apr;84(4):604–608. PMID: 11940622
  • Truntzer J, Lynch A, Kruse D, et al. Musculoskeletal education: an assessment of the clinical confidence of medical students. Perspect Med Educ. 2014 Jun;3(3):238–244. PMID: 24865889; PMCID: PMC4078053
  • Queally JM, Kiely PD, Shelly MJ, et al. Deficiencies in the education of musculoskeletal medicine in Ireland. Ir J Med Sci. 2008 Jun;177(2):99–105. Epub 2008 Apr 15. PMID: 18414969
  • Walker DJ, Kay LJ. Musculoskeletal examination for medical students: the need to agree what we teach. Rheumatology (Oxford). 2002 Nov;41(11):1221–1223. PMID: 12421993
  • Peitzman SJ, Cuddy MM. Performance in physical examination on the USMLE Step 2 clinical skills examination. Acad Med. 2015 Feb;90(2):209–213. PMID: 25406608
  • Doherty M, Dacre J, Dieppe P, et al. The ‘GALS’ locomotor screen. Ann Rheum Dis. 1992 Oct;51(10):1165–1169. PMID: 1444632; PMCID: PMC1012427
  • Almoallim H, Kalantan D, Alharbi L, Albazli K. Approach to musculoskeletal examination. In: Skills in rheumatology. Singapore: Springer; 2021. p. 17–65.
  • Hendrick P, Bond C, Duncan E, et al. Clinical reasoning in musculoskeletal practice: students’ conceptualizations. Phys Ther. 2009 May;89(5):430–442. Epub 2009 Mar 27. PMID: 19329773
  • Bloom BS. Taxonomy of educational objectives: the classification of educational goals. New York, NY: Longmans, Green; 1956.
  • Easton G, Stratford-Martin J, Atherton H. An appraisal of the literature on teaching physical examination skills. Educ Prim Care. 2012 Jul;23(4):246–254. PMID: 22925956
  • Gowda D, Blatt B, Fink MJ, et al. A core physical exam for medical students: results of a national survey. Acad Med. 2014 Mar;89(3):436–442. PMID: 24448049
  • Uchida T, Farnan JM, Schwartz JE, et al. Teaching the physical examination: a longitudinal strategy for tomorrow’s physicians. Acad Med. 2014 Mar;89(3):373–375. PMID: 24448055
  • Danielson AR, Venugopal S, Mefford JM, et al. How do novices learn physical examination skills? A systematic review of the literature. Med Educ Online. 2019 Dec;24(1):1608142. PMID: 31032719; PMCID: PMC6495115
  • McGaghie WC, Issenberg SB, Petrusa ER, et al. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63.
  • Roberts D. Higher education lectures: from passive to active learning via imagery? Active Learn High Educ. 2019;20(1):63–77.
  • Ahmet A, Gamze K, Rustem M, et al. Is video-based education an effective method in surgical education? A systematic review. J Surg Educ. 2018 Sep–Oct;75(5):1150–1158. Epub 2018 Feb 12. PMID: 29449162
  • Blake T. Teaching musculoskeletal examination skills to UK medical students: a comparative survey of rheumatology and orthopaedic education practice. BMC Med Educ. 2014 Mar;14:62. PMID: 24678598; PMCID: PMC3973615
  • Smith MD, Henry-Edwards S, Shanahan EM, et al. Evaluation of patient partners in the teaching of the musculoskeletal examination. J Rheumatol. 2000 Jun;27(6):1533–1537. PMID: 10852285
  • Humphrey-Murto S, Smith CD, Touchie C, et al. Teaching the musculoskeletal examination: are patient educators as effective as rheumatology faculty? Teach Learn Med. 2004;16(2):175–180. PMID: 15276895
  • Spencer J. Learning and teaching in the clinical environment. BMJ. 2003 Mar;326(7389):591–594. PMID: 12637408; PMCID: PMC1125480
  • Modica RF, Thundiyil JG, Chou C, et al. Teaching musculoskeletal physical diagnosis using a web-based tutorial and pathophysiology-focused cases. Med Educ Online. 2009;14:13. Published 2009 Sep 28.
  • Benjamin JC, Groner J, Walton J, et al. A blended curriculum to improve resident physical exam skills for patients with neuromuscular disability. MedEdPORTAL. 2019 Jan;15:10792. PMID: 30800992; PMCID: PMC6354795
  • Bhai SA, Poustinchian B. The flipped classroom: a novel approach to physical examination skills for osteopathic medical students. J Osteopath Med. 2021;121(5):475–481.
  • Yu JC, Guo Q, Hodgson CS. Deconstructing the joint examination: a novel approach to teaching introductory musculoskeletal physical examination skills for medical students. MedEdPORTAL. 2020;16:10945. Published 2020 Sep 4.
  • Shams L, Seitz AR. Benefits of multisensory learning. Trends Cogn Sci. 2008;12(11):411–417.
  • Diaz CM. Beyond the classroom: inspiring medical and health science students to learn surface anatomy. Med Sci Educ. 2022;1–10. DOI:10.1007/s40670-022-01521-0
  • Mir MA, Marshall RJ, Evans RW, et al. Comparison between videotape and personal teaching as methods of communicating clinical skills to medical students. Br Med J (Clin Res Ed). 1984;289(6436):31–34.
  • Outram S, Nair BR. Peer physical examination: time to revisit? Med J Aust. 2008;189(5):274–276.
  • Perrig M, Berendonk C, Rogausch A, et al. Sustained impact of a short, small group course with systematic feedback in addition to regular clinical clerkship activities on musculoskeletal examination skills—a controlled study. BMC Med Educ. 2016 Jan;16:35. PMID: 26821664; PMCID: PMC4731988
  • Pendleton D, Schofield T, Tate P, et al. The consultation: an approach to learning and teaching. Oxford: Oxford University Press; 1984.
  • Mitra NK, Barua A. Effect of online formative assessment on summative performance in integrated musculoskeletal system module. BMC Med Educ. 2015;15:29. Published 2015 Mar 3.
  • Sabesan VJ, Schrotenboer A, Habeck J, et al. Musculoskeletal education in medical schools: a survey of allopathic and osteopathic medical students. J Am Acad Orthop Surg Glob Res Rev. 2018 Jun 28;2(6):e019.
  • Liu M, Liu KM. Setting pass scores for clinical skills assessment. Kaohsiung J Med Sci. 2008 Dec;24(12):656–663. PMID: 19251562
  • Pell G, Fuller R, Homer M, et al. How to measure the quality of the OSCE: a review of metrics—AMEE guide no. 49. Med Teach. 2010 Sep;32(10):802–811. PMID: 20854155
  • Dwivedi NR, Vijayashankar NP, Hansda M, et al. Comparing standard setting methods for objective structured clinical examinations in a Caribbean medical school. J Med Educ Curric Dev. 2020 Dec;7:2382120520981992. PMID: 33447662; PMCID: PMC7780167
  • Carraccio C, Wolfsthal SD, Englander R, et al. Shifting paradigms: from Flexner to competencies. Acad Med. 2002;77(5):361–367.
  • Cooke M, Irby DM, Sullivan W, et al. American medical education 100 years after the Flexner report. N Engl J Med. 2006;355(13):1339–1344.
  • Spencer JA, Jordan RK. Learner centred approaches in medical education. BMJ. 1999;318(7193):1280–1283.
  • Shah N, Desai C, Jorwekar G, et al. Competency-based medical education: an overview and application in pharmacology. Indian J Pharmacol. 2016;48(Suppl 1):S5–9.
  • Lillicrap MS, Byrne E, Speed CA. Musculoskeletal assessment of general medical in-patients–joints still crying out for attention. Rheumatology (Oxford). 2003 Aug;42(8):951–954.
  • Myers A, McDonagh JE, Gupta K, et al. More ‘cries from the joints’: assessment of the musculoskeletal system is poorly documented in routine paediatric clerking. Rheumatology (Oxford). 2004 Aug;43(8):1045–1049.
  • Ahern MJ, Soden M, Schultz D, et al. The musculo-skeletal examination: a neglected clinical skill. Aust N Z J Med. 1991 Jun;21(3):303–306.
  • Prithishkumar IJ, Michael SA. Understanding your student: using the VARK model. J Postgrad Med. 2014;60(2):183–186.
  • Bokhari NM, Zafar M. Learning styles and approaches among medical education participants. J Educ Health Promot. 2019;8:181.
  • Fleming ND. I’m different; not dumb. Modes of presentation (VARK) in the tertiary classroom. In Research and development in higher education, Proceedings of the 1995 Annual Conference of the Higher Education and Research Development Society of Australasia (HERDSA), HERDSA, Rockhampton, Queensland, Australia. 1995 Jul 4 (Vol. 18, pp. 308–313).
  • Prescott GM, Nobel A. A multimodal approach to teaching cultural competency in the doctor of pharmacy curriculum. Am J Pharm Educ. 2019;83(4):6651.
  • Kim RH, Gilbert T, Ristig K, et al. Surgical resident learning styles: faculty and resident accuracy at identification of preferences and impact on ABSITE scores. J Surg Res. 2013;184(1):31–36.
  • Marsh MC, Reed SM, Mahan JD, et al. Advanced multimodal communication curriculum for pediatric residents. J Med Educ Curric Dev. 2021;8:23821205211035239. Published 2021 Oct 4.
  • Martin M, Vashisht B, Frezza E, et al. Competency-based instruction in critical invasive skills improves both resident performance and patient safety. Surgery. 1998;124(2):313–317.
  • Prober CG, Khan S. Medical education reimagined: a call to action. Acad Med. 2013;88(10):1407–1410.
  • Shulruf B, Turner R, Poole P, et al. The objective borderline method (OBM): a probability-based model for setting up an objective pass/fail cut-off score in medical programme assessments. Adv Health Sci Educ Theory Pract. 2013;18(2):231–244.
  • Allen SS, Bland CJ, Harris IB, et al. Structured clinical teaching strategy. Med Teach. 1991;13(2):177–184.
  • Blue AV, Stratton TD, Plymale M, et al. The effectiveness of the structured clinical instruction module. Am J Surg. 1998;176(1):67–70.
  • Quak M, London RE, Talsma D. A multisensory perspective of working memory. Front Hum Neurosci. 2015;9:197. Published 2015 Apr 21.
  • Biron VL, Harris M, Kurien G, et al. Teaching cricothyrotomy: a multisensory surgical education approach for final-year medical students. J Surg Educ. 2013;70(2):248–253.
  • Brame CJ. Effective educational videos: principles and guidelines for maximizing student learning from video content. CBE Life Sci Educ. 2016;15(4):es6.
  • Alomar AZ. Undergraduate medical students’ perceptions of an online audio-visual-based module for teaching musculoskeletal physical examination skills. J Med Educ Curric Dev. 2022;9:23821205221078794. Published 2022 Feb 23.
  • Knauber J, König AK, Herion T, et al. “Heidelberg standard examination”—Final year students’ experiences with a handbook and instructional videos to improve medical competence in conducting physical examinations. GMS J Med Educ. 2018 Aug;35(3):Doc38. PMID: 30186948; PMCID: PMC6120156
  • Jang HW, Kim KJ. Use of online clinical videos for clinical skills training for medical students: benefits and challenges. BMC Med Educ. 2014 Mar;14:56. PMID: 24650290; PMCID: PMC3994418
  • Fox G. Teaching normal development using stimulus videotapes in psychiatric education. Acad Psychiatry. 2003;27:283–288.
  • Remesh A. Microteaching, an efficient technique for learning effective teaching. J Res Med Sci. 2013 Feb;18(2):158–163. PMID: 23914219; PMCID: PMC3724377
  • Lawry GV 2nd, Schuldt SS, Kreiter CD, et al. Teaching a screening musculoskeletal examination: a randomized, controlled trial of different instructional methods. Acad Med. 1999 Feb;74(2):199–201. PMID: 10065062
  • Dolmans DH, Wolfhagen IH, Essed GG, et al. The impacts of supervision, patient mix, and numbers of students on the effectiveness of clinical rotations. Acad Med. 2002 Apr;77(4):332–335. PMID: 11953302
  • Haist SA, Wilson JF, Brigham NL, et al. Comparing fourth-year medical students with faculty in the teaching of physical examination skills to first-year students. Acad Med. 1998 Feb;73(2):198–200. PMID: 9484194
  • Perkins GD, Hulme J, Bion JF. Peer-led resuscitation training for healthcare students: a randomised controlled study. Intensive Care Med. 2002 Jun;28(6):698–700. Epub 2002 Apr 24. PMID: 12107673
  • Field M, Burke J, Lloyd D, et al. Peer-assisted learning in clinical examination. Lancet. 2004 Feb;363(9407):490–491. PMID: 14962535
  • Burke J, Fayaz S, Graham K, et al. Peer-assisted learning in the acquisition of clinical skills: a supplementary approach to musculoskeletal system training. Med Teach. 2007 Sep;29(6):577–582. PMID: 17978969
  • Graham K, Burke JM, Field M. Undergraduate rheumatology: can peer-assisted learning by medical students deliver equivalent training to that provided by specialist staff? Rheumatology (Oxford). 2008 May;47(5):652–655. Epub 2008 Mar 17. PMID: 18346975
  • Secomb J. A systematic review of peer teaching and learning in clinical education. J Clin Nurs. 2008 Mar;17(6):703–716. Epub 2007 Nov 30. PMID: 18047577
  • Jain V, Agrawal V, Biswas S. Use of formative assessment as an educational tool. J Ayub Med Coll Abbottabad. 2012 Jul–Dec;24(3–4):68–70. PMID: 24669614
  • Carless D, Boud D. The development of student feedback literacy: enabling uptake of feedback. Assess Eval High Educ. 2018;43:1315–25.
  • Hossain MS, Shofiqul Islam M, Glinsky JV, et al. A massive open online course (MOOC) can be used to teach physiotherapy students about spinal cord injuries: a randomised trial. J Physiother. 2015;61(1):21–27.
  • Sabet F, Zoghoul S, Alahmad M, et al. The influence of gender on clinical examination skills of medical students in Jordan: a cross-sectional study. BMC Med Educ. 2020;20(1). DOI:10.1186/s12909-020-02002-x
  • Wiskin CM, Allan TF, Skelton JR. Gender as a variable in the assessment of final year degree-level communication skills. Med Educ. 2004;38(2):129–137.