Trend Article

A student-initiated objective structured clinical examination as a sustainable cost-effective learning experience

Article: 1440111 | Received 11 Nov 2017, Accepted 07 Feb 2018, Published online: 26 Feb 2018

ABSTRACT

Background: The objective structured clinical examination (OSCE) has gained widespread use as a form of performance assessment. However, opportunities for students to participate in practice OSCEs are limited by the financial, faculty and administrative investments required.

Objectives: To determine the feasibility and acceptability of a student-run mock OSCE (MOSCE) as a learning experience for medical students of all 4 years.

Design: We conducted a five-station MOSCE for third-year students. This involved fourth-year students as examiners and first-/second-year students as standardized patients (SPs). Each examiner scored examinees using a checklist and global rating scale while providing written and verbal feedback. MOSCE stations and checklists were designed by students and reviewed by a faculty supervisor. Following the MOSCE, participants completed surveys eliciting their perceptions of the roles they had taken during the MOSCE.

Results: Fifty examinees participated in the MOSCE. Of these, 42 (84%) consented to participate in the study and submitted completed questionnaires. Twenty-four examiners participated in the MOSCE and consented to participate in the study, with 22 (92%) submitting completed questionnaires. Fifty-three of 60 SPs (88%) agreed to take part in this study, and 51 (85%) completed questionnaires. The internal consistency of the five-station MOSCE was calculated as a Cronbach’s alpha of 0.443. Students commented positively on having the opportunity to network, engage in mentorship activities, and reinforce clinical concepts.

Conclusions: Examinees, examiners, and SPs all perceived the MOSCE to be a beneficial learning experience. We found the MOSCE to be a feasible and acceptable means of providing additional OSCE practice to students prior to higher-stakes evaluations.

Introduction

The objective structured clinical examination (OSCE) is accepted as a robust method of performance assessment in medical education [Citation1]. Developed in 1975 by Harden and his colleagues [Citation2], the OSCE has since been shown to be a reliable, valid, and accurate measure of clinical skills [Citation3–Citation5], leading to its use for both summative and formative purposes.

While OSCEs serve as a powerful learning opportunity for medical trainees, implementing them regularly within medical curricula presents several challenges. For example, there are significant financial costs associated with running an OSCE [Citation3]. The majority of expenses often result from compensating standardized patients (SPs), examiners, support staff, and the OSCE lead [Citation6]. Furthermore, even in formative, low-stakes settings, OSCEs can be anxiety-inducing experiences for students [Citation7,Citation8]. In light of these factors, we sought to implement a mock OSCE (MOSCE) in which the organizing team and all participants – examiners, examinees, and SPs – were students.

The concept of using students as either examiners or SPs is not a novel one. A recent scoping review on peer assessment in OSCEs found that using peer evaluators in an OSCE is appropriate in formative settings, promoting learning for all the students involved [Citation9]. Though the literature on student SPs is less extensive, medical students have also been shown to be reliable and cost-effective SPs in formative OSCE settings [Citation10]; furthermore, as with student examiners, Mavis and colleagues [Citation11] reported that student SPs benefited from learning experiences gained during the OSCE encounter.

At our institution, we aimed to develop a sustainable learning intervention to increase opportunities for OSCE practice. Our MOSCE is designed and implemented by students, synthesizing the concepts of students as peer evaluators and as SPs. The primary objective of the present study was to evaluate the feasibility and acceptability of our student-led MOSCE as an educational event for students in all 4 years of the undergraduate medical program.

Methods

Study context and participants

At the University of Ottawa, the undergraduate medical program spans 4 years: the first two years (pre-clerkship) are spent on classroom learning and the last two years (clerkship) are spent gaining practical experience in the clinical setting. The Faculty of Medicine currently administers formative OSCEs in Years I, II, and III of the training program and summative OSCEs in Years II and IV. The Year III OSCE introduces clerkship students to management and counselling stations, which are markedly more complex station types than those experienced in pre-clerkship. While formative, the Year III OSCE serves to identify at-risk students; thus, borderline or failing performance in this OSCE may result in counselling, closer scrutiny, extra coaching, and/or remediation. However, prior to the Year III OSCE there are no opportunities for students to practice these new, more complex station types. We identified this gap as a suitable niche for our MOSCE.

University of Ottawa third-year medical students were recruited as examinees to gain exposure to clerkship-level stations prior to their Year III OSCE. Fourth-year volunteers were recruited to act as examiners. First- and second-year volunteers were recruited to be SPs in the MOSCE, due to their medical knowledge and their interest in observing a clerkship-level OSCE. The authors were excluded from participating in the MOSCE.

Students in each of the above groups were signed up on a first-come, first-served basis and received training as described further below. The same examiners remained throughout all three iterations of the MOSCE (Figure 1). The examinees and SPs each took part in only one iteration.

Figure 1. OSCE circuit schematic.

This figure illustrates how a single iteration of the OSCE was structured. During each iteration, four identical circuits with five stations each were run simultaneously as shown (three were done in English, while the fourth was done in French). This allowed for 20 examinees to participate in each iteration. Three identical iterations were carried out sequentially, allowing a maximum of 60 student examinees to participate over the course of the evening.


Study participants were recruited on a voluntary basis from the students taking part in the MOSCE. The study was approved by the Ottawa Health Science Network Research Ethics Board. All study participants provided informed consent. Participant identification numbers were assigned to anonymize data. Data collected from non-consenting students were discarded and not included in analysis.

One of the goals of the student-led MOSCE was to prepare third-year students for the upcoming formative Year III OSCE. Since the Year III OSCE at our institution is modelled after the OSCE portion of the Medical Council of Canada Qualifying Examination (MCCQE), we also selected station content to represent the major specialties tested on the MCCQE. Specific topics were chosen from these specialties such that the five stations combined would provide a mix of history-taking, physical examination, counselling, and management stations. Stations for the 2017 MOSCE were as follows: (1) Paediatric cystic fibrosis history; (2) General Surgery cholecystitis management; (3) Family Medicine lower back pain physical exam; (4) Psychiatry depression counselling; and (5) Internal/Emergency Medicine ST elevation myocardial infarction (STEMI) management.

Scoresheets for each station began with a stem containing a description of the clinical scenario and instructions for the examinee. This was followed by a checklist and then a global rating scale (GRS). The GRS used a six-point scale ranging from 1 = inferior to 6 = excellent. Checklists were each conceptualized and written by individual Year II and IV medical students (first to fourth authors), modified by a Year IV medical student (first author), and finally further reviewed and approved by a faculty advisor (last author), an experienced OSCE chief examiner for undergraduate medical education. The review process took seven hours of faculty time.

Figure 1 indicates the organization of the MOSCE. Each station was 10 min in length, broken down as follows: 1 min for reading the scenario, 7 min for completing the encounter, and 2 min for receiving feedback from the examiner. Thus, each iteration lasted 50 min. The entire event was coordinated by students without the need for further administrative support.

A few weeks in advance of the MOSCE, SPs and examiners were assigned to specific stations and were sent their cases with detailed instructions. Examinees received information orienting them to the MOSCE schedule and the breadth of specialties and station types they may encounter, but the specific diagnoses of the stations were not divulged. Each group of students received training/briefing on the night of the MOSCE prior to participating in the event. Examiners attended a 30-minute didactic session given by a faculty advisor (last author) that focused on techniques for assessing examinees and giving verbal and written feedback. Examinees were briefly reminded of the station timings and instructed on how to navigate their MOSCE circuits. SPs were asked to only reveal history items or physical examination findings if specifically elicited by the examinee. Examiners and SPs were offered the opportunity to ask questions specific to their stations. After finishing their respective MOSCE activities, each group of study participants had the opportunity to complete surveys regarding their experience as examiners, examinees, and SPs. At this time, examinees received a copy of their station scoresheets as written feedback. All research materials were handled by hired research assistants such that the authors were unable to link research materials to individual study participants.

Measures

Questionnaires comprised 11 close-ended questions for examiners, 12 for examinees, and 9 for SPs. Close-ended questions consisted of statements with response options arranged in a five-point Likert scale ranging from ‘Strongly Disagree’ (1) to ‘Strongly Agree’ (5). In the examinee questionnaire, close-ended questions were adapted from Moineau and colleagues [Citation12] (see Table 1). For examiner and SP questionnaires, close-ended questions were based on questionnaires designed by Burgess and colleagues [Citation13] (see Tables 2 and 3, respectively). In addition, each questionnaire contained three open-ended questions that elicited perceptions on the involvement of all 4 years of medical students, ways to improve the MOSCE, and the MOSCE in general.

Table 1. Student examinee perceptions.

Table 2. Student examiner perceptions.

Table 3. Student standardized patient perceptions.

Examinees were asked to provide a self-assessment score by rating their performance on a six-point Likert scale ranging from ‘Inferior’ (1) to ‘Excellent’ (6), corresponding to the examiner GRS. Completed checklist items were summed and expressed as a percentage of total checklist items, providing an objective indicator of performance to be compared with the GRS score awarded by the examiner. However, these data will not be discussed in this report.

Data analysis

Mean scores and percentages of responses to close-ended questions on questionnaires were calculated. The responses to open-ended questionnaire items were coded and analysed thematically by two authors, and any discrepant items were discussed and resolved by consensus with two other authors. Cronbach’s α was calculated to assess the degree of internal consistency within the construct of the student-developed MOSCE. The α for the MOSCE as a whole was calculated using the mean proportion checklist completion scores of each of the five stations. Statistical analyses were conducted using SPSS version 24.
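The analysis itself was run in SPSS; purely to illustrate the calculation, a minimal Python sketch could treat the five stations as items and each examinee’s proportion of completed checklist items as the station score. The cronbach_alpha helper and the simulated demo_scores below are illustrative assumptions, not the study’s data or analysis code:

```python
import numpy as np

def cronbach_alpha(station_scores: np.ndarray) -> float:
    """Cronbach's alpha for an examinee-by-station score matrix.

    station_scores: rows = examinees, columns = stations (items);
    each value is the proportion of checklist items completed at that station.
    """
    k = station_scores.shape[1]                               # number of stations (items)
    item_variances = station_scores.var(axis=0, ddof=1)       # per-station variance across examinees
    total_variance = station_scores.sum(axis=1).var(ddof=1)   # variance of examinees' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 50 examinees x 5 stations, proportions in [0, 1]
rng = np.random.default_rng(42)
demo_scores = rng.uniform(0.4, 0.95, size=(50, 5))
print(f"Cronbach's alpha: {cronbach_alpha(demo_scores):.3f}")
```

An alpha near zero would indicate that performance on one station says little about performance on the others, whereas values approaching 1 indicate that the stations rank examinees consistently.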

Results

Participants

A total of 50 examinees participated in the MOSCE. Of those who participated, 47 (94%) consented to be a part of this study and 42 (84%) submitted completed questionnaires. Twenty-four examiners participated in the MOSCE and all consented to take part in the research component, with 22 (92%) submitting completed questionnaires. Further, there were a total of 60 SPs, 53 (88%) of whom agreed to take part in this study, and 51 (85%) of whom completed questionnaires.

Reliability analysis

The internal consistency of the five-station MOSCE as a whole was calculated as a Cronbach’s alpha of 0.443, based on the mean proportion checklist completion scores of each station. This value indicates low reliability [Citation14].

Responses to close-ended questions

Examinees

Most examinees found the MOSCE to be a positive experience and learning environment, as shown in Table 1. They felt comfortable being evaluated by their peers for formative purposes (M = 4.24, SD = 1.15) but did not feel this would be acceptable for summative purposes (M = 2.54, SD = 1.41). Examinees perceived that their examiners provided constructive (M = 4.57, SD = 0.68) and appropriate feedback (M = 4.70, SD = 0.58). Many did not prepare for the MOSCE (M = 1.70, SD = 1.12) but still felt there was value in peer-evaluated OSCEs (M = 4.46, SD = 0.68). Most students disagreed that there was tension between themselves and their examiners that would affect their evaluation (M = 1.59, SD = 0.92).

Examiners

Mean scores (M), standard deviations (SD), and maximum and minimum values for examiner responses to each questionnaire item are summarized in Table 2. Examiners agreed that participating in the MOSCE allowed them to apply (M = 4.68, SD = 0.47) and build (M = 4.09, SD = 0.73) on prior knowledge. However, they only moderately agreed that it challenged their prior knowledge (M = 3.27, SD = 0.96) and developed their clinical skills (M = 3.77, SD = 1.00). Examiners perceived the MOSCE as a helpful learning activity for future examinations (M = 4.55, SD = 0.58). They also felt adequately prepared (M = 4.68, SD = 0.55) and sufficiently confident to assess student performance (M = 4.50, SD = 0.66) and provide feedback (M = 4.59, SD = 0.65).

Standardized patients

The first- and second-year medical students who served as SPs also perceived several benefits from participating in the MOSCE (Table 3). SPs agreed that the MOSCE allowed them to apply (M = 4.27, SD = 0.88) and build upon their prior knowledge (M = 4.55, SD = 0.60). They perceived the MOSCE to be helpful in the development of their clinical skills (M = 4.33, SD = 0.83) and in preparing them for future examinations (M = 4.63, SD = 0.65). They also felt adequately trained in their role as SPs (M = 4.47, SD = 0.64).

Responses to open-ended questions

Overall, the MOSCE was very well received by the student body. The feedback collected following the event indicated benefits for all the groups of participants involved: fourth-year examiners, third-year examinees, and first- and second-year SPs. These benefits could be grouped into several general categories: teaching, learning, and collegiality. Feedback also highlighted some concerns regarding OSCE characteristics, training, and awkward interactions (Table 4).

Table 4. Summary of major themes that emerged from student comments with illustrative quotes.

Teaching

While all examiners surveyed provided positive remarks (e.g., ‘great experience,’ ‘It was a pleasure to participate’), three examiners specifically commented on the valuable teaching opportunity they had while participating in the MOSCE. Four examiners commented on how being an examiner helped them reinforce concepts and reflect on how much they had developed.

Learning

Notably, one examiner commented on how the students involved ‘all learned something appropriate to their level.’ This was echoed by 12 SPs who specified how they enjoyed learning more about OSCEs from interacting with their third- and fourth-year colleagues. Most importantly, examinees appreciated being exposed to novel station types (i.e., management stations) in a low-stakes setting, saying that they’d rather ‘struggle with it now than on an OSCE for marks.’ They also appreciated the feedback that their fourth-year colleagues provided them. The immediate return of scoresheets after completion of all five stations was well received.

Collegiality

Examiners, examinees, and SPs appreciated the opportunity to network and collaborate with students in all 4 years of the programme. Examiners and SPs valued the collegiality they experienced when ‘all years come together to help each other.’ As one examiner stated, ‘It was a pleasure to be involved in something that benefits the entire Faculty of Medicine.’

OSCE characteristics

General feedback regarding OSCE characteristics consisted of examinees and examiners commenting that they would have liked more time for feedback. Examinees also requested more stations in the MOSCE for practice.

Training

An examiner recommended improved training on the scoring system, specifically suggesting ‘more teaching on [the] global assessment for borderline satisfactory vs. borderline unsatisfactory.’ Examinees and examiners noted that some examinees were unsure of how to approach management stations and/or how to use the standardized nurse present in one of these stations. Another suggestion, made by three SPs and two examiners, was to provide better training to SPs prior to the MOSCE, given that some of these students seemed unfamiliar with how to act out certain physical findings and how much information to reveal during history-taking stations.

Awkward interactions

Whereas examinees generally disagreed that there were tensions between them and the student examiners, two examinees reported experiencing an awkward interaction when their examiner was someone they knew personally (‘it was awkward when I knew one of the examiners personally’).

Discussion

The objective of our study was to assess the feasibility and acceptability of our student-initiated peer-assisted MOSCE. Our quantitative and qualitative data suggest that both examinees and non-examinees perceived several benefits from participating in the MOSCE. First- and second-year students observed examinees while reflecting on their own clinical skills, whereas fourth-year students gained important experience in a teaching role before their transition into residency.

All three groups of students noted how the MOSCE was a unique evening for networking and collaboration between students from every year in the undergraduate medical programme. First- and second-year students appreciated networking with fourth-year students, who are nearing the end of the program and have many insights to share. We believe that the strong collegiality observed was a major contributor to the acceptability of the MOSCE.

From a financial perspective, the MOSCE proved to be more cost-effective than a traditional faculty-led OSCE, the latter costing around $96,000 CAD ($75,000 USD) for a class size of 166 students at our institution, or $578 CAD ($450 USD) per examinee [Personal Communication, Ottawa Examination Centre]. In contrast, the costs of organizing and implementing this MOSCE, with the capacity to accommodate 60 examinees, totalled roughly $600 CAD ($470 USD), or $10 CAD ($8 USD) per examinee, the vast majority of which was spent on providing dinner to participants. As resource stewardship becomes increasingly critical to healthcare and healthcare education, our MOSCE serves as an example of a creative strategy to meet a learning need in a cost-effective way.

While we recognize the benefits of this MOSCE, it is also important to address the concerns raised by students regarding training, OSCE characteristics, and awkward interactions. In terms of training, there were comments pertaining to each of the three roles in the MOSCE. First, examiners expressed interest in learning how to better score borderline candidates. Although the MOSCE was completely formative in nature, enhanced examiner training would likely be beneficial in the future, both for examinees seeking quantitative and qualitative feedback and for examiners developing their skillset as teachers. Second, examinees reported being unaware of what a management station entailed. Exposure to this type of station is quite limited in Year I and II OSCEs; however, the Faculty of Medicine may be persuaded to support a future dedicated MOSCE devoted strictly to management scenarios. Third, examiners noted that some SPs were unfamiliar with how much history to divulge or how to demonstrate physical exam findings. This suggests that sending written instructions and providing a brief orientation were insufficient to prepare first- and second-year students to be SPs in the MOSCE; implementing more formalized training, similar to the 30-minute model used by Mavis, Ogle, Lovell, and Madden [Citation10], may improve the consistency of SP performance in future iterations.

With regard to OSCE characteristics, students in each of the three roles suggested that there was an inadequate amount of time for verbal feedback post-encounter, although the time allocated reflected that of the faculty-led OSCEs. There were also requests for a greater number of stations. These findings reflect the keen interest students had in getting more practice and receiving/giving more feedback in the context of this student-led initiative. It is also possible that fourth-year students were less efficient at giving verbal feedback than experienced faculty, and it may be worthwhile to train them to focus on key points to communicate to examinees in a limited timeframe. With regard to the MOSCE cases themselves, one examiner felt that the checklist for their station was lacking. While the student-designed checklists allow for the return of the scoresheets to examinees, they are likely of lower quality than cases that have been designed by experienced faculty, piloted, and subjected to rigorous statistical analyses. This reinforces that this student-initiated peer-OSCE construct is most appropriate for low-stakes, formative situations.

Finally, although most examinees disagreed that there was tension between themselves and their examiners, two students reported that being evaluated by near-peers with whom they had personal relationships made for an awkward interaction. Cushing and colleagues [Citation15] have also described mixed findings in this regard; in their OSCE, some examinees felt that having their classmates as peer assessors helped them relax, whereas others perceived increased pressure. In a study by Moineau and colleagues [Citation12], the authors administered a post-OSCE questionnaire to second-year examinees that was adapted for use in the present study; they found that most examinees did not perceive tension with their fourth-year student examiners, and furthermore there were no comments from examinees to suggest the contrary. It is possible that larger gaps in training level between examinees and examiners decrease the potential for awkward or tense interactions. Although we could entertain the possibility of recruiting residents instead of fourth-year students to act as examiners in our MOSCE, we believe that this would detract from the collegiality and unique learning opportunities inherent in this student-led initiative. All in all, the occasional awkward interactions inherent in this type of OSCE do not eclipse its benefits to students, but they should serve to reinforce previous recommendations limiting this construct to a low-stakes, formative setting [Citation9].

The Cronbach’s alpha calculated for the MOSCE as a whole was 0.443. An alpha greater than 0.7–0.8 is considered necessary for a formal OSCE held for the purpose of evaluation [Citation3]. To the best of our knowledge, no previous authors have reported measures of internal consistency for peer-assisted OSCEs. The purpose of calculating the alpha in the present study was to assess whether there was some degree of internal consistency in our construct, with the understanding that factors were present that would increase the variability in examinee performance from one station to the next (for example, each examinee was only halfway through a year’s worth of core clinical rotations, which they experienced in a different order than many of their peers). The small number of stations and the variety of competencies being assessed may also have contributed to the lower alpha [Citation16]. Since the goal of the MOSCE was not to evaluate or discriminate between high- and low-performing students, achieving a high alpha was less of a focus in the present study.

Our study is not without limitations. It should be noted that the sample of students who participated did so voluntarily; hence, their motivation to learn in this peer-assisted context may have contributed to the amount of positive feedback received. Furthermore, this group of students was from a single site, limiting generalizability. Finally, as data gathering was limited to the night of the MOSCE, it is unknown whether examinees still perceived the MOSCE as beneficial after undergoing the Year III faculty-led OSCE.

This study has brought to light some unanswered questions to be addressed with future research. Although this OSCE is one of several that have used students as examiners, it is not known whether this early exposure to the assessor role prepares students to be assessors in OSCE or non-OSCE settings during their residency. Furthermore, it would be interesting to evaluate the efficacy of this MOSCE in improving the performance of both examinees and SPs in their future summative faculty-led OSCEs.

Conclusions

This student-initiated MOSCE was very well accepted by fourth-year examiners, third-year examinees, and first- and second-year SPs, and served as a unique learning and mentorship opportunity for students in all 4 years of the MD program at our institution. Its cost-effectiveness ensures that it will continue to represent a feasible means of preparing third-year students for higher-stakes clerkship OSCEs. Comments regarding the learning benefits perceived by SPs and examiners highlight the potential utility of these roles as learning opportunities within an OSCE. Student participation in future iterations of this MOSCE could be correlated with performance in faculty-led OSCEs to better determine its efficacy as a preparatory activity.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Ottawa Blood Diseases Centre [Grant number: OHCO 973 107 000].

References

  • Sloan DA, Donnelly MB, Schwartz RW, et al. The objective structured clinical examination. The new gold standard for evaluating postgraduate clinical performance. Ann Surg. 1995;222(6):735–742.
  • Harden RM, Stevenson M, Downie WW, et al. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447–451.
  • Harden RM, Lilley P, Patrício M. The Definitive Guide to the OSCE. Edinburgh: Elsevier; 2016.
  • Walters K, Osborn D, Raven P. The development, validity and reliability of a multimodality objective structured clinical examination in psychiatry. Med Educ. 2005;39(3):292–298.
  • Grand’Maison P, Blouin D, Briere D. Utilization of the objective structured clinical examination (OSCE) in gynecology/obstetrics. In: Proceedings of the Annual Conference on Research in Medical Education. 1985. p. 65–71.
  • Reznick RK, Blackmore D, Cohen R, et al. An objective structured clinical examination for the licentiate of the Medical Council of Canada: from research to reality. Acad Med. 1993;68(10 Suppl):S4–6.
  • Allen R, Heard J, Savidge M, et al. Surveying students’ attitudes during the OSCE. Adv Health Sci Educ. 1998;3:197–206.
  • Escovitz ES. Using senior students as clinical skills teaching assistants. Acad Med. 1990;65(12):733–734.
  • Khan R, Payne MWC, Chahine S. Peer assessment in the objective structured clinical examination: a scoping review. Med Teach. 2017;39(7):745–756.
  • Mavis BE, Ogle KS, Lovell KL, et al. Medical students as standardized patients to assess interviewing skills for pain evaluation. Med Educ. 2002;36(2):135–140.
  • Mavis B, Turner J, Lovell K, et al. Faculty, students, and actors as standardized patients: expanding opportunities for performance assessment. Teach Learn Med. 2006;18(2):130–136.
  • Moineau G, Power B, Pion A-MJ, et al. Comparison of student examiner to faculty examiner scoring and feedback in an OSCE. Med Educ. 2011;45(2):183–191.
  • Burgess A, Clark T, Chapman R, et al. Medical student experience as simulated patients in the OSCE. Clin Teach. 2013;10(4):246–250.
  • Cohen L, Manion L, Morrison K. Research methods in education. 7th ed. Abingdon: Routledge; 2011.
  • Cushing A, Abbott S, Lothian D, et al. Peer feedback as an aid to learning–what do we want? Feedback. When do we want it? Now! Med Teach. 2011;33(2):e105–12.
  • Tavakol M, Dennick R. Making sense of Cronbach’s alpha. Int J Med Educ. 2011;2:53–55.