
Incorporating multi-source feedback into a new clinically based revision course for the FRCS(Plast) exam

Pages e263-e266 | Published online: 25 Apr 2011

Abstract

Background: Exit exams for completion of surgical training are demanding and have relatively low pass rates, with many candidates requiring multiple attempts.

Aim: To establish a new, clinically based exam preparation course, utilising multi-source feedback, to identify candidates at risk of failure and improve pass rates.

Methods: We describe the process of establishing a new, unique, clinically based exam preparation course incorporating multi-source feedback from examiners, patients, nurses and other trainees. We present the course results as well as the exam results for each candidate and analyse the results of the multi-source feedback.

Results: Nine candidates have so far successfully completed both the preparation course and the FRCS(Plast) exam. Success in the exam preparation course accurately predicts success in the FRCS(Plast) exam. Nursing staff and patients tend to give higher scores than examiners and trainees. The majority of marginal failures went on to pass the exam, indicating that candidates can successfully address weaknesses identified during the course.

Conclusion: A clinically based exam preparation course utilising multi-source feedback allows identification of candidates at risk of failing a surgical training exit exam and allows targeted training in order to maximise pass rates.

Introduction

Exit exams at the completion of higher surgical training are a recent intercollegiate innovation, and the relatively low pass rates attest to how demanding and challenging these assessments have become. The FRCS(Plast) exam, taken at the end of higher surgical training in plastic surgery, had overall pass rates of just 52% and 67% in September 2008 and March 2009, respectively (Intercollegiate Specialty Board and Joint Committee on Intercollegiate Examinations 2006). The proportion of successful candidates is similar in other subspecialties and is even lower for first-time applicants. The reasons for this are multi-factorial, but the mistaken assumption that the exit exam is a test of knowledge, rather than one of maturity, decision making and communication, is an important factor. Previous vagueness about the level expected has recently been addressed by a statement that the required standard is that of a first day consultant.

There are various revision courses available to help candidates prepare for the clinical part of the FRCS(Plast) exam. Most of these consist mainly of mock viva style exams with limited clinical cases. The senior author felt that a new course was needed to emphasise and reflect the standards required of a first day consultant and to recreate the format of the actual exam. The aim was that this format should be experienced repeatedly, so that the final exam would be less daunting, and that the course should benefit all trainees within the region, not only those sitting the exam. Candidates should also be provided with immediate verbal and multi-source feedback from examiners, their peer group, nurses and patients themselves.

We describe the process of establishing a new clinically based exam preparation course designed to help trainees prepare for Part 2 of the FRCS(Plast) exam. The feedback from the course participants to date has been very positive, and we believe the format may be successfully used in other specialties.

Methods

The course was designed to reflect the actual conditions of the exam as closely as possible. Every candidate completed four ‘stations’, each lasting for 30 min. One station consisted of viva style structured interviews based around photographs of various clinical conditions. One station consisted of long cases and the final two stations were each made up of five short cases. Each candidate rotated through the same stations with the same examiners in order to maximise reliability of scoring.

Actual patients were used and were chosen to represent a wide range of conditions from the plastic surgery curriculum and to accurately reflect the clinical practice of a first day consultant. Each station was examined by two experienced consultants. All candidates were shadowed by two to three other trainees at all times and by a nurse at the clinical stations.

The timings allowed for each station mirrored those of the exam and were strictly adhered to. Being shadowed by trainees from within the candidate's peer group added to the pressure, in a deliberate attempt to recreate some of the stress and anxiety which would be experienced during the exam. The shadowing trainees could also experience and appreciate some of this pressure.

The candidates were given a score for each station by the examining consultants and the other trainees according to the marking schedule published by the Joint Committee on Intercollegiate Examinations (JCIE), summarised in Table 1. The consultants and trainees were asked to give a score for the domains of professional capability, knowledge and judgement, communication and responses, and bedside manner, compared to the level of a first day consultant. This was used to produce an average score for each patient/scenario, and a final overall score for each station. Each patient and their accompanying nurse also gave the candidate a score based on a five-point Likert scale within four domains covering confidence, appearance, bedside manner and overall performance. Again, it was emphasised that the standard required was that of a first day consultant. In addition, each group was encouraged to add comments as appropriate on the performance of the candidate to allow more detailed feedback.

Table 1.  Summary of JCIE marking descriptors.
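
To make the arithmetic explicit, a minimal sketch of the described aggregation is shown below. It is illustrative only: the course marked candidates on paper against the JCIE schedule, so this is a hypothetical reconstruction, and the function names, data layout and domain labels are assumptions rather than a description of any actual scoring software.

```python
from statistics import mean

# Hypothetical domain labels taken from the description above; the
# wording on the actual JCIE marking sheets may differ.
DOMAINS = ("professional capability", "knowledge and judgement",
           "communication and responses", "bedside manner")

def scenario_score(domain_scores):
    """Average the domain scores awarded for one patient/scenario."""
    return mean(domain_scores[d] for d in DOMAINS)

def station_score(scenarios):
    """Average the per-scenario scores to give one score for the station."""
    return mean(scenario_score(s) for s in scenarios)

def overall_result(station_scores, pass_mark=6.0):
    """An overall average of 6 or more across stations is a pass,
    matching the exam pass mark adopted for the course (see Results)."""
    return "pass" if mean(station_scores) >= pass_mark else "fail"

# Example: one short case scored 6, 7, 6 and 6 across the four domains.
case = dict(zip(DOMAINS, (6, 7, 6, 6)))
print(scenario_score(case))                    # 6.25
print(overall_result([6.25, 6.0, 5.5, 6.5]))   # pass (mean 6.0625)
```

For example, a candidate whose four station scores averaged exactly 6.0 would pass under this scheme, while an average of 5.9 would not.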

Each candidate was given initial feedback from the examining consultants immediately after each station, followed by a more detailed breakdown of the scores from each group (consultants, trainees, patients and nurses) on an individual basis after the course.

The one-day course consisted of the mock exam in the morning followed by a feedback session in the afternoon. The afternoon session was used to discuss topics from the mock exam in detail and to provide individual feedback on answers from each candidate in a supportive and non-confrontational environment. Both consultants and trainees were involved in the afternoon feedback session, thus continuing the theme of multi-source feedback. There were also lectures from the consultant examiners on key exam topics and on exam technique and preparation.

Results

Scores were obtained from consultants, trainees, patients and nurses for each candidate and used to calculate an average score for every station. An overall average score of 6 or more is required to pass the exam, and this score was also used as the pass mark for this course.

Trainees, nurses and patients all consistently gave higher scores than the consultant examiners. Trainees' scores most closely reflected those of the consultants, showing a greater appreciation of the level of knowledge required for the exam than either nurses or patients. On average, trainees gave a score 0.4 points higher than the consultants, whilst nurses and patients gave scores 0.8 and 1.0 points higher, respectively (Table 2).

Table 2.  Differences in scores between different groups.
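
The differences in Table 2 are, in effect, mean offsets from the consultant baseline. A short sketch of how such offsets could be derived from raw score records follows; the tuple layout and group labels are assumptions for illustration, not the method actually used to compile the table.

```python
from collections import defaultdict
from statistics import mean

def group_offsets(records):
    """records: iterable of (candidate, station, group, score) tuples,
    where group is 'consultant', 'trainee', 'nurse' or 'patient'.
    Returns each non-consultant group's mean difference from the
    consultant score for the same candidate and station."""
    # Mean score per (candidate, station, group), since two consultants
    # (and several trainees) scored each station.
    raw = defaultdict(list)
    for candidate, station, group, score in records:
        raw[(candidate, station, group)].append(score)
    means = {key: mean(vals) for key, vals in raw.items()}

    # Offset of each other group from the consultant baseline.
    diffs = defaultdict(list)
    for (candidate, station, group), m in means.items():
        if group == "consultant":
            continue
        baseline = means.get((candidate, station, "consultant"))
        if baseline is not None:
            diffs[group].append(m - baseline)
    return {group: round(mean(d), 1) for group, d in diffs.items()}
```

On the figures reported above, such a calculation would return offsets of approximately 0.4 for trainees, 0.8 for nurses and 1.0 for patients.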

For the purposes of predicting exam success, only the scores from the consultants were used, as it was felt that these would most closely represent actual exam performance. However, the scores from trainees, patients and nurses were also used to provide more detailed and individualised feedback to the candidates, in a manner similar to 360° feedback.

The course has run for 2 years and so far 10 candidates and over 30 trainees have experienced the format. Nine candidates have taken the FRCS(Plast) exam at the next available sitting. The course accurately predicted actual exam results in six of the nine candidates. The remaining three candidates passed the exam despite scoring less than 6 on the exam preparation course; this may be due to the feedback from the course, which allowed intensive and focused revision in certain areas before the exam. Table 3 shows predicted and actual exam results for each candidate.

Table 3.  Actual and predicted exam results of candidates.

Participants were asked to provide feedback on the exam preparation course. All trainees and candidates agreed that the course had left them better prepared for the FRCS(Plast) exam; 14 strongly agreed with this statement and eight agreed.

Discussion

The exit exam at the end of higher surgical training requires significant commitment from candidates in terms of both time and money, and trainees should be encouraged to take the exam only when they are adequately prepared, to maximise the chance of first-time success. The important difference between this exam and all previous exams encountered by the candidate is that the exit exam assesses readiness for independent practice. The emphasis is on the mature approach expected of a future colleague, not merely a good trainee.

Several authors have previously attempted to identify students at risk of failing undergraduate or postgraduate exams, but no method has been shown to be completely reliable (McManus et al. 2003; Bessant et al. 2006; Dewhurst et al. 2007; Bowhay & Watmough 2009). White et al. (2009) showed that students who fail an exam can improve their success at the next attempt with a combination of review, reflection and self-assessment. Regular mock exams have been shown to improve pass rates in American Board Certifying exams, and participants have reported improvements in clinical reasoning and self-study after such a programme (Guzman et al. 2008).

This course aims not only to prepare candidates for the FRCS(Plast) exam but also to identify those at risk of failing by highlighting areas of weakness through individualised multi-source feedback. The course is timed 4 weeks before the actual exam so that candidates have time to go through the process of review and reflection before the exam, hopefully improving the chance of first-time success.

The revision course uses well-established assessment methods such as structured oral and clinical examinations and emphasises assessment of candidates at the ‘shows how’ level of Miller's pyramid (Miller 1990). Although such methods are recognised to have potential drawbacks in terms of validity and reliability, they accurately mirror the format of the exit exam and allow assessment of relatively large numbers of candidates using the resources available to a regional training programme.

Multi-source feedback is now an established method of assessment and feedback in the medical education of junior doctors and has been shown to be valid within the setting of a national assessment programme (Archer et al. 2008). Use of such feedback is important for reflective practice and development and has been shown to initiate change in a group of surgeons (Violato et al. 2003). Uniquely, this course used scores from peers and senior medical staff as well as nurses and patients to provide the candidate with detailed feedback. Multi-source feedback has not been described before in this format in the setting of a postgraduate clinical exam. We found that the scores given by nurses and patients were consistently higher than those given by consultants and trainees. This may reflect trainees' greater awareness of the standard of the exam compared with nurses and patients. Nursing staff gave more importance to practical tasks such as hand hygiene and introductions, whereas patients placed more emphasis on the confidence and appearance of the candidates. Previous studies have shown that rating behaviours vary by staff group, and this should be taken into account when summarising scores (Bullock et al. 2009).

We believe that this exam preparation course has the potential to improve pass rates for the FRCS(Plast) exam for several reasons. The emphasis of the course is to set standards at the level of a first day consultant and to change the mindset from that of a good trainee. Candidates who participate in the course experience an intensive and realistic mock exam with detailed and individualised feedback. The format of the course is designed to reflect that of the actual exam, and the marking descriptors used are identical. The course is entirely clinically based and uses real patients rather than actors for all clinical stations. Having trainees as ‘observers’ is, we believe, an excellent and innovative way to give exam experience to many more trainees than would otherwise be possible with the limited time and examiners available.

Those candidates who are identified as being at risk of failing the exam have adequate time before the actual exam for a period of reflection and targeted revision in order to reach the required standards. Finally, trainees who shadow the candidates have the opportunity to commence exam preparation at an earlier stage so that by the time they are ready to sit the exam themselves, the format and standard of the exam and type of cases which will be seen are familiar to them.

The course is now in its third year and is an established part of the training programme within our region. The feedback from the course has so far been very positive, but the number of candidates who have completed it is still relatively small, and it is therefore not possible to generalise from the results we have reported. The study is ongoing, and we plan to collect more meaningful data, which will be reported at a later date. We are interested to see whether candidates who are not successful in their first exam attempt are able to improve their scores in subsequent courses following focused multi-source feedback in addition to traditional revision methods. The aim of reporting the raw data gathered so far is to describe what we believe to be a unique method of teaching and feedback which may be adopted within other specialties.

Conclusion

We describe the launch of a new, clinically based exam preparation course for the FRCS(Plast) exam. We believe that the format of the course as well as the use of multi-source feedback in conjunction with standard exam scoring techniques is unique and may be successfully used in other specialties. Feedback from candidates and trainees is very positive. Success in the exam preparation course accurately predicts success in the FRCS(Plast) exam. Encouragingly, the majority of the marginal failures at the course went on to pass the exam at the first attempt, indicating that candidates can successfully address weaknesses highlighted on the course.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.

References

  • Archer J, Norcini J, Southgate L, Heard S, Davies H. Mini-PAT (Peer Assessment Tool): A valid component of a national assessment programme in the UK? Adv Health Sci Educ Theory Pract 2008;13(2):181–192.
  • Bessant R, Bessant D, Chesser A, Coakley G. Analysis of predictors of success in the MRCP (UK) PACES examination in candidates attending a revision course. Postgrad Med J 2006;82(964):145–149.
  • Bowhay AR, Watmough SD. An evaluation of performance in the UK Royal College of Anaesthetists primary examination by UK medical school and gender. BMC Med Educ 2009;9:38.
  • Bullock AD, Hassell A, Markham WA, Wall DW, Whitehouse AB. How ratings vary by staff group in multi-source feedback of junior doctors. Med Educ 2009;43(6):516–520.
  • Dewhurst NG, McManus C, Mollon J, Dacre JE, Vale AJ. Performance in the MRCP(UK) examination in 2003–4: Analysis of pass rates of UK graduates in relation to self-declared ethnicity and gender. BMC Med 2007;5:8.
  • Guzman E, Babakhani A, Maker VK. Improving outcomes on the ABS Certifying Examination: Can monthly mock orals do it? J Surg Educ 2008;65(6):441–444.
  • Intercollegiate Specialty Board and Joint Committee on Intercollegiate Examinations. 2006. Section 2 summary statistics [Internet]. [Accessed 22 March 2010]. Available from: http://www.intercollegiate.org.uk/Content/content.aspx?ID=29
  • McManus IC, Smithers E, Partridge P, Keeling A, Fleming PR. A levels and intelligence as predictors of medical careers in UK doctors: 20 year prospective study. Br Med J 2003;327(7407):139–142.
  • Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(Suppl 9):S63–S67.
  • Violato C, Lockyer J, Fidler H. Multisource feedback: A method of assessing surgical practice. Br Med J 2003;326(7388):546–548.
  • White CB, Ross PT, Gruppen LD. Remediating students’ failed OSCE performances at one school: The effects of self-assessment, reflection and feedback. Acad Med 2009;84(5):651–654.
