
Workplace-based assessment as an educational tool: Guide supplement 31.3 – Viewpoint

Pages e369-e372 | Published online: 26 Aug 2010

Introduction

Workplace-based assessments (WBAs) are increasingly being used during postgraduate medical training as a method of assessing competence. Southgate (1999) defined competence in a doctor as being ‘composed of cognitive, interpersonal skills, moral and personality attributes. It is in part the ability, in part the will, to consistently select and perform relevant clinical tasks in the context of the social environment in order to resolve health problems of individuals in an efficient, effective, economic and humane manner’. There was previously a concern that trainees were infrequently observed, assessed and given feedback during their workplace-based education. This has led to increasing interest in a variety of formative assessment methods that require observation and offer the opportunity for feedback (Norcini & Burch 2007). The methods include direct observation of procedural skills (DOPS), the mini-clinical evaluation exercise (mini-CEX) and case-based discussion (CBD). These methods are used in ophthalmology, for example, where the curriculum for ophthalmic specialty training consists of 180 defined learning outcomes in 13 domains of clinical practice, each of which can be mapped to the General Medical Council's (GMC) description of good medical practice.

Aims and objectives of WBAs

Traditionally, training was defined in terms of time spent in training and in different clinical posts or attachments. It was assumed that learning occurred naturally as part of routine clinical work. There was no organised educational programme with clear objectives; the involvement of senior doctors was unstructured and haphazard, and minimal attention was paid to the educational needs of the trainee (Holm 2002). There has been a move in recent years towards competency-based medical training because patients expect doctors to diagnose, plan management, carry out practical procedures and behave in a reasonable way, demonstrating a caring and humanistic attitude (Carr 2004). WBAs have been introduced in recent years to demonstrate this. The main aims of WBAs are to aid learning through objective feedback and to provide evidence that the competencies required to progress to the next level of training have been achieved (Beard 2008). The recent change in the working patterns of doctors in training has meant that traditional systems of education are under increasing pressure and that there is a need to maximise new opportunities for learning (Carr 2006). Observing trainees during their daily practice is a time-efficient way of determining their level of competence. Because great importance is given to feedback, trainees are encouraged to reflect upon the learning experience and can therefore improve their skills. Strengths, developmental needs and action plans are formulated to aid this learning. This increasing emphasis on competency-based training is not unique to the UK; in the USA and other countries around the world, there is pressure to increase accountability and to formalise the maintenance of standards, as well as to set standards for entry into practice (Southgate 1999). There is also a shift in role definition and job responsibility, with health professionals other than doctors marking the assessments. This is particularly the case within ophthalmology, where an optometrist may be better qualified to assess a trainee performing retinoscopy, one of the skills required to progress to ST4, or an orthoptist may be best placed to assess the performance of visual fields or a cover test.

How WBAs are applied

For ST1 in ophthalmology, there are over 50 assessments to be completed. Trainees and trainers alike often lament that the forms take too much time to fill in. One study showed that the mean time taken to complete the mini-CEX (including feedback) was 25 min; the DOPS required the duration of the procedure being assessed plus an additional third of this time for feedback; and the mean time required for each rater to complete a multi-source feedback (MSF) form was 6 min (Wilkinson et al. 2008). With good preparation and organisation, it is possible to complete these forms, and the benefits gained should outweigh the time taken to fill them in. The reduction in surgical experience means that more training will need to be undertaken on simulators, although experience and assessment in the operating room must remain the ‘gold standard’. Simulation training will require the provision of properly resourced surgical skills facilities in every hospital. The key to reliable assessment and constructive feedback is well-trained trainers. Training is a skill that must be learned, and assessment and feedback techniques form part of this. It is often assumed that all consultants are trainers, but this is clearly not the case.

How the results of WBAs are interpreted and used

The assessment of knowledge as defined in stages 1 and 2 of Miller's (1990) pyramid is generally done using traditional assessment tools, including written and oral tests. WBAs attempt to give a more accurate assessment of a doctor's competence because the doctor must ‘show how’ rather than merely ‘know’, thereby climbing up Miller's pyramid.

The mini-CEX entails direct observation by an educational supervisor of a trainee's performance in a real clinical situation (15–20 min) and is designed to assess skills such as history taking, clinical examination, communication, diagnosis and clinical management. The assessment is repeated on multiple occasions and can occur in various clinical settings, such as clinics or ward rounds. The method has been shown to be reliable, to have construct validity, and to be a good method of education as well as an assessment tool. The mini-CEX has also been evaluated in the assessment of clinical skills in medical students in the USA (Kogan et al. 2002). One study by Holmboe et al. (2004a) showed that the mini-CEX is a potentially powerful tool for providing high-quality, interactive feedback that could contribute to improvement in trainees’ clinical skills. Direct observation of clinical skills is a critical first step in helping trainees to improve those skills, and the mini-CEX provides a reliable, structured format for performing direct observation (Wragg et al. 2003). However, the evaluation generated by the mini-CEX must lead to meaningful, useful feedback if it is to promote growth in the trainee's clinical skills. To be most effective, feedback needs to be interactive so that trainees can embrace and take ownership of their strengths and weaknesses. One study (Holmboe et al. 2004b) showed that self-assessment occurred infrequently and that, despite the substantial number of recommendations provided, the explicit development of an action plan was rare. The lack of an action plan is particularly unfortunate because it suggests that many faculty may not be ‘closing the loop’ to ensure that the deficiencies noted are addressed by the intern.

Historically, competence in practical procedures has been assessed using logbooks and the opinions of educational supervisors. The Royal College of Physicians developed the DOPS tool and reports that directly observed performance is likely to be more valid and reliable than the previous logbook-based system (Wilkinson et al. 2003; Davies et al. 2005).

CBD focuses on the evaluation of clinical reasoning by reviewing a case and the trainee's entries in the patient's case notes. This assessment tool was developed on the basis of the GMC's performance procedures, and its use has previously been described in primary care (Southgate et al. 2001).

MSF uses questionnaire data from eight colleagues, in both medical and non-medical roles, to assess aspects of performance. MSF has been used mainly in industry and business (Atwater et al. 2002) to assess performance and as a means of providing feedback. The mini-peer assessment tool (mini-PAT) is an MSF tool that collates the views of a range of clinical colleagues and compares them with the trainee's self-assessment of performance. The ratings and free-text comments from the eight assessors are then fed back to the trainee by the educational supervisor.

Together, the assessments provide evidence that the trainee has gained the competencies defined by the Royal College of Ophthalmologists in its curriculum. The learning outcomes parallel those identified by the Accreditation Council for Graduate Medical Education (ACGME) in the USA, which identified six learning outcomes for postgraduate medical education: patient care, medical knowledge, interpersonal and communication skills, professionalism, practice-based learning and improvement, and systems-based practice. These outcomes have been related to training in emergency medicine (Harden 2006), where the knowledge, skills and attitudes that the trainee is expected to attain in each of the six competencies are described in more detail.

Criticisms of this method

For WBAs to be useful, they need to be more than a box-ticking exercise; they must also have educational value (James et al. 2009). The most useful part of the assessment lies in the narrative fields, where feedback can be given, strengths and weaknesses discussed and an action plan devised. Interactive and immediate feedback is important in helping doctors improve and develop professionally (Johnson et al. 2008).

One of the difficulties with WBAs from a trainer's perspective is establishing the gold standard against which the trainee should be assessed. Training courses have been devised to help address this problem.

The European Working Time Directive has had a direct effect upon the training of junior doctors. Because trainees now spend less time on call, some experienced doctors believe that they are often unable to perform procedures that doctors at the same stage previously dealt with independently. Hence, it is very difficult to know what is ‘expected’ for their level. A study by Elbadrawy et al. (2008) showed a significant decrease in the average number of surgical procedures performed by trainees between 1995 and 2005. It is also easy for assessors to compare trainees against their own experiences and perceived goals and targets, thereby raising their expectations.

Another difficulty in completing the assessments with trainees is the wide variation of trainees present within any hospital department, from doctors training specifically in that specialty to those doing 4 or 6 months as part of general practice training. These trainees have a wide spectrum of skills, knowledge and motivation. There is much emphasis on gaining generic skills and core competencies, so this may be the perceived minimum standard. However, it would be unfair to have different expectations of trainees doing the same job simply because they have made different specialty choices, so the standard should be the same for all.

Trainers are expected to mark trainees against the standard they should reach by the end of the year, something trainers are not used to doing. Receiving lower marks at the start of a placement can disappoint trainees, and this may result in a lack of motivation. Different trainees will be at different stages throughout the year: some will progress very quickly, whilst others will take much longer. What they should be able to do when they have completed the year should be the same, so that they are able to progress to the next stage of training.

The trainee determines the timing and the cases to be assessed. This means that they can choose simple cases, and they may not act as they normally would if they know they are being watched and assessed. Miller (1990) identified four stages of development, ‘knows, knows how, shows how and does’, as the cognitive and behavioural steps an individual progresses through from acquiring knowledge to performing a task in practice. Miller's triangle assumes that competence predicts performance, which may not always be the case. In clinical practice, many other factors may influence clinical performance, including time availability, tiredness, and the mood of the doctor and the patient. Knowing and showing do not mean that a doctor will perform in a certain way in real practice.

There is sometimes difficulty in obtaining sufficient appropriate DOPS. In ophthalmology, procedures that are rarely carried out, such as obtaining blood cultures and venepuncture, form part of the curriculum. Many trainees will have performed these procedures during their foundation training, whereas others have not. The doctors assessing these trainees may not have performed the procedures for many years, yet they are expected to provide advice on improving technique. This can lead to trainees repeating skills learnt many years ago simply to complete a form for their portfolio, or to many trainees competing to perform a particular procedure. The existing DOPS form also does not address the important aspects of postoperative care and instructions. DOPS should probably be conducted by more than one assessor on multiple occasions during the year in order to give a true reflection of the trainee's abilities. Indeed, the Royal College of Ophthalmologists specifies that two DOPS forms must be completed for each procedure.

Modifications and alternatives

Training is the key to the successful implementation of WBAs. Trainers need to be trained specifically in assessment techniques. This may consist of a presentation day at which the tools are introduced and a mock examination and CBD are carried out and assessed by participants. Mini-CEX and CBD scoring can also be practised, followed by a group feedback session to reach a consensus. One study by Holmboe et al. (2004a) showed that faculty who had undergone training in the direct observation of medical residents' competence felt significantly more comfortable performing direct observation than control-group faculty. Trainees would benefit immensely from knowing what specifics will be assessed in each form of assessment, and this could be covered in the induction session. Another way of implementing this would be during grand rounds or other study days. Training the assessors before introducing these tools, and introducing the tools at an early stage in medical school, may also improve the way these assessments are done and the benefits gained from them. Without formal training, grading tends to be influenced by comments and reflections from personal experience and by the opinions of peers. However, it may not be possible to remove all trace of subjectivity, or even bias, arising from previous experience with the same doctor on the ward or in clinic.

Equal attention needs to be given to the quality of both the observation and the feedback. Another study by Holmboe et al. (2004b) suggests that faculty development is one key approach, and it highlights two important points. First, although the majority of faculty (17 out of 28; 61%) had participated in at least one workshop on feedback, deficiencies remained, highlighting the need for ongoing training and reinforcement. Such ‘reinforcement training’ can occur at section meetings, clinic conferences and clinical competency meetings. Second, feedback training should explicitly encourage the teaching and practice of interactive feedback approaches.

Surgeons will need to follow the example of primary care, where trainers are selected from experienced general practitioners who demonstrate enthusiasm and ability. The reward for the trainer should be protected time for training, and the reward for patients and the National Health Service will be better-trained surgeons (Beard 2008).

WBAs are just one of a range of forms of evidence of satisfactory completion of training, alongside Royal College examinations, the logbook of surgical activity and supervisor reports. A WBA is a ‘shows how’, that is, a competence assessment, as in a DOPS, where the trainee demonstrates a particular skill. Performance assessment, or ‘does’, relates to exercises such as video surveillance or peer review of performance in everyday situations.

Conclusions

This system of using WBAs to show the competency of doctors improves the public accountability of medicine and the transparency of standards. When used formatively, the assessment process may identify deficiencies, problems and gaps in training, which often arise from variable or reduced clinical exposure resulting from the reduced working hours brought in by the European Working Time Directive. It also maximises the use of training opportunities, as trainee and trainer know what needs to be learned. A further advantage is that individuals could progress at their own pace, and the system could be flexible to individual training needs. There are, however, potential disadvantages too. There are fears that by reducing each clinical skill to a list of components, the connections between these tasks may not be made and the system may not adequately assess the global competencies that a doctor needs.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Notes

1. This AMEE guide was published as: Norcini J, Burch V. 2007. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 29:855–871.

References

  • Atwater LE, Waldman DA, Brett JF. Understanding and optimizing multisource feedback. Hum Resour Manage 2002; 41:193–208.
  • Beard JD. Assessment of surgical skills of trainees in the UK. Ann R Coll Surg Engl 2008; 90(4):282–285.
  • Carr SJ. Assessing clinical competency in medical senior house officers: How and why should we do it? Postgrad Med J 2004; 80(940):63–66.
  • Carr SJ. The foundation programme assessment tools: An opportunity to enhance feedback to trainees? Postgrad Med J 2006; 82(971):576–579.
  • Davies H, Archer J, Heard S, Southgate L. Assessment tools for foundation programmes—a practical guide. BMJ 2005; 330:195–196.
  • Elbadrawy M, Majoko F, Gasson J. Impact of the Calman system and recent reforms on surgical training in gynaecology. J Obstet Gynaecol 2008; 28(5):474–477.
  • Harden RM. Trends and the future of postgraduate medical education. Emerg Med J 2006; 23(10):798–802.
  • Holm HA. Postgraduate education. In: Norman GR, van der Vleuten CPM, Newble DI, editors. International handbook of research in medical education. Dordrecht: Kluwer; 2002. pp 381–413.
  • Holmboe ES, Hawkins RE, Huot SJ. Effects of training in direct observation of medical residents' clinical competence: A randomized trial. Ann Intern Med 2004a; 140(11):874–881.
  • Holmboe ES, Yepes M, Williams F, Huot SJ. Feedback and the mini-clinical evaluation exercise. J Gen Intern Med 2004b; 19(5 Pt 2):558–561.
  • James K, Cross K, Lucarotti ME, Fowler AL, Cook TA. Undertaking procedure-based assessment is feasible in clinical practice. Ann R Coll Surg Engl 2009; 91(2):110–112.
  • Johnson G, Barrett J, Jones M, Parry D, Wade W. Feedback from educational supervisors and trainees on the implementation of curricula and the assessment system for core medical training. Clin Med 2008; 8(5):484–489.
  • Kogan JR, Bellini LM, Shea JA. Implementation of the mini-CEX to evaluate medical students' clinical skills. Acad Med 2002; 77:1156–1157.
  • Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65(Suppl):S63–S67.
  • Southgate L. Professional competence in medicine. Hosp Med 1999; 60:203–205.
  • Southgate L, Cox J, David T, Hatch D, Howes A, Johnson N, Jolly B, MacDonald E, McAvoy P, McCrorie P, et al. The General Medical Council's performance procedures: Peer review of performance in the workplace. Med Educ 2001; 35(Suppl 1):9–19.
  • Wilkinson J, Benjamin A, Wade W. Assessing the performance of doctors in training. BMJ 2003; 327:S91–S92.
  • Wilkinson JR, Crossley JG, Wragg A, Mills P, Cowan G, Wade W. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ 2008; 42(4):364–373.
  • Wragg A, Wade W, Fuller G, Cowan G, Mills P. Assessing the performance of specialist registrars. Clin Med 2003; 3:131–134.
