Research Article

Peer feedback as an aid to learning – What do we want? Feedback. When do we want it? Now!

Pages e105-e112 | Published online: 28 Jan 2011

Abstract

Background: With 360° appraisals integral to professional life, learning how to give constructive feedback is an essential generic skill.

Aim: To use a formative objective structured clinical examination (OSCE) for skills acquisition and development in giving feedback, whilst facilitating awareness of the importance of communication skills in clinical practice.

Methods: Medical and nursing students took part in a formative OSCE. Using actors as simulated patients, a three-station OSCE circuit was repeated three times so that students could rotate through the roles as ‘candidate’, ‘examiner’ and ‘observer’. As ‘candidates’, they received immediate feedback on their consultation from the ‘examiner’/‘observer’. The events were evaluated using a questionnaire and focus groups.

Results: Students valued this learning event highly as a way of establishing expectations for a performance (91–100% agreement). Concerns about giving peers feedback were acknowledged, and students were divided on whether they preferred feedback from peers or tutors (48% versus 52%). However, training in providing feedback and criteria for assessment were considered helpful, as was explicit instruction by faculty to give corrective feedback to peers.

Conclusions: Peer observation and professional accountability for giving constructive feedback enhanced students’ awareness of their education and training needs in communication skills. It also opened a dialogue for identifying opportunities for peer assessment and feedback to support work-based education and skills development.

Introduction

With 360° appraisal a part of professional life, learning how to give constructive feedback should be viewed as an essential generic skill. Results of the National Student Survey have confirmed the significance of feedback as a key marker of education quality (www.thestudentsurvey.com, www.hefce.com). However, the challenge is not only to provide feedback opportunities, but also to signpost their existence within the curriculum.

The manner in which feedback is given influences its effectiveness, and training should be regarded as essential for gaining competence in this core communication skill (Hewson & Little 1998; Nicol & MacFarlane-Dick 2006). Moreover, the ability to provide colleagues with feedback should be a basic obligation for creating a culture of good team-working, patient safety and the delivery of quality healthcare (Frankel et al. 2006).

Feedback is constructive for learning when it is immediate, given sensitively and draws attention to any disparity between the perceived and the actual performance (Nicol & MacFarlane-Dick 2006; Sargent et al. 2007). It is well documented that in academic settings which include socially constructed learning, students learn more effectively when the assessment also includes peer feedback (Topping 2005; van den Berg et al. 2006). An essential feature of training and confidence in clinical communication and skills is the use of multi-source feedback on performance, e.g. by self, peer, simulator and tutor (Ende 1983; Braend et al. 2010; Paskin & Peile 2010).

Studies on peer-assisted learning have highlighted its acceptability and benefits within three areas of the learning process: the learning environment, reciprocal educational exchange, and communication and modelling (Glynn et al. 2006; Ten Cate & Durning 2007a, b). Expansion and corroboration of knowledge are meta-cognitive skills which are promoted when peer-assisted coaching is facilitated through discussion and practice (Ladyshewsky 2002). Although opinions on the validity and reliability of peer feedback are polarised (Henning et al. 2008), with appropriate faculty guidance it can introduce challenge and variability into the learning environment (Kernan et al. 2005). Moreover, reciprocal peer coaching and feedback can encourage co-operative learning and increase learner motivation (Asghar 2010).

The purpose of our educational intervention for students was twofold: (1) to gain feedback on their clinical communication skills and (2) to learn how to provide feedback to peers.

The peer tutors were equipped with appropriate information via the objective structured clinical examination (OSCE) mark sheet criteria to guide the content of their feedback (Brazeau et al. 2002). The feedback was employed as an aid to learning, to narrow the gap between perceived and evident skills competence (Boud 2000) and to encourage professional engagement with colleagues’ progress (Ende 1983). Thus, the aims of the study were to investigate the benefits of peer feedback using a formative OSCE in communication skills and to develop a training programme in peer feedback.

Methods and results

The student sample

All Year 1 students (n = 93) on the graduate entry programmes (GEP) in medicine (Barts and The London School of Medicine and Dentistry, Queen Mary University of London, n = 45) and nursing (City University, London, n = 48) had taken the multi-professional workshops on communication skills and were scheduled to take part in formative OSCE sessions. These students had been learning together throughout their first academic year on the clinical communication and problem-based learning courses. The communication skills course involved a mixture of didactic, observational and experiential learning using simulated patients and role-play.

Students were given a presentation on the principles of giving constructive and balanced feedback according to Agenda-Led Outcome-Based Assessment (Kurtz & Silverman 1998) and a modification of Pendleton’s rules for the clinical consultation (Pendleton et al. 1992).

The formative OSCE stations

OSCE stations were based on the problem-based learning cases used in this course, which had the appropriate knowledge content for both the medical and the specialist nursing (adult, child and mental health) students. Students derived their own OSCE marking criteria in the first instance using a constructivist approach (Bergman et al. 2008); these schemes were compared with the mark sheets previously prepared by faculty (Table 1) and the similarities and discrepancies were discussed. In this way, the feedback could be linked to shared goals which were relevant and meaningful for both student and faculty members (Nicol & MacFarlane-Dick 2006). The formative OSCE sessions were run twice for the same cohort of students, in November 2008 and in May 2009.

Table 1.  Example of Feedback Sheet

Study 1 – November 2008

Each OSCE station was of 5 min duration, followed by 3 min for feedback from the ‘observer’. The three-station OSCE circuit was repeated three times so that students could rotate through the roles of: (1) ‘candidate’, (2) ‘examiner’ and (3) ‘observer’. When in the role of ‘examiner’ or ‘observer’, students remained in the same station for the whole of the circuit. Students in the role of ‘candidate’ rotated through the three stations of the circuit and received immediate feedback that they could act upon in the subsequent two stations, where they received further feedback from different peers. The rationale for this design was for students to practise giving constructive feedback with a view to promoting both their own learning and their peers’ professional behaviours (Biggs 2003).
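The role rotation described above can be sketched in code. This is an illustrative model only, assuming one trio of students per station triad; the student labels and function name are hypothetical and not taken from the study.

```python
# Illustrative sketch of the three-circuit role rotation: each student
# plays 'candidate', 'examiner' and 'observer' exactly once.
# Student labels are hypothetical; in the actual sessions, examiners and
# observers also stayed at a fixed station while candidates moved.

ROLES = ["candidate", "examiner", "observer"]

def rotation_schedule(students):
    """Return, for each of three circuits, the role played by each student."""
    assert len(students) == 3, "this sketch models one trio of students"
    schedule = []
    for circuit in range(3):
        # Shift roles by one place each circuit so every student takes
        # every role exactly once across the three circuits.
        schedule.append({s: ROLES[(i + circuit) % 3]
                         for i, s in enumerate(students)})
    return schedule

for circuit, roles in enumerate(rotation_schedule(["A", "B", "C"]), start=1):
    print(f"Circuit {circuit}: {roles}")
```

Running the sketch shows that across the three circuits each student occupies each role once, which is the property the design relies on.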

Evaluation strategy

The study was submitted to the university research ethics committees for review. The design of the study was approved as was its evaluation strategy which was considered to be part of a normal student evaluation process. The sessions were evaluated to determine the perceived value of this approach by means of:

  1. A questionnaire comprising 20 statement items with a four-point Likert-scale response, administered at the end of the OSCEs (in November and May). This was an adapted version of the Assessment Evaluation Questionnaire used in the FAST project (Formative Assessment in Science Teaching, Open University with Sheffield Hallam University; Brown et al. 2003). All questionnaire data were gathered and reported anonymously to avoid any identification of subjects.

  2. A focus group, whose participants were recruited by inviting students from the whole cohort to volunteer. This added qualitative data to the understanding of the educational process and took place in January 2009 (six weeks after the first OSCE session).

The nominal group technique (NGT) was used in the focus group, whereby students: (1) answered three questions individually and silently; (2) shared their answers to each question with the group, which were listed on a flip-chart; and (3) voted on their three top priorities from each list. The focus group also included a more general discussion. The whole session was taped and transcribed.
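The tallying step of the NGT can be sketched as a short program. This is a minimal illustration under assumed data: the ballot items below are invented examples, not the priorities actually recorded in the study.

```python
# Minimal sketch of NGT vote tallying: each participant casts (up to)
# three votes for items pooled on the flip-chart, and items are ranked
# by the number of votes received. Ballot contents are invented examples.
from collections import Counter

def tally_priorities(ballots):
    """ballots: one list of chosen items per participant."""
    counts = Counter(item for ballot in ballots for item in ballot)
    return counts.most_common()  # (item, votes) pairs, highest first

ballots = [
    ["more feedback time", "peer observation", "feedback briefing"],
    ["peer observation", "more feedback time", "realistic scenarios"],
]
print(tally_priorities(ballots))
```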

Results of Study 1

Of the whole cohort (n = 93), 78 students attended the formative OSCE, giving an 84% attendance rate. All attendees completed an evaluation questionnaire. The 20 statements and the responses (Likert-scale scores) are presented in Tables 2 and 3.

  1. Evaluation questionnaire

Table 2.  Responses to positive statements of the questionnaire

Table 3.  Responses to negative statements of the questionnaire

Table 4.  Results of the focus group following Study 1

Students agreed or strongly agreed (range 82–100%) with all the positively framed statements about feedback (Table 2). With the more neutral statements, 77% disagreed/strongly disagreed with ‘Feedback mainly tells me how I did compared to others’ and 44% agreed/strongly agreed with ‘I would learn better if the feedback had been in more depth’. These findings suggest that the majority of students found the individual feedback personally helpful rather than merely comparative, and of sufficient depth to aid learning. Of particular note, 91% of the students agreed/strongly agreed with the statement ‘I learnt new things whilst being an examiner’.

Students generally disagreed or strongly disagreed with the negatively framed statements about feedback (Table 3). In response to the statement ‘I would prefer feedback from a tutor rather than my colleagues’, 84% of students disagreed/strongly disagreed. Although 27% agreed/strongly agreed that it was difficult to give colleagues feedback, 94% agreed/strongly agreed that they were able to be honest in their feedback.

  2. Focus group

The focus group held after the first OSCE session was attended by five medical students and two nursing students (six female, one male). Table 4 lists the benefits, limitations and ideas for improvement that were derived using the NGT. A number of themes arose from the discussions that highlight students’ views about this educational initiative.

Theme 1: Some students emphasised their own status as beginners both as part of the experience of being in an OSCE and as a potential limitation to learning.

‘It's all about us being – it's amateurs doing it’.

‘Amateurs. Sums us up, really!’

This corresponded with the uncertainties voiced about both making and receiving judgements on skills, questioning whether novice feedback was reliable when compared with experts’ feedback. One outcome of these responses was the suggestion that self-reflection on communication, as a product of the exercise, was more important than specific advice. Another was the potential usefulness of feedback even when contradictory, not least because communication needs vary with different patients and scenarios. A positive outcome identified was the learning derived from observing the techniques used by peers. Even so, most students referred to a lack of confidence in the first OSCE (November 2008) associated with their novice status.

Theme 2: There was a recurrently expressed anxiety about giving negative or corrective feedback and the following responses reflected this:

‘There was somebody in our group that said something and I thought, “You really should never say that in front of somebody”, and I didn’t, to be honest, have the guts to tell them’.

‘You want to say something positive but also you want to get across what they need to improve. I found it quite difficult to say everything I wanted to say without coming across as being horrible. And I’m sure everyone feels that you don’t want to just knock someone's confidence right down. But it can be quite difficult, I think, because if you’re peers, it's harder than if you’re a tutor’.

Although the principles of giving feedback had been covered in the briefing, students wanted more help with how to give constructive criticism. In particular, they asked for a directive from faculty to give feedback for improvement, so that individuals giving it would not be thought unkind.

‘If you’re actually put in a position where you have to give a negative comment, then you can say that thing that you won’t –’

‘Everyone's doing it’.

Theme 3: The presence of peers prompted a mixed response, for whilst some found it easier to relax, others found it pressurising.

‘I think the artificial situation was exacerbated by the pressure to perform in front of your friends. If you were going to talk to a lady in a GP's surgery, you wouldn’t have [group members] sitting there with a mark sheet. It just makes you relax less, and you behave less like you normally would in a situation. Personally, I felt like the major downfall was having two people that you know very well sitting there watching you’.

Although the focus group offered some criticisms of the OSCE format, it was acknowledged consistently by all seven students, including those sceptical of communication skills sessions, that it was a very positive and useful experience that they wanted to be repeated. There was some support (three votes) for the suggestion of an introductory briefing on how to give positive and negative feedback.

Theme 4: The transferability of learning from the OSCE setting to subsequent clinical placements arose in discussion. There were mixed views on whether the skills had been used in subsequent encounters with patients. One student was very clear that they had not been.

‘I don’t think you can learn communication sitting in a room talking to an actor. I think you learn communication by talking to people … As your relationship with somebody grows, you just know the right things to say, because it suits the person that you're with’.

However, others were able to cite instances where they had applied the skills included in their OSCE feedback.

‘[When with a patient who wanted a long conversation], I had to draw on the OSCE session: OK, how do I make sure that I'm actually quite interested in what you're saying now? Because I've got three other patients that I really need to deal with’.

‘In terms of actual feedback, I think there were some useful bits just the tone of the language you came in with, the, Hi and which sort of a hello. Or types of language, just some of the really little bits, either it being fed back to you or using it on someone else. I think they did definitely make me think, and I’d take those away and use them’.

‘I thought I'd done something quite well, and someone pointed out that actually I could have done it this way and it might have been better. And then seeing someone else do it, I thought, ‘Oh well, yeah, that's a much better way of doing it’. But it hadn’t even occurred to me that I could improve on the way I was doing it. And I don’t think it's until someone points it out to you in quite a specific way that you think, ‘Well actually, yeah, I’ll try that’. And I've done it since, and it is better’.

Some issues raised in the focus group could not be used to adjust the subsequent session (Study 2), as they reflected the intrinsic nature of simulation as a training method, i.e. that simulation is not fully realistic, the potential for temporary de-skilling, and performance anxiety in observed simulations. But some suggestions were relatively easy to implement, such as increased time for feedback, albeit this meant a reduction in the time spent interacting with the ‘patient’.

Study 2 – May 2009

A second formative OSCE session was run for the same cohort of students, with amendments to the format based on student evaluations from the first study: the case scenarios were circulated in advance to allow preparation, and greater time was allotted for feedback (each station was reduced to 4 min, allowing 4 min for feedback from the observer/examiner). A major change from Study 1 was the use of a training DVD giving examples of good and poor practice, after which the students rehearsed how to phrase feedback (both good practice and areas for improvement). At the core of this instruction was a process for widening their repertoire of communication. Students were thus helped to identify problems and good practice in providing feedback, while remaining mindful that role play is an ‘imperfect’ consultation and a source of performance anxiety. The simulated patients were instructed to give written feedback on empathy and clarity, to ensure the evaluation of peer verbal feedback alone without the ‘halo’ effect of the actors.

Results of Study 2

There were 48 students (52%) in attendance at the formative OSCE in May. All students completed the evaluation questionnaire again. Responses to the 20 statements are given in Tables 2 and 3, and were largely consistent with the findings of the November OSCE. For statements on feedback phrased in a positive direction, almost all students either agreed or strongly agreed (90–100%). A total of 38% agreed/strongly agreed with the statement ‘I found it hard to give feedback to my colleagues’, compared with 27% in Study 1, a difference that was not statistically significant. Once again, almost all (94%) agreed/strongly agreed that they had been honest in their feedback.

As in Study 1, the students overwhelmingly disagreed or strongly disagreed with the negatively phrased statements (Table 3); 85% of students disagreed/strongly disagreed with the statement ‘Feedback mainly tells me how well I did compared to others’, suggesting they considered the feedback useful for their own development rather than simply a comparison with others.

Two findings differed significantly between Studies 1 and 2. The statement ‘I would learn better if the feedback had been in more depth’ on this occasion had an agreement/strong agreement response of 67%, compared with 44% in Study 1 (p < 0.05). With the statement on the preference for feedback from tutors rather than colleagues, 48% disagreed/strongly disagreed whilst 52% agreed or strongly agreed, indicating that students were divided on this viewpoint; this differed from Study 1, where 84% disagreed/strongly disagreed (p < 0.01).
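Comparisons of this kind can be checked with a two-proportion z-test. The sketch below is illustrative only: the paper does not state which test the authors used, and the counts are reconstructed from the reported percentages (44% of the 78 attendees in Study 1 versus 67% of the 48 in Study 2), so they are approximate.

```python
# Hedged sketch: a two-sided, two-proportion z-test on counts
# reconstructed from the reported percentages (approximate, and not
# necessarily the test the authors used).
import math

def two_proportion_p_value(x1, n1, x2, n2):
    """Two-sided p-value for H0: the two underlying proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Standard normal tail probability via the error function.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# 'More depth' statement: ~34/78 agreed in Study 1 vs ~32/48 in Study 2.
print(two_proportion_p_value(34, 78, 32, 48))  # about 0.01, i.e. p < 0.05
```

On these reconstructed counts the test reproduces the reported significance levels for both statements, which suggests the approximation is reasonable.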

The number of students who took part in the second OSCE was lower, and direct comparison cannot be made because the student questionnaires were anonymised, so we could not match students attending both sessions. But the findings were affirmative for all statements, and very similar in both events, with the exception of the two statistically significant differences cited above.

Although we did not repeat the focus group evaluation, students were asked to offer free text comments. Students endorsed the changes made and appreciated doing this formative OSCE again.

‘Really good – nice to feel that we have improved since the last one, both in interviewing and in giving feedback to colleagues’

‘Very useful learning experience. Well carried out and gave some extremely helpful feedback’

Discussion

With greater significance placed on learner autonomy, guidance on good practice in self- and peer appraisal has been recognised as preparation for life-long learning and multi-source feedback in professional life. This study, involving a timetabled formative OSCE session on communication skills, investigated the perceived benefit of peer feedback on students’ learning.

It is accepted that giving feedback, especially when negative or corrective, is not a straightforward process because of its potential impact on the recipient’s sense of self and capability (Sargent et al. 2007). Indeed, the results of the project are applicable to feedback in general and not just peer feedback. The extent to which feedback is regarded as enabling rather than judgemental is therefore important (Weaver 2006), and feedback may be discounted if it comes from a person of low-level knowledge (Bing-You et al. 1997). It is noteworthy that students valued the learning opportunity of being an examiner, both for clinical communication skills development and in the observation of their peers as they gave feedback.

In analysing student perceptions of the difficulties in providing peer feedback, Liu and Carless (2006) highlighted potential reasons for this challenge: doubts about reliability owing to perceived novice status; the disruption of power relations when submitting a performance for scrutiny, and the competitive spirit this engenders; and the possible impact of ‘friendship marking’ on feedback. These observations resonate with other studies of peer feedback in vocational education and of the influence of inter-personal variables: psychological safety, i.e. confidence that the group will not embarrass, reject or punish someone for speaking up, which permits inter-personal risk-taking within a group (Edmondson 1999); and the value of diversity, inter-dependence and trust for learning (Van Gennip et al. 2010).

That the students in the study continued to find it difficult to give feedback even following the training session could be interpreted in a variety of ways. It could indicate failure of the training, although in the plenary students commented on its usefulness and on the opportunity to rehearse in advance of the OSCE circuits. Alternatively, this educational initiative may have enhanced students’ awareness of the skills needed to observe, make judgements and provide balanced, constructive feedback covering both positive and corrective points.

A higher proportion preferred tutor over peer feedback in Study 2 than in Study 1, which might suggest that the relative value placed on peer evaluation had decreased. But this should be interpreted with a degree of caution, because the cohorts could not be matched precisely owing to the anonymity of the questionnaire data. Another factor to consider is that, with summative high-stakes exams imminent, students may have wanted reassurance from the faculty who would soon be assessing them.

The increased desire for yet more feedback attests to the value students placed on the exercise, with several commenting that they would also have liked verbal feedback from the simulated patient. However, a key aim of these sessions was learning how to give and receive peer feedback, without the additional ‘halo’ effect of feedback from the actors.

Reciprocal peer tutoring and feedback bring with them a fascinating social construct for learning and can have a positive influence on skills development (Ladyshewsky 2000, 2002). Reports of senior peer tutors teaching students in earlier years describe learning benefits that reinforce the peer tutors’ knowledge as well as developing their teaching skills (Evans & Cuffe 2009). Nonetheless, the acceptability and utility of peer tutoring depend on learner group dynamics and on learners’ cognitive, personal and communication needs. Such activities can be affirming, particularly for perceived novices, perhaps because the feedback is often positive rather than negative, and they can build confidence and self-esteem; however, reticence in giving constructive criticism implies that a critical component of reflective practice may be lost (Shin et al. 2009). Even so, this learning model can reinforce the development of key clinical skills and help address tutor–student ratios in an educational setting (Chambers et al. 2000).

Whilst feedback on performance and assessments is an agreed requisite at all levels of education, peers need a definition and description of the competency being assessed in order to provide constructive and valid feedback (Patri 2002; McKay et al. 2007; Archer et al. 2008; Nelson & Schunn 2009). In our study, students were of the same academic year, i.e. true peers, and the OSCE mark sheet provided the framework of criteria for constructive peer feedback. The activity offered an opportunity for collaborative and vicarious learning (Cox 2008; Gielen et al. 2010; Van Gennip et al. 2010).

Students’ views on their own relative lack of experience and that of their peers reaffirmed the work of Sargent et al. (2007), who described three inter-related factors in the credibility and acceptance of negative feedback: (1) whether the feedback was perceived as credible, which was linked to who was giving it, its specificity, whether it resulted from direct observation (evidence), and whether it related to agreed and explicit standards of performance; (2) the emotional factor, where the feedback might challenge the person’s self-perception and image of being a ‘good doctor’ and result in defensive behaviours or resistance to accepting feedback; and (3) the time needed for reflecting on the feedback given, so that a negative response may be mediated and the feedback assimilated, accepted and then utilised.

The lower attendance rate at the second OSCE may raise reservations about the value that students placed on this activity. However, this was not a surprise, owing to the forthcoming uni-professional summative assessments, which students inevitably prioritised over attendance at multi-professional educational activities. It is noteworthy that attendance at other activities in the inter-professional curriculum strand also declined at this time for the same reason. These are perennial issues with inter-professional versus uni-professional activities and have been reported in the literature (Westwood et al. 2008).

Whilst satisfaction is not a reliable reflection on the quality of feedback (Boehler et al. 2006), students regarded the peer feedback as conducive to learning. It met some of the criteria of Nicol and MacFarlane-Dick (2006) for effective feedback, i.e. it took place in an atmosphere of support and challenge, it was linked to the goals developed and clarified with the students and it aimed to ‘close the gap’ between current and desired behaviour.

Some students reported transferability of skills to the workplace following the session, but it is recognised that a lasting positive outcome requires immediate application and support if it is to be sustained (Heaven et al. 2006). Holding the two formative OSCE sessions 6 months apart provided opportunities for further development and reflection on skills acquisition. Despite the OSCE format being a rather pressured event with limited time to reflect, clarify and assimilate feedback, students were very affirmative about these sessions as a learning experience.

Conclusion

This project, involving two health professional disciplines (medicine and nursing), allowed each student the opportunity to observe six colleagues and give structured feedback, and to receive immediate and applied feedback on three personal performances. As a consequence of this study, a practical and acceptable model for learning communication skills and how to give feedback has been developed. An advantage of the three-station OSCE format was the multiple opportunities it gave for honing students’ observations, for giving feedback on focussed cases, and for maximising the use of resources, including time, student engagement and faculty.

Although their novice status was acknowledged, as was the desire for additional feedback from the ‘simulated patient’ and faculty tutor, students found this to be a valuable learning experience. Following this successful project with the small cohort of two health professional student groups on the GEP programmes, this educational activity has been introduced for the 5-year MBBS programme. The next step is to build on this peer feedback learning model, to support knowledge and skills development in clinical settings where there are shared learning experiences.

Acknowledgements

The authors wish to thank Cathy Baker, Viv Cook, Eva Najberg, John Prendergast, Adrienne Kirk and Kathy Curtis for their help. This project was supported by funding from the Centre for Excellence in Teaching and Learning (CETL) in Clinical and Communication Skills, City University, London and Barts and The London, Queen Mary University of London.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

References

  • Archer J, Norcini J, Southgate L, Heard S, Davies H. Mini-PAT (Peer Assessment Tool): A valid component of a National Assessment Programme in the UK?. Adv Health Sci Educ 2008; 13: 181–192
  • Asghar A. Reciprocal peer coaching and its use as a formative assessment strategy for first-year students. Assess Eval High Educ 2010; 35(4)403–417
  • Bergman D, Savage C, Wahlstrom R, Sandahl C. Teaching group dynamics – Do we know what we are doing? An approach to evaluation. Med Teach 2008; 30(1)55–61
  • Biggs J. Teaching for quality learning at university, 2nd ed. SRHE and Open University Press, Berkshire, UK 2003