Web Papers

Learning intimate examinations with simulated patients: The evaluation of medical students’ performance

Pages e139-e147 | Published online: 03 Jul 2009

Abstract

Background: In 2002, a project with simulated patients (Intimate Examination Associates, IEA) was implemented for fifth-year undergraduates of the medical school at the University of Antwerp. In this project, students from the new curriculum (NC) learned uro-genital, rectal, gynaecological, and breast examination on healthy, trained volunteers and received feedback focused on personal attitude, technical skills, and communication skills. Former curriculum (FC) students, however, trained these skills only during their sixth-year internships, after a single training session on manikins.

Aims: This study assessed the effect of learning intimate examinations with IEAs by comparing students from FC and NC on four different outcome parameters.

Methods: Three groups were compared: FC students after internships without IEA training, NC students after internships with IEA training, and fifth-year NC students immediately after the IEA training. Four assessment instruments were used: an OSCE with checklists and global rating scales to assess technical skills, a score list on students' attitudes and performance filled in by the IEAs, a student questionnaire on self-assessed competence, and a questionnaire on the frequency of performing intimate skills during internships.

Results: Both NC groups scored better overall in the OSCE (significantly so for the male examination). Sub-scores for 'completeness' and 'systematic approach' were significantly higher in both NC groups for the male and female examinations. NC students reported better self-assessed competence and performance in gynaecological and urological clinical and communication skills during internship. On all four outcomes, the best results were obtained by students who had completed both the IEA training and an internship. The IEA scores were influenced by the students' internship experience: both FC and NC students after internship scored better than the fifth-year NC students who had so far received only the IEA training.

Conclusion: Learning intimate examinations with IEAs has a positive effect on the performance of medical students. This beneficial effect is in turn reinforced during internships.

Introduction

Programs with trained volunteers for gynaecological examinations (teaching associates) were frequently implemented in the training of future physicians in the USA, Canada, and the Netherlands starting in the 1980s (Van Lunsen Citation1986, Citation1987; Beckman et al. Citation1988, Citation1992). However, the use of teaching associates for genital and rectal examinations in men is less common (Rochelson et al. 2005). Nevertheless, these skills in both sexes are of value in different specialities: internal medicine, gynaecology, urology, primary care, emergency medicine, and sexual health (MacDougall et al. 2003). At the University of Antwerp, Belgium, an intimate examination associates (IEA) project with healthy simulated patients was implemented in 2002 in the new curriculum, for fifth-year undergraduates (Remmen Citation1999; Hendrickx et al. Citation2003, Citation2006).

Previous research has reported that students trained in pelvic examination skills with trained lay women showed improved pelvic examination skills compared to students in a traditional training programme, as well as increased experience and confidence in the procedure (Popadiuk et al. Citation2002; Pickard et al. Citation2003; Carr et al. Citation2004). Questionnaire results showed reduced stress and anxiety in students performing intimate examinations, and increased satisfaction and attention to patients' feelings, integrity, and privacy (Wänggren et al. Citation2005). In all these studies, the outcome measured was mostly an isolated aspect of one of the intimate skills, or an estimation of students' feelings such as self-confidence or anxiety, in either male or female patients.

Important issues concerning the evaluation of programs with trained volunteers are still missing. The introduction of a new curriculum in the faculty of medicine at the University of Antwerp created the opportunity for a more comprehensive observational study covering all the different aspects of the intimate examination project. In the former curriculum (FC), students received clinical training for intimate examination skills in their fifth year through a single 2-h session using manikins in the skills laboratory. After this, the students attended 2 months of internship in a gynaecological service; no rotations in urology services were available. In 2002, a project with healthy female and male simulated patients (IEA) was implemented for fifth-year undergraduates: students learn genital, rectal, gynaecological, and breast examination on healthy, trained volunteers (the IEAs) in the skills laboratory. Beforehand, the students attend a 2-h video and practical training on manikins in small-group sessions, and a comprehensive handout is at their disposal. The IEA training sessions take place in a setting of two students, one trained IEA, and one physician, and last 1 h each. Each student attends two female IEA sessions (breast and gynaecological examination separately) and one male IEA session (including the genital and rectal examination). The students receive immediate feedback after each session, focused on personal attitude, technical skills, and communication skills. Later, during internships, all students attend 1 month of gynaecology internship and 2 weeks in a urology service. These changes were implemented to improve students' clinical skills (Remmen Citation1999).

The IEA project was approved by the Medical Ethical Committee of the University Hospital of Antwerp. All the volunteers signed informed consent for their engagement in the project. The description, implementation, and assessment of the IEA project, together with detailed information about the students' perceptions of each component of the program and the perceptions of the volunteers and the teachers, have been described previously (Hendrickx et al. Citation2006).

The main aim of our study was to investigate the effect of the IEA training sessions on four outcome parameters: first, the technical performance of genital, rectal, gynaecological, and breast examination skills; second, the self-estimated competence of students for these skills; third, the IEAs' own perception of the students' performance of the skills; and finally, the frequency of performing the intimate examination skills during internship.

Method

Subjects

Three groups of students were observed and compared (Figure 1).

Figure 1.

Group 1 was composed of 104 FC students in year 7 who did not receive the IEA training and were evaluated after their internships.

Group 2 was composed of 71 new curriculum (NC) students in year 7 who received the IEA training in their fifth year and were evaluated after their internships.

Group 3 was composed of 58 NC students in their fifth year of medical school; they were evaluated immediately after the IEA training, before starting their internships.

Each group contained all the students of that particular study year; the number of students varied from year to year.

Assessment instruments

Four assessment instruments were used: an objective structured clinical examination (OSCE) using detailed checklists and global rating scales (Harden et al. Citation1975; Dauphinee et al. Citation1997; Hodges et al. Citation1999; Frank Citation2006); a score list filled out by the IEAs; a student questionnaire on self-estimated competence; and a student questionnaire containing 72 items measuring levels of experience with intimate examinations during undergraduate education and internships (Remmen Citation1999).

Objective structured clinical examination

Two stations with IEAs, one on gynaecological skills and another on male genital examination skills including rectal examination, were integrated into a larger OSCE. The IEA stations lasted 20 min each. For the OSCE, representative samples of each of the three groups were recruited. The baseline characteristics of the participants (N1 = 34/104, N2 = 32/71, N3 = 36/58) are shown in Table 1. The samples were based on gender, age, and school grades from years 1–5. The students were informed about the study and assured that individual scores would be kept confidential. Participation was voluntary and students received a fee of €25. As the FC students had no prior experience with OSCEs, they were invited to an information session one day before the test; all but one student attended. The general nature and setting of the OSCE were explained (Peeraer et al. Citation2008). The OSCEs took place in November 2003 (Group 3), February 2004 (Group 1), and September 2004 (Group 2).

Table 1.  Basic characteristics of sample objective structured clinical examination (OSCE)

Some students performed only one IEA station; others performed both, according to their rotation schedule for the larger OSCE. We calculated that, for an α of 0.05 and a power of 0.8, we needed 16 students per station to detect a statistically significant increase of the OSCE score by 10%. The number of students who performed in either of the OSCE stations varied between 15 and 21 across the groups under investigation. In both stations, the students had to perform the technical skills with attention to communication and attitude towards the IEA. At the start of each station, the student had to explain to the IEA the kind of examination that would be performed, and describe the extent to which the IEA had to undress. Next, the students had to perform a genital inspection, a speculum examination, a bimanual palpation, and a pelvic examination of the female IEA; for the male IEA, a genital inspection and palpation, a rectal examination, and a pelvic examination had to be performed. The stations ended with reassuring the IEA about the findings. All observers were experienced medical doctors who were accustomed to scoring OSCEs (Peeraer et al. Citation2008). They were told that they would assess a regular fifth-year OSCE on the three dates. They were trained to fill out a checklist and to award a global rating for completeness, systematic approach, and proficiency of skills (score range 0–10, with ≤5 indicating a fail). Finally, a 'global OSCE score' out of 10 was calculated from these three global rating scores. The itemized checklists were approved by gynaecological and urological experts. The use of a global rating for completeness, systematic approach, and proficiency was based on the literature (Van der Vleuten 1990; Regehr 1998; Hodges 2003). The scenarios of the IEA stations were exactly the same for every student in each group.
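The sample-size calculation above can be sketched with the standard two-sample normal-approximation formula; the paper does not state the assumed standard deviation, so setting it equal to the 1-point (10% of a 10-point scale) difference to be detected is our assumption for illustration only.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(alpha: float, power: float, delta: float, sigma: float) -> int:
    """Approximate sample size per group for a two-sided two-sample comparison."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for two-sided alpha
    z_b = NormalDist().inv_cdf(power)          # critical value for the desired power
    return ceil(2 * (z_a + z_b) ** 2 * (sigma / delta) ** 2)

# Detect a 1-point difference on the 10-point OSCE scale, SD assumed ~1 point.
print(n_per_group(alpha=0.05, power=0.8, delta=1.0, sigma=1.0))  # → 16
```

Under this (assumed) SD the formula reproduces the 16 students per station reported in the text.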
In all three of the OSCE exams, the same observers assessed the students using the same checklists and without information on which group they were evaluating.

Each IEA was examined by a maximum of four students. The same IEAs were scheduled randomly across the OSCE and the teaching sessions.

IEA list

After the OSCE, the IEA filled out a scoring list about the student's attitude, technical skills, and communication skills. The list was based on a validated scale by Van Lunsen (Citation1986) and consisted of 35 items grouped into five topics: global attitude, global behaviour, attitude during the IEA contact, attitude during the examination of the IEA, and technical skills. Each of the five topic scores was converted to a score out of a maximum of 10, and the mean of these five scores was expressed as the 'global IEA score'.

Self-assessed competence list

This questionnaire consisted of 12 questions about students' self-assessment of competence and performance for technical and communication skills in different clinical situations. Answers were given on a Likert scale: strongly agree (score 4), partially agree (score 3), partially disagree (score 2), and strongly disagree (score 1). This resulted in a 'global self-estimated competence score' out of 10. Cronbach's α for this questionnaire was 0.836, indicating acceptable internal consistency.
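The internal-consistency figure reported here comes from the standard Cronbach's α formula, which can be sketched as below; the item scores in the example are illustrative, not the study's data.

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: item_scores is a list of item columns, one list of
    respondent scores per item. alpha = k/(k-1) * (1 - sum(item vars)/var(totals))."""
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]  # total score per respondent
    return k / (k - 1) * (1 - sum(variance(col) for col in item_scores) / variance(totals))

# Two perfectly correlated illustrative items give alpha = 1.0:
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))  # → 1.0
```

With 12 real Likert-scored items, the same function would reproduce the reported 0.836.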

Skills list

The skills list, validated by Remmen (Remmen Citation1999), consisted of 72 items measuring levels of experience with basic clinical skills relating to gynaecology and urology. Students were asked to rate their performance during undergraduate education, on an ordinal scale with three levels: level 1 (neither demonstrated to student nor performed), level 2 (demonstrated to student but not performed by student), and level 3 (performed by student once).

Second, they rated their experience during internship on an ordinal scale with five levels: level 1 (neither demonstrated to the student nor performed), level 2 (demonstrated to the student but not performed by the student), level 3 (performed by the student once), level 4 (performed by the student approximately three times), and level 5 (performed by the student at least ten times).

We scored 1 point for level 1 through 5 points for level 5. For each student we calculated two global scores (global gynaecology experience during internship and global urology experience during internship), each being the mean of the item scores for the gynaecological and the urological skills, respectively.
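The scoring just described reduces to mapping each ordinal level to its point value and averaging per domain; the level labels and the internship record below are illustrative placeholders, not items from the actual skills list.

```python
# Points for the five ordinal experience levels described in the Method section.
LEVEL_POINTS = {
    "neither demonstrated nor performed": 1,
    "demonstrated only": 2,
    "performed once": 3,
    "performed ~3 times": 4,
    "performed >=10 times": 5,
}

def global_experience(levels):
    """Mean ordinal score across the items of one domain (gynaecology or urology)."""
    points = [LEVEL_POINTS[lv] for lv in levels]
    return sum(points) / len(points)

# Hypothetical internship record for one student, three items in one domain:
print(global_experience(["performed once", "performed ~3 times", "demonstrated only"]))  # → 3.0
```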

Students of FC Group 1 and of NC Group 2 answered the self-assessed competence list and the skills list. Since students of Group 3 still had to start their internships, they did not complete these lists. The students completed the questionnaires at home and returned them within 2 weeks. To improve the response rate, K. Hendrickx contacted the students by phone after 3 weeks to remind them to return the questionnaires.

Statistical analysis

The statistical analysis was performed with SPSS version 12.

The distributions of continuous data were tested for normality with the Kolmogorov–Smirnov test; all distributions were normal.

For the OSCE and the IEA list (Groups 1, 2 and 3), the differences between the three student groups were tested using one-way analysis of variance with Student–Newman–Keuls (α < 0.05) as post hoc test when appropriate. For the self-assessed competence list (Groups 1 and 2), the differences between the two groups were tested with Student's t-test. For the skills list (Groups 1 and 2), Chi-square tests were calculated for each item on the list. A p-value < 0.05 was considered statistically significant. The mean of all frequencies of performed skills during internships was calculated and expressed as a global gynaecology performance score and a global urology performance score. These scores were analysed using Student's t-test (p < 0.05).
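The three-group comparison can be sketched by computing the one-way ANOVA F statistic directly (the study used SPSS; the p-value lookup against the F distribution and the Student–Newman–Keuls post hoc step are omitted here, and the score groups are illustrative, not the study data).

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group mean squares."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Three small illustrative score groups:
print(one_way_anova_f([[1, 2, 3], [2, 3, 4], [3, 4, 5]]))  # → 3.0
```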

Results

The basic characteristics of the participants in the OSCE did not differ significantly from those of previous study years with respect to age, gender, and study results; they were representative of the total group of invited students (Table 1). Race, ethnicity, and religion were not taken into account because 98% of the students were Caucasian Belgian students.

Students of Group 1 from the FC (65/104, response rate 62.5%) and of Group 2 from the NC (48/71, response rate 68.1%) answered the self-assessed competence list and the skills list.

OSCE

For the urology station, both NC groups (Group 2 after internship and Group 3 just after the IEA training) scored significantly higher on the global OSCE score as well as on the subscores for completeness, systematic approach, and proficiency (Table 2).

Table 2.  Scores of the three study groups in the OSCE

For the gynaecology station, significant differences in outcome were observed between NC Group 2 and FC Group 1 after their internships. Within the NC students, the IEA-trained students who had finished their internship (Group 2) reached significantly higher scores for the 'global score' and the 'systematic approach' than NC Group 3 before the start of their internship, whereas no significant differences were observed for completeness and proficiency.

No differences were observed between male and female students in the three groups, in either station.

IEA scores

In the urology station, all the students who had already attended their internship (Group 1 FC and Group 2 NC) received similar scores from the IEAs (Table 3). For Group 2 (NC, after internship) the scores were significantly higher than for Group 3 (NC, without internship) on the 'global score', 'global behaviour', and 'attitude (contact and examination)'. The score for attitude during the IEA examination was significantly higher for the FC students (Group 1) compared to the NC students without internship (Group 3).

Table 3.  Scores of the three study groups on the intimate examination assistant (IEA) list

In the gynaecology station, the differences between the six scores were not statistically significant. There were no differences between the scores for male and for female students in the three groups and in both stations.

Self-estimated competence list

The self-estimated competence list was filled out by 65 FC students (response rate 62.5%) and 48 NC students (response rate 68.1%) (Table 4). Both groups had completed their internships during the previous year and were seventh-year undergraduates by the time they completed the list. Those from the NC had received the IEA training in their fifth year.

Table 4.  Scores of the two total study groups for the self-estimated competence list

The ‘global self-estimated competence score’ was significantly higher in NC students than in FC students.

The questions about 'being able to perform specific skills' generally received better scores from the NC students. Significantly higher scores were achieved for the more complicated examination skills and for communication skills. The more complicated skills were: examining a man with complaints, the pelvic evaluation of a man, explaining rectal examination to a male person, examining a woman with abdominal pain, examining a woman with vaginal discharge, performing the pelvic evaluation of a woman with urinary complaints, and explaining gynaecological examination to a female person. The performance of rather 'routine' skills, such as the examination of a man or a woman without complaints, breast examination, and taking a PAP smear, did not differ significantly between students of the two curricula.

Gender differences occurred only in the FC students: male students scored significantly higher on self-estimated competence for the rectal examination of the male patient and for the male and female pelvic muscle evaluation.

Skills list

The global scores for both the gynaecological and the urological skills were significantly higher in the NC students than in the FC students (Table 5). This means that the NC students performed significantly more intimate examinations during their internships.

Table 5.  Global scores of the two total study groups for the skills list

Looking at each individual item on the skills list, there was also a significant difference between NC and FC students for all skills exercised during medical education, except the cervical smear. For the skills performed during internships, significant differences were found for the more complicated or more 'profound' examinations, such as the evaluation of the pelvic muscles, the inspection of scrotum and penis, and IUD insertion; for the more routine skills, in contrast, the FC students scored higher on the cervical smear and the general female clinical examination.

Male students in the FC performed more urological examinations during their internship than female students did, which was not the case with the NC students.

Discussion

Urological and gynecological skills

Both NC groups (before and after their internship) achieved significantly higher global scores and subscores for the male examination in the OSCE. For the female examination, these results were less pronounced: only NC students who had completed both the IEA training and their internship scored significantly higher than FC students. The same trend was observed for NC students after the IEA training alone (before internship), but these differences did not reach statistical significance. The increase in global scores, especially for the urological skills, may be explained by the fact that students of the FC had no skills training with simulated patients; they had to learn these skills on their own during internships. Previous research shows that, in the past, some students had no relevant experience at all (Hendrickx et al. Citation2006). In contrast, in the new programme every student is obliged to participate, and the sessions are extensively prepared and introduced to the different partners. The obligation to take part in the programme is an important step forward in ensuring that every medical student has at least one opportunity to experience and perform an intimate examination in a patient-centred way and in a safe environment (Hendrickx et al. Citation2006).

Female FC students showed weaker urological skills. These differences disappear in the NC, another indication that the IEA training, followed by obligatory clinical work in urology services, improves students' performance of intimate examination skills.

Internship

The best results for self-assessed competence and actual performance, both for technical and communication skills, were attained by the NC students after IEA training and internship. This is despite the fact that, in the NC, the gynaecology internship is limited to 1 month instead of the 2 months in the FC; on the other hand, all NC students are obliged to spend 2 weeks in a urology service, which was not the case in the FC. We may conclude that training in the skills lab is an essential prerequisite for benefiting fully from skills practice during internships. Skills training does not replace internships, but adds depth to the clinical work. These conclusions also fit the finding that the global OSCE scores for gynaecological skills of NC students became significantly higher than those of FC students only after the combination of IEA training and internship.

The influence of internship is also reflected in the IEA scores: both groups of students (FC and NC) who had attended their internships received higher IEA scores. A possible explanation is that students learned during their clinical work to 'act as a doctor'. The IEAs seem to be sensitive to this 'doctor's attitude', which possibly overshadows the lower level of technical skills of the FC students as judged by the OSCE; in fact, the students seemed able to hide deficiencies in technical skills behind their 'doctor's attitude' towards the IEAs. Since the NC students scored significantly higher both quantitatively and qualitatively on the skills after internship, our hypothesis is that being better prepared leads to higher self-assessed competence, which is reflected in taking more initiative in the clinical setting.

Another factor that might influence skills performance during internship is the student's personality: not only training and internship influence the performance of intimate examinations; personality factors may also play a role in explaining why some students encounter more difficulties than others (Manuel et al. Citation2005).

Factors such as religion and ethnicity were not taken into account in our study, since the study population consisted of 98% white Caucasian students, but in more diverse student populations these factors could also be important.

Strengths

Several articles confirm that simulated patients are a valid and a reliable tool for the evaluation of clinical competences (Ladyshewsky Citation1999; Adamo Citation2003).

In this study, we provide evidence that IEA training increases the completeness, the systematic approach, and the proficiency of the physical examination of urological and gynaecological simulated patients, and results in an increased performance rate during internship, especially for complicated skills. Details concerning the recruiting, training, and retaining of IEAs, costs and workload, and program set-up are described in a previous article (Hendrickx et al. Citation2006). The question arises whether IEAs could also be used in medical schools in other countries where such an approach is not part of the mainstream culture of medical education.

First, the added value of this study lies in the fact that this is the first time an intimate examination programme has been evaluated for skills, behaviour, and attitude, in a cohort of students, by different observers: the student, the teacher, and the patient. Students were assessed on four different levels, including their performance of intimate examination skills during internship. No other study used this multi-level approach (Popadiuk et al. Citation2002; Pickard et al. Citation2003; Carr et al. Citation2004; Wänggren et al. Citation2005; Burch et al. Citation2005), and in other studies assessment mostly occurred before practice on clinical patients (Jünger et al. Citation2005). The triangulation of the different test results enhances and supports the validity of our results.

Second, the IEAs were involved in the OSCE stations and in scoring the students, which is not the case in other studies, as intimate examinations in OSCEs are mostly performed on manikins (Popadiuk et al. Citation2002; Pickard et al. Citation2003; Carr et al. Citation2004; Wänggren et al. Citation2005; Burch et al. Citation2005). Only a few studies on breast and gynaecological examination include IEAs (Dugoff et al. Citation2003; Chalabian et al. Citation1998). A further interesting aspect of our study is that the IEAs assessed the students' technical skills, communication skills, and attitude immediately after the examination.

Third, this IEA programme included both male and female intimate examinations, whereas most programmes include only female examinations (Chalabian et al. Citation1998; Popadiuk et al. Citation2002; Dugoff et al. Citation2003; Pickard et al. Citation2003; Robertson et al. Citation2003; Carr et al. Citation2004; Jünger et al. Citation2005; Wänggren et al. Citation2005). The participation of male IEAs in OSCEs is therefore quite novel (Siber et al. Citation2000). Each student went through three 1-h sessions at different times: (1) urogenital, rectal, and pelvic muscle examination of a male IEA; (2) gynaecological examination and pelvic muscle evaluation of a female IEA; and (3) breast examination of a female IEA. The skills were taught in a very complete manner, using checklists with all the successive steps of the examinations explained in detail (Hendrickx et al. Citation2006).

Finally, only a few studies report the results of similar self-assessed competence surveys, and none of them includes both male and female examinations. In our study, the NC students seem more confident in performing both male and female intimate examinations, especially the more difficult and complicated skills, as well as communication skills. Carr et al. (Citation2004) and Burch et al. (Citation2005) confirm the positive shift in student-perceived competence to perform a gynaecological examination. Popadiuk et al. (Citation2002) conclude, in a study with rectal teaching associates, that this method is effective for increasing students' skills and confidence in the procedure.

The novel aspects of our study are therefore: the multi-level approach, the involvement of the IEAs in the OSCE, the inclusion of both female and male examinations, and the gathering of data on the students' self-estimated competence.

Limitations

First, the students in our study participated in the OSCEs on a voluntary basis, which may imply that we recruited the more motivated and engaged students. However, the basic characteristics of the OSCE student population did not differ from those of the global student population (Table 1). On the other hand, motivated students may take more initiative during their internship and derive a higher yield from their practical training. Moreover, we cannot completely exclude that the differences observed between the groups of students result from our global curricular change rather than from the elaborated skills training programme alone; other studies on the curriculum change at the University of Antwerp, however, underline the positive influence of the skills training programme (Peeraer et al. Citation2008). Another limitation concerns the sample size of the students participating in the OSCEs: for practical and financial reasons we were not able to assess the complete year group (Peeraer et al. Citation2008).

Second, the OSCE observers were not completely blinded to the curriculum origin (FC/NC) of the students. One could therefore speculate that observers would rate the more experienced (NC) students higher by definition. The literature, however, shows that trained observers and simulated patients are a valid and reliable tool for the evaluation of clinical competences (Ladyshewsky Citation1999; Adamo Citation2003; Holmboe et al. Citation2004). Since our observers were trained and habituated in the OSCE scenario, and since they were not informed about the study population attending that particular OSCE examination, we feel reassured that this bias was minimized as much as possible. Finally, the use of the same detailed checklist, the observers' training, and the standardization of the OSCE stations limited the possible bias. The IEA observers, on the other hand, were not aware of the curriculum the students were part of.

Third, a potential limitation is the self-assessment of competence and actual performance of intimate examinations by means of the skills list, which asked students about their skills practice during internships; students might score themselves unrealistically. The retrospective collection of these data, although similar for Group 1 FC and Group 2 NC, might induce a recall bias, and Eva et al. (Citation2004) and others indicate a poor correlation between self-assessment and performance. Nevertheless, as we did not ask the students to rate themselves but only to report the frequency of performing skills, and as the students were not evaluated in any way on their scores, we consider the answers to the questionnaire to be sound. In future, the use of logbooks might be a better basis for collecting similar data.

Conclusion

The results of our study show an improvement of students’ skills after intensified training in urological and gynaecological skills with IEAs before internship.

Our IEA programme shows an improvement in the way students systematically perform intimate examination skills in male and female patients, and it leads to higher self-confidence in performing these complicated skills during internship. As a consequence, a more appropriate attitude during the patient encounter and the physical examination may be reached. The triangulation of four measurement instruments reinforces the conclusion that an IEA programme provides added value in undergraduate medical training.

Acknowledgements

The authors would like to thank Professor Paul Wallace, Department of Primary Care and Population Sciences, University College London, Professor Etienne Vermeire, Professor Paul Van Royen, and Professor Joke Denekens, all from the Department of General Practice, University of Antwerp, for revising the article critically.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Additional information

Notes on contributors

Kristin Hendrickx

KRISTIN HENDRICKX, MD, PhD, conception and design, acquisition of data, analysis and interpretation of data, drafting the article or revising it critically for important intellectual content, final approval of the version to be published.

Benedicte De Winter

DE WINTER B. Y., Prof. MD, PhD, conception and design, acquisition of data, analysis and interpretation of data, drafting the article or revising it critically for important intellectual content, final approval of the version to be published.

Wiebren Tjalma

TJALMA W. A. A., Prof. MD, PhD, acquisition of data, revising critically for important intellectual content, final approval of the version to be published.

Dirk Avonts

AVONTS D, Prof. MD, PhD, revising critically for important intellectual content, final approval of the version to be published.

Griet Peeraer

PEERAER G, acquisition of data, revising critically for important intellectual content, final approval of the version to be published.

Jean-Jacques Wyndaele

WYNDAELE J. J., Prof. MD, PhD, acquisition of data, drafting the article or revising it critically for important intellectual content, final approval of the version to be published.

References

  • Adamo G. Simulated and standardized patients in OSCEs: Achievements and challenges 1992–2003. Med Teach 2003; 25(3): 262–270
  • Beckmann CR, Barzansky BM, Sharf BF, Meyers K. Training gynaecological teaching associates. Med Educ 1988; 22: 124–131
  • Beckmann CR, Lipscomb GH, Williford L, Bryant E, Ling FW. Gynaecological teaching associates in the 1990s. Med Educ 1992; 26: 105–109
  • Burch VC, Nash RC, Zabow T, Gibbs T, Aubin L, et al. A structured assessment of newly qualified medical graduates. Med Educ 2005; 39: 723–731
  • Carr SE, Carmody D. Outcomes of teaching medical students core skills for women's health: The pelvic examination educational program. Am J Obstet Gynecol 2004; 190: 1382–1387
  • Chalabian J, Dunnington G. Do our current assessments assure competency in clinical breast evaluation skills?. Am J Surg 1998; 175: 497–502
  • Dauphinee WD, Blackmore DE, Smee S, Rothman AI, Reznick R. Using the judgements of physician examiners in setting the standards for a national multi-center high stakes OSCE. Adv Health Sci Educ Theory Pract 1997; 2: 201–211
  • Dugoff L, Everett MR, Vontver L, Barley GE. Evaluation of pelvic and breast examination skills of interns in obstetrics and gynaecology and internal medicine. Am J Obstet Gynecol 2003; 189(3): 655–658
  • Eva KW, Cunnington JPW, Reiter HI, Keane DR, Norman GR. How can I know what I don't know? Poor self-assessment in a well-defined domain. Adv Health Sci Educ 2004; 9: 211–224
  • Frank C. Evidence based checklists for objective structured clinical examinations. Brit Med J 2006; 333: 548
  • Harden RM, Stevenson M, Dowie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Brit Med J 1975; 1: 447–451
  • Hendrickx K, de Winter BY, Wyndaele JJ. How medical students are being taught at University of Antwerp. Brit Med J 2003; 326: 1327
  • Hendrickx K, de Winter BY, Wyndaele JJ, Tjalma WAA, Debaene L, Selleslags B, Mast F, Buytaert P, Bossaert L. Intimate examination teaching with volunteers: Implementation and assessment at the University of Antwerp. Patient Educ Couns 2006; 63: 47–54
  • Hodges B, McIlroy JH. Analytic global OSCE ratings are sensitive to level of training. Med Educ 2003; 37: 1012–1016
  • Hodges B, Regehr G, McNaughton N, Tiberius R, Hanson M. OSCE checklists do not capture increasing levels of expertise. Acad Med 1999; 74: 1129–1134
  • Holmboe ES, Hawkins RE, Huot SJ. Effects of training in direct observation of medical residents' clinical competence. Ann Intern Med 2004; 140: 874–881
  • Junger J, Schafer S, Roth C, Schellberg D, Ben-David M, Nikendel C. Effects of basic clinical skills training on OSCE performance. Med Educ 2005; 39: 1015–1020
  • Ladyshewsky R. Simulated patients and assessment. Review. Med Teach 1999; 21(3): 266–269
  • MacDougall J. Commentary: Teaching pelvic examination: Putting the patient first. Brit Med J 2003; 326: 100–101
  • Manuel RS, Borges NJ, Gerzina HA. Personality and clinical skills: Any correlation?. Acad Med 2005; 80(10): 30–33
  • Peeraer G, Scherpbier AJ, Remmen R, de Winter BY, Hendrickx K, van Petegem P, Weyler J, Bossaert L. Clinical skills training in a skills lab compared with skills training in internships: Comparison of skills development curricula. Educ Health 2007; 20(3): 125
  • Peeraer G, Muitjens AM, de Winter BY, Remmen R, Hendrickx K, et al. Unintentional failure to assess for experience in senior undergraduate OSCE scoring. Med Educ 2008; 42(7): 669–675
  • Pickard S, Baraitser P, Rymer J, Piper J. Can gynaecology teaching associates provide high quality effective training for medical students in the UK? Comparative study. Brit Med J 2003; 327(7428): 1389–1392
  • Popadiuk C, Pottle M, Curran V. Teaching digital rectal examinations to medical students: An evaluation study of teaching methods. Acad Med 2002; 77: 1140–1146
  • Regehr G, McRae H, Reznick RK, Szalay D. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med 1998; 73: 993–997
  • Remmen R. An evaluation of clinical skills training at the medical school of the University of Antwerp. PhD in Medical Sciences dissertation (ISBN 90-5728-016-7). University of Antwerp. 1999; 1–136
  • Robertson K, Hegarty K, O'Connor V, Gunn J. Women teaching women's health: Issues in the establishment of a clinical teaching associate program for the well woman check. Women Health 2003; 37: 49–65
  • Rochelson BL, Baker DA, Mann WJ, Monheit AG, Stone ML. Use of male and female professional patient teams in teaching physical examination of the genitalia. J Reprod Med 1985; 30: 864–866
  • Sibert I, Grand'Maison P, Doucet J, Weber J, Grise P. Initial experience of an objective structured clinical examination in evaluating urology students. Eur Urol 2000; 37(5): 621–627
  • van der Vleuten C, Swanson DB. Assessment of clinical skills with standardized patients: State of the art. Teach Learn Med 1990; 2: 58–76
  • van Lunsen R. Who is afraid of the gynaecological examination?. PhD in Medical Sciences dissertation. University of Groningen. 1986; 1–410
  • van Lunsen R, Soeters D. Assessment of some aspects of medical competence by means of evaluation by non-professional patients. Further developments in assessing clinical competence, Hart, Harden. 1987; 468–481
  • Wanggren K, Petterson G, Csemiczky G, Gemzell-Danielsson K. Teaching medical students gynaecological examination using professional patients: Evaluation of students' skills and feelings. Med Teach 2005; 27(2): 130–135