The Ventriloscope® as an innovative tool for assessing clinical examination skills: Appraisal of a novel method of simulating auscultatory findings

Pages e388-e396 | Published online: 22 Jun 2011

Abstract

Background: Simulation is increasingly used as a teaching tool and in assessment. The Ventriloscope® (VS) is a new auscultation simulator. This modified stethoscope allows pre-recorded sounds (activated wirelessly) to be integrated with a simulated patient (SP, professional actor).

Aims: This study explores the instrument's potential for overcoming limitations of current objective structured clinical examination (OSCE) assessment by increasing validity while retaining reliability.

Methods: After training SPs to synchronise the device with their breathing (recreating abnormal signs), we evaluated the VS during a third year undergraduate medical student OSCE. Students (n = 358), examiners (n = 19) and SPs (n = 10) completed post-exam questionnaires, which were analysed using a coding framework. OSCE performance data were analysed using Stata 10.

Results: When ‘compared to their usual stethoscope’ 40% of students found no difference in using the VS; 69% found it easier to identify sounds; 68% found examination with the VS very or fairly realistic when ‘compared to examining a real patient’. Examination scores were comparable with other OSCE stations.

Conclusions: The VS reliably provided consistent ‘abnormal’ auscultatory signs within an OSCE framework. Using a VS may increase OSCE validity, allowing examiners to assess students' application of knowledge in a realistically simulated setting. The VS can help bridge the gap between simulation and real patients.

Introduction

The objective structured clinical examination (OSCE) is a well-established method for assessing clinical skills of medical students (Harden & Gleeson Citation1979; Newble & Swanson Citation1988). Ideally, these skills would be assessed on real patients and would test students’ abilities to recognise and interpret physical signs. However, difficulties associated with finding large numbers of well patients with appropriate, consistent and stable clinical signs have led to increased use of simulated patients (SPs) and standardised patients. This has improved the standardisation and reliability of assessing medical students, but raises issues of validity and authenticity (Dauphinee Citation1995).

SPs are professional actors or lay people (usually healthy) trained to simulate a variety of medical problems in a consistent, reliable, realistic and reproducible manner. The SPs being ‘well’ poses a problem when assessing learners’ ability to recognise abnormal clinical signs and to apply and integrate knowledge. When faced with an SP in an OSCE, medical students may perform on ‘auto-pilot’, carrying out the motions of examination knowing that there will be no abnormality to detect. Their ability to identify abnormal signs and interpret them in a clinical context is therefore not assessed. This pattern of ‘tick-box’ learning at undergraduate level may explain the numerous reports of poor diagnostic ability of postgraduate trainees (Mangione & Nieman Citation1997; Peitzman et al. Citation2000; Ozuah et al. Citation2001; Houck et al. Citation2002).

Attempts have been made to overcome such limitations with the aid of simulation technology (Peitzman et al. Citation2000; Ozuah et al. Citation2001; Houck et al. Citation2002; Morgan & Cleave-Hogg Citation2002), particularly to facilitate learning of audible clinical signs. These include audio recordings, multimedia CD-ROMs, electronic heart sound simulators and manikins (Gordon et al. Citation1980b; Mangione & Nieman Citation1997; Issenberg et al. Citation1999). Simulators have been shown to enhance physical examination skills of learners in undergraduate and postgraduate settings (Gordon et al. Citation1980a; Issenberg et al. Citation2002; Issenberg et al. Citation2005). Students have also been shown to value simulation-based teaching very highly (Weller Citation2004). However, simulators are separate from real people, and therefore cannot recreate the interpersonal aspects of a clinical encounter.

Postgraduate high stakes examinations have begun to utilise simulation technology as part of the assessment of clinical competence (Dillon et al. Citation2004; Hatala et al. Citation2005). Furthermore, a key action point from the Chief Medical Officer's recent report states that simulation should be ‘fully integrated and funded within clinical programmes for clinicians at all stages’ (Department of Health Citation2009). With these considerations in mind, it would be prudent to introduce simulation devices into medical school exams. Thus far, use of simulation technology in examinations has mostly consisted of the candidate or student examining an SP and then stopping the flow of their examination to concentrate on a form of audio-visual simulation (Hatala et al. Citation2008), or focusing solely on video clips (Lieberman et al. Citation2003; Millos et al. Citation2003). This break in the flow of examination inevitably impairs perceived realism.

The artificial separation of auscultatory signs from their clinical context fragments the learning of key skills. Work on hybrid simulation has highlighted the importance of placing a real person at the centre of a clinical encounter (Kneebone et al. Citation2006). While this has been explored for a range of procedural skills, the integration of simulators and real people for assessing the interpretation of clinical signs has not been addressed.

New technology offers the potential to overcome this artificial separation between simulator and patient. The Ventriloscope® (VS) (Castilano et al. Citation2009; Lecat Citation2010) (Figure 1) resembles a stethoscope but incorporates an inbuilt MP3 player. Pre-recorded auscultatory sounds stored on a secure digital memory card can be played through this MP3 player and heard via the stethoscope ear pieces. The sounds are controlled by a remote transmitter. When examining an SP, the student hears simulated pre-recorded sounds just as they would hear real sounds with a normal stethoscope.

Figure 1. Lecat's VS (reproduced with permission of Dr Paul Lecat).

This offers a means of 'grafting' abnormal auscultatory signs onto a healthy person, requiring students to recognise and interpret such signs within the wider context of a clinical encounter. From an assessment perspective, the ability to select from a variety of authentic 'signs' offers obvious benefits in terms of consistency and reliability.

This study aimed (1) to evaluate whether the VS could be used reliably to provide audible clinical signs for students to integrate with the rest of the clinical examination in an OSCE setting, (2) to investigate opinions of students, examiners and SPs regarding use of the VS as an assessment/examination tool and (3) to compare students’ exam results with other stations requiring physical examination in a third year OSCE. To our knowledge, this is the first study of such a simulation device in the UK.

Methods

Setting and study participants

The subjects were 358 third year medical students undertaking the end of year OSCE exam at Imperial College School of Medicine. The OSCE was carried out on a single day, across three hospital sites in London. One of the 12 stations was modified to incorporate the VS. Strategies were put in place to enable the station to continue in the event of VS malfunction or failure. In accordance with our usual examination practice, students were prevented from talking to each other at changeover.

Anonymous questionnaires about their experience of the station where the VS was used were given to each student and to the 19 examiners.

The students received a general OSCE briefing 2 months before the exam in which they were informed about the VS being introduced into one of the exam stations. They were not given prior practice with the VS.

Design and data collection

To explore participants’ perceptions and interpretations of their experience, we selected a mixed-methods research design incorporating questionnaire feedback.

Training of SPs

Two researchers (Himanshu Bhatt and Anju Verma) familiarised themselves with the workings of the VS and designed clinical scenarios in which the VS could simulate clinical signs of respiratory disease.

An actor experienced in medical student exams and OSCE-type scenarios was given training on how to use the VS by two of the authors (Himanshu Bhatt and Anju Verma). The SP was responsible for coordinating the appropriate button(s) on the transmitter device with inhalation and exhalation as the candidate/student auscultated their chest.

The VS can produce a full range of clinical auscultatory sounds, including heart, respiratory and bowel sounds and bruits. In this study, respiratory sounds were chosen to avoid the difficulties associated with absent peripheral signs (such as an abnormal pulse) and with synchronising sounds to a pulse. The station simulated a patient with moderate asthma, with the VS providing a polyphonic expiratory wheeze throughout the lung fields. All 358 students in the exam undertook this OSCE station.

Training of examiners

As part of normal exam practice, all examiners are expected to attend a hands-on training session observing and marking three stations. Each examiner was additionally sent information explaining what the VS is and how it works. In the OSCE station, the SPs briefed the 19 examiners on the mechanics of the VS. Circulating senior examiners ensured that examiners in the respiratory station were comfortable using the VS; as part of this, all examiners were asked to examine the SP's chest using the VS. Examiners were CCT-accredited physicians or general practitioners.

Pilot – mock OSCE and semi-structured interviews

The VS was piloted in the respiratory station of a mock OSCE using the same trained actor (SP). The scenario was repeated in exam style with four different examiners and four candidates. The exam scenario could be viewed via video link in a main seminar room by a group of 30 volunteers from the academic faculty at Imperial College, comprising examiners, actors and administrative examination officers. Each scenario was also recorded to make training videos.

Focus group discussion with the observers after each run through of the respiratory station enabled fine tuning of the equipment set-up in the room, and ensured face validity of the station. Content validity was assessed by comparing the content of the station against the undergraduate curriculum.

Himanshu Bhatt and Anju Verma carried out semi-structured interviews with examiners and candidates who participated in the mock OSCE to help construct questionnaires (see below).

End of third year OSCE station – set-up

After creating suitable training videos, Himanshu Bhatt and Anju Verma trained a group of 20 actors (SPs) until they were comfortable and competent in using the VS.

The final OSCE station set-up (Figure 2) involved the SP on a couch in a hospital gown with the transmitter device hidden under a hospital blanket. Loudspeakers (connected to the transmitter) were concealed from view, allowing the actor and examiner to hear, in real time, what the student was hearing through the VS ear pieces and so ensure consistent use of the equipment by the SP. The station was replicated across three different OSCE sites with 10 simultaneous circuits, and thus 10 VSs were in use at any one time. This set-up was very similar to other physical examination stations in the OSCE.

Figure 2. OSCE station set-up.

Mark scheme for respiratory station using VS

We based the mark scheme on a standard respiratory examination station marking scheme used at Imperial College, which routinely assesses communication, consultation, professionalism, interpretation of data and diagnostic ability; criteria for correctly identifying the abnormal sound and for giving the correct diagnosis were included.

OSCE – questionnaires

The questionnaire was based on the themes that emerged from the semi-structured interviews conducted by Himanshu Bhatt and Anju Verma. Separate questionnaires were created for students, examiners and SPs. Each contained questions using five-point Likert responses, multiple choice questions and space inviting free-text responses. Questionnaires were distributed, completed and collected at the end of the third year OSCE to maximise return rates.

Exam scores

Exam scores were collated in a standard manner by the Undergraduate Medical Office at Imperial College, as for all other stations in the OSCE.

Data analyses

Responses from completed questionnaires were collated in an Excel spreadsheet. Multiple choice and Likert-scale questions were analysed using descriptive statistics. Himanshu Bhatt and Anju Verma individually coded free-text responses from the questionnaires using a coding framework (Creswell Citation1998; Strauss & Corbin Citation1990). They then met to conduct between-coder comparisons to assess inter-rater reliability. Paul Booton and Roger Kneebone individually coded samples of free-text responses. All four authors then met and, after extensive discussion, reached agreement on the definition of coding categories and themes.

OSCE scores in the respiratory (RS) station (using the VS) were compared with scores at the breast (Br) examination, cardiovascular (CVS) examination and lower limb (LL) examination stations. These stations were chosen for comparison as they all involved physical examination.

Chi square or Fisher's exact test (Altman Citation1991) was used to compare categorical variables, such as the proportions of students achieving the maximum score, passing (excluding the top score), attaining the borderline score and failing. The same method was used to compare student and examiner responses to identical questions. Cronbach's alpha was calculated to assess reliability and internal consistency (Cohen Citation1992). Stata version 10 (StataCorp, College Station, TX, USA) was used for all analyses.
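As a minimal sketch of this style of comparison (the study itself used Stata 10; the code below uses Python with scipy instead, and every count shown is a hypothetical placeholder rather than the study data), score-band proportions at two stations could be compared as follows:

```python
# Illustrative sketch only: the study used Stata 10. This reproduces the
# style of analysis in Python/scipy; all counts are hypothetical
# placeholders, NOT the study data.
from scipy.stats import chi2_contingency, fisher_exact

# Rows: score bands (maximum score, pass excluding maximum, borderline, fail).
# Columns: respiratory (VS) station vs. one comparison station.
table = [
    [30, 28],
    [250, 255],
    [50, 47],
    [28, 28],
]

# Chi-square test across all four score bands at once.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

# Fisher's exact test on a single 2x2 split (failed vs. did not fail),
# preferred when expected cell counts are small.
fail_table = [[28, 330], [28, 330]]
odds_ratio, p_exact = fisher_exact(fail_table)
print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_exact:.3f}")
```

Fisher's exact test is generally preferred over the chi-square test when expected cell counts are small (conventionally below five).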

Results

Of the questionnaires distributed, 286 were returned by medical students (79.9% return), 17 by examiners (89.5% return) and 10 by SPs (100% return).

Student questionnaire responses

When ‘compared to their usual stethoscope’, 40% of student respondents found no difference in using the VS and 69% reported that it was easier to identify sounds. 68% reported that examination with the VS was very or fairly realistic when ‘compared with examining a real patient’. 48% of respondents did not feel that using the VS ‘changed my examination technique’, with a further 19% reporting a better examination technique. 76% of respondents were not aware that sounds were also being played over a loudspeaker (Table 1).

Table 1.  Student questionnaire responses

Thematic analysis of the written comments revealed five themes: ease of use, sound quality, clinical examination, realism and students’ expectations (Table 2). Representative student comments within these themes are included.

Table 2.  Thematic analysis of free text comments – students

Examiner questionnaire responses

When ‘compared to their usual stethoscope’, 41% of examiner respondents found no difference in using the VS and 47% reported no difference in identifying sounds. 41% of respondents felt the sound clarity was the same as their usual stethoscope and 47% reported it as better. ‘Compared to other OSCE stations’, 53% of examiner responses indicated that using the VS made the station feel more like a real patient examination (Table 3). The examiner free-text responses were mainly about realism and station set-up: ‘more like a real patient with signs’, ‘useful as we can hear as the student is auscultating’.

Table 3.  Examiner questionnaire responses

SP questionnaire responses

All SPs (actors) reported receiving appropriate training in using the VS; 40% reported that using the VS ‘made my role more like a real patient’ and 60% felt that the VS helped their performance as an actor. 70% of SP respondents found the loudspeakers ‘bettered’ their use of the VS and 80% reported that the loudspeakers made standardising their role ‘easier’.

Comparison of student and examiner responses to identical questions

There was no difference between student and examiner opinions regarding the ease of use of the VS and the identification of sounds. A greater proportion of student respondents than examiner respondents reported better sound clarity, and a greater proportion of examiner respondents than student respondents found the station more like a real patient encounter (Table 4).

Table 4.  Comparison of student and examiner responses to the same questions

Student exam results/performance data

Compared with the respiratory station, there was no evidence of a difference at any of the other physical examination stations in the proportions of students achieving the maximum score, passing (excluding the maximum score), attaining the borderline pass mark or failing (Table 5).

Table 5.  Exam score comparisons across four stations

Of the 358 students, 266 (74%) both identified the abnormal sound produced by the VS and gave the correct diagnosis. Of those who correctly identified the abnormal sound (294 of 358), 90% also gave the correct diagnosis. Of all the students who gave the correct diagnosis (315 of 358), 84% also identified the abnormal sound correctly.
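These three figures are mutually consistent, since the 266 students who got both elements right appear in both numerators:

\[
\frac{266}{294} = 0.905 \approx 90\%, \qquad \frac{266}{315} = 0.844 \approx 84\%.
\]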

There was very strong correlation between student scores in the respiratory station and the overall scores (Spearman's correlation coefficient 0.97, p < 0.001), and strong correlation between students passing the respiratory station and passing overall (Spearman's correlation coefficient 0.72, p < 0.001).
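These correlations are rank correlations between paired per-student scores. A minimal sketch in Python (the score vectors shown are hypothetical, not the study data):

```python
# Illustrative only: Spearman rank correlation between per-student
# respiratory-station scores and overall OSCE scores (hypothetical data).
from scipy.stats import spearmanr

respiratory = [18, 22, 15, 25, 20, 17, 23, 19]        # hypothetical station scores
overall = [210, 245, 190, 260, 230, 205, 250, 220]    # hypothetical overall totals
rho, p_value = spearmanr(respiratory, overall)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.4f}")
```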

The overall reliability of the 12 station OSCE as measured by Cronbach's alpha was 0.65.
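For reference, Cronbach's alpha for a k-station examination (here k = 12) is computed from the variance of scores at each station and the variance of the total score:

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_{\mathrm{total}}^2}\right)
\]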

Technical appraisal

All ten VSs ran successfully throughout the OSCE. No problems were reported by SPs or examiners in using the VS or any associated equipment in the exam station. Contingency plans for device failure did not need to be activated. We recharged the VSs over the lunch break as a precaution, as we were uncertain of battery life in this setting.

Discussion

Main findings of the study

This study describes the integration of a novel auscultatory simulation within an OSCE setting, with minimal disruption to the candidate's flow of examination. The VS reliably produced consistent simulated sounds and there were no episodes of VS failure or malfunction. The VS was well received by students, examiners and SPs (actors). Its use did not affect student exam scores when compared with other OSCE stations involving physical examination, and scores showed strong correlation with overall OSCE scores. The majority of student respondents reported that using the VS imposed no change on their examination technique, and two-thirds felt that using the VS conveyed a sense of realism.

What is already known

Although simulation technology has been incorporated into assessment settings (Dillon et al. Citation2004; Hatala et al. Citation2005; Hatala et al. Citation2008), there are no reports of seamless integration of simulated auscultatory sounds into a standard physical examination where the SP is in control of a device to ‘produce’ the clinical abnormality. However, it has been shown that assessment of skills within a relevant clinical context is possible in operating theatre settings (Black et al. Citation2006) and during performance of certain procedures (Kneebone et al. Citation2002; Kneebone et al. Citation2005). It is imperative that the acquisition of clinical skills is not segregated from its clinical context or oversimplified (Kneebone Citation2009). We have tried in this study to adhere to these important principles by sampling multiple aspects of a real clinical encounter – communication, professionalism, identification of the clinical abnormality, interpretation of data and accurate diagnostic ability.

A recent review of simulation-based medical education research outlines 12 features and best practice points for using ‘medical simulation technology to maximum educational benefit’ (McGaghie et al. Citation2010). Our study, based on assessment, incorporates the following: outcome measurement, simulation fidelity and high stakes testing. Further ‘best practice points’ could be applied by using the VS in a teaching setting, allowing for feedback, curriculum integration, deliberate practice, skill acquisition, mastery learning and instructor training. Extending the approach to postgraduate learners would allow for transfer to practice and team training.

Strengths and limitations of the study

The VS is highly practicable: it was straightforward to train actors (SPs) and staff in its use, and it worked flawlessly in a large-scale, real-life exam. The positive comments from both students and examiners support the quality of the simulation. The VS also worked well within the time constraints of the exam. The VSs were recharged during the OSCE lunch break, but we are not sure whether this was necessary.

Incorporating the VS in a station contributes to face validity as it allows the candidate/trainee to hear abnormal breath (or other) sounds just as they would in a normal clinical examination. In addition, we have shown improved construct validity as the presence of ‘pathology’ increases the number of learning objectives that the station is able to test.

All students heard the same simulated abnormality. This consistency should improve the reliability of the VS station over a real patient station, as it removes case variability; however, the examination set-up did not allow us to test this.

A small number of students gave the correct overall diagnosis for the OSCE station (‘asthma’) but failed to identify the auscultatory findings correctly. Possible explanations include the presence of other cues in the station, such as the history and a peak flow chart; students guessing the sound; and students finding ways of communicating with each other between circuits.

Ours was not a case-control study; we did not compare students exposed to the VS with those who were not. Nor did we compare students’ performance using the VS at the respiratory station against their performance at a respiratory station involving a real patient with clinical signs.

The range of use of the VS in an exam setting is potentially limited by the ability to simulate the other signs that comprise a clinical presentation. For instance, in attempting to simulate aortic stenosis, the ejection systolic murmur and sounds of carotid radiation could be recreated with the VS, but a heaving apex, precordial thrill and slow rising pulse are difficult or impossible to bring into the simulation (Cline Citation2004).

At over £1000 per Ventriloscope®, the cost implications of such a device are significant.

Nevertheless, the VS is a step forward in the development of simulation techniques. It could also be a valuable teaching tool in which audible clinical sounds can be heard over and over again without the worry of patient fatigue.

Conclusions

Appropriate integration of simulated physical signs using SPs in an examination setting can help move away from the reductionist approach to assessment that is often found in medical schools. We have shown that the VS contributes to the authenticity of a clinical simulation, bringing it closer to a real patient encounter, and that it is both consistent and practical to use in a real examination setting.

The technology we have evaluated offers a 'menu' of normal and abnormal auscultatory findings which can be integrated at will with a real patient, thereby increasing validity without sacrificing reliability in assessment. This authenticity lies at the heart of effective clinical practice, where many skills and behaviours must be interwoven.

This innovative use of hybrid simulation addresses some of the constraints around assessment of clinical skills. By locating the detection and interpretation of clinical signs within a clinician-patient encounter (rather than as a decontextualised exercise), simulation can approach the authenticity of clinical practice while overcoming some of the practical difficulties. In this way, graded levels of diagnostic challenge can be designed, tailoring assessment to the evolving needs of learners as their skills and experience develop. This has wide implications for education at undergraduate and postgraduate levels.

Acknowledgements

We thank Dr Graham Easton (Clinical Teaching Fellow, Imperial College) for his significant role in supervising the OSCE station at one of the examination sites, Miss Joanna Murray (PhD student, Imperial College) and Mr Joseph Eliahoo (Statistical Consultant, Imperial College) for support in conducting statistical analyses, Imperial undergraduate examinations office and all actors who participated. In addition, we thank Dr Paul Lecat (Medical Director, Wasson Center NEOUCOM, Ohio, USA) for expert advice on use of the VS. The research was carried out in Imperial College London, UK.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

Funding

This study was supported by a Teaching Research Grant from Imperial's Centre for Educational Development, and by the London Deanery.

Ethical approval

Ethical approval for this study was obtained from Imperial College Research Ethics Committee.

References

  • Altman DG. Practical statistics for medical research. Chapman and Hall, London 1991; 256
  • Black SA, Nestel DF, Horrocks EJ, Harrison RH, Jones N, Wetzel CM, Wolfe JH, Kneebone RL, Darzi AW. Evaluation of a framework for case development and simulated patient training for complex procedures. Simul Healthc 2006; 1(2)66–71, Available from: PM:19088579
  • Castilano A, Haller N, Goliath C, Lecat P. The ventriloscope: ‘Am I hearing things?'. Med Teach 2009; 31(3)e97–e101
  • Cline DM. Valvular emergencies. Emergency medicine: A comprehensive study guide, 6th ed., JE Tintinalli, GD Kelen, JS Stapczynski (eds). McGraw-Hill, New York 2004; 54
  • Cohen J. A power primer. Psychol Bull 1992; 112(1)155–159, Available from: PM:19565683
  • Creswell JW. Qualitative inquiry and research design. Sage Publications, Thousand Oaks, CA 1998; 47–72
  • Dauphinee WD. Assessing clinical performance. Where do we stand and what might we expect? JAMA 1995; 274(9)741–743, Available from: PM:7650829
  • Department of Health. 150 years of the annual report of the Chief Medical Officer: On the state of public health 2008. 2009. Available from: http://www.dh.gov.uk/dr_consum_dh/groups/dh_digitalassets/documents/digitalasset/dh_096231.pdf
  • Dillon GF, Boulet JR, Hawkins RE, Swanson DB. Simulations in the United States Medical Licensing Examination (USMLE). Qual Saf Health Care 2004; 13(Suppl 1)i41–i45, Available from: PM:15465954
  • Gordon MS, Ewy GA, DeLeon AC, Jr, Waugh RA, Felner JM, Forker AD, Gessner IH, Mayer JW, Patterson D. “Harvey,” the cardiology patient simulator: Pilot studies on teaching effectiveness. Am J Cardiol 1980a; 45(4)791–796, Available from: PM:7361670
  • Gordon MS, Ewy GA, Felner JM, Forker AD, Gessner I, McGuire C, Mayer JW, Patterson D, Sajid A, Waugh RA. Teaching bedside cardiologic examination skills using “Harvey”, the cardiology patient simulator. Med Clin North Am 1980b; 64(2)305–313, Available from: PM:6155573
  • Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979; 13(1)41–54, Available from: PM:763183
  • Hatala R, Issenberg SB, Kassen B, Cole G, Bacchus CM, Scalese RJ. Assessing cardiac physical examination skills using simulation technology and real patients: A comparison study. Med Educ 2008; 42(6)628–636
  • Hatala R, Kassen BO, Nishikawa J, Cole G, Issenberg SB. Incorporating simulation technology in a Canadian internal medicine specialty examination: A descriptive report. Acad Med 2005; 80(6)554–556, Available from: PM:15917358
  • Houck WA, Soares-Welch CV, Montori VM, Li JT. Learning the thyroid examination – A multimodality intervention for internal medicine residents. Teach Learn Med 2002; 14(1)24–28, Available from: PM:11865745
  • Issenberg SB, McGaghie WC, Gordon DL, Symes S, Petrusa ER, Hart IR, Harden RM. Effectiveness of a cardiology review course for internal medicine residents using simulation technology and deliberate practice. Teach Learn Med 2002; 14(4)223–228, Available from: PM:12395483
  • Issenberg SB, McGaghie WC, Hart IR, Mayer JW, Felner JM, Petrusa ER, Waugh RA, Brown DD, Safford RR, Gessner IH, et al. Simulation technology for health care professional skills training and assessment. JAMA 1999; 282(9)861–866, Available from: PM:10478693
  • Issenberg SB, McGaghie WC, Petrusa ER, Lee GD, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005; 27(1)10–28, Available from: PM:16147767
  • Kneebone R. Perspective: Simulation and transformational change: The paradox of expertise. Acad Med 2009; 84(7)954–957, Available from: PM:19550196
  • Kneebone R, Kidd J, Nestel D, Asvall S, Paraskeva P, Darzi A. An innovative model for teaching and learning clinical procedures. Med Educ 2002; 36(7)628–634, Available from: PM:12109984
  • Kneebone R, Nestel D, Wetzel C, Black S, Jacklin R, Aggarwal R, Yadollahi F, Wolfe J, Vincent C, Darzi A. The human face of simulation: Patient-focused simulation training. Acad Med 2006; 81(10)919–924, Available from: PM:16985358
  • Kneebone RL, Kidd J, Nestel D, Barnet A, Lo B, King R, Yang GZ, Brown R. Blurring the boundaries: Scenario-based simulation in a clinical setting. Med Educ 2005; 39(6)580–587, Available from: PM:15910434
  • Lecat P. Lecat's Ventriloscope website. Available from: http://www.ventriloscope.com/ [Accessed 13 February 2010]
  • Lieberman SA, Frye AW, Litwins SD, Rasmusson KA, Boulet JR. Introduction of patient video clips into computer-based testing: Effects on item statistics and reliability estimates. Acad Med 2003; 78(Suppl 10)S48–S51, Available from: PM:14557094
  • Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees. A comparison of diagnostic proficiency. JAMA 1997; 278(9)717–722, Available from: PM:9286830
  • McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010; 44(1)50–63, Available from: PM:20078756
  • Millos RT, Gordon DL, Issenberg SB, Reynolds PS, Lewis SL, McGaghie WC, Petrusa ER. Development of a reliable multimedia, computer-based measure of clinical skills in bedside neurology. Acad Med 2003; 78(Suppl 10)S52–S54, Available from: PM:14557095
  • Morgan PJ, Cleave-Hogg D. A worldwide survey of the use of simulation in anesthesia. Can J Anaesth 2002; 49(7)659–662, Available from: PM:12193481
  • Newble DI, Swanson DB. Psychometric characteristics of the objective structured clinical examination. Med Educ 1988; 22(4)325–334, Available from: PM:3173161
  • Ozuah PO, Curtis J, Dinkevich E. Physical examination skills of US and international medical graduates. JAMA 2001; 286(9)1021, Available from: PM:11559281
  • Peitzman SJ, McKinley D, Curtis M, Burdick W, Whelan G. Performance of international medical graduates in techniques of physical examination, with a comparison of US citizens and non-US citizens. Acad Med 2000; 75(Suppl 10)S115–S117, Available from: PM:11031193
  • Strauss AL, Corbin J. Basics of qualitative research. Sage Publications, Newbury Park, CA 1990
  • Weller JM. Simulation in undergraduate medical education: Bridging the gap between theory and practice. Med Educ 2004; 38(1)32–38, Available from: PM:14962024
