Web Paper

Computer-based testing of the modified essay question: the Singapore experience

Erle Chuen-Hian Lim, Raymond Chee-Seong Seet, Vernon M. S. Oh, Boon-Lock Chia, Marion Aw, Seng-Hock Quak & Benjamin K. C. Ong
Pages e261-e268 | Published online: 03 Jul 2009

Abstract

Background: The modified essay question (MEQ), featuring an evolving case scenario, tests a candidate's problem-solving and reasoning ability, rather than mere factual recall. Although it is traditionally conducted as a pen-and-paper examination, our university has run the MEQ using computer-based testing (CBT) since 2003.

Aims: We describe our experience with running the MEQ examination using the IVLE, or integrated virtual learning environment (https://ivle.nus.edu.sg), provide a blueprint for universities intending to conduct computer-based testing of the MEQ, and detail how our MEQ examination has evolved since its inception.

Methods: An MEQ committee, comprising specialists in key disciplines from the departments of Medicine and Paediatrics, was formed. We utilized the IVLE, developed for our university in 1998, as the online platform on which we ran the MEQ. We calculated the number of man-hours (academic and support staff) required to run the MEQ examination, using either a computer-based or pen-and-paper format.

Results: With the support of our university's information technology (IT) specialists, we have successfully run the MEQ examination online, twice a year, since 2003. Initially, we conducted the examination with short-answer questions only, but have since expanded the MEQ examination to include multiple-choice and extended matching questions. A total of 1268 man-hours was spent in preparing for, and running, the MEQ examination using CBT, compared to 236.5 man-hours to run it using a pen-and-paper format. Despite CBT being more labour-intensive, our students and staff prefer it to the pen-and-paper format.

Conclusions: The MEQ can be conducted using computer-based testing, which offers several advantages over a pen-and-paper format. We hope to increase the number of questions and incorporate audio and video files, featuring clinical vignettes, into the MEQ examination in the near future.

Introduction

The modified essay question (MEQ) was developed in the late 1960s to test the reasoning and decision-making abilities of candidates, rather than mere factual recall or understanding of principles (Feletti & Smith Citation1986). The problem-based approach of the MEQ has since been adopted by universities in many countries (Newble et al. Citation1979, Citation1981; Feletti Citation1980; Feletti & Gillies Citation1982; Irwin & Bamber Citation1982; Weinman Citation1984; Rabinowitz Citation1985, Citation1987; Stratford & Pierce-Fenn Citation1985). Featuring a case history that is sequentially revealed, the MEQ is context-dependent, as the respondent is required to recall facts and apply theoretical knowledge, relating them to the particular circumstances of the case under consideration (Knox Citation1989).

In its original form, the MEQ was pen-and-paper based, and consisted of a booklet of about 6 to 10 pages. The booklet commenced with a brief scenario outlining a clinical situation, followed by space for respondents to answer a series of questions, at the end of which they turned to the next page. Each new page began with relevant information to help the scenario unfold, and ended with questions pertaining to the new information given, continuing in this manner until the final page was reached. In the process of answering the MEQ, the respondent was taken through the steps of a consultation (Figure 1), i.e. history taking, physical examination, interpretation of investigations and finally clinical management and communications (Knox Citation1989).

Figure 1. Algorithm showing the sequence in a modified essay question (MEQ), which features a gradually evolving scenario. In each section, clinical information or data is given, at the end of which the respondent answers short questions before proceeding to the next section.

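The sequential-reveal flow in Figure 1 can be modelled compactly in code. The sketch below is our illustration, not part of the original paper or of IVLE; the Section class and run_meq function are illustrative names only.

```python
# Minimal sketch of the MEQ sequential-reveal flow in Figure 1 (our
# illustration; not from the paper or IVLE). Each section's information
# appears only after the previous section's answers have been locked in.

from dataclasses import dataclass

@dataclass
class Section:
    information: str      # scenario text, clinical data or image caption
    questions: list[str]  # short questions on this section's information

def run_meq(sections: list[Section]) -> dict[int, list[str]]:
    answers: dict[int, list[str]] = {}
    for i, section in enumerate(sections):
        print(section.information)
        # Answers are recorded before the next section is revealed, so
        # later information cannot clue earlier questions.
        answers[i] = [input(q + " ") for q in section.questions]
    return answers
```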

The MEQ allows candidates to be tested on data interpretation, with numerical data, tracings or graphs printed in the question booklet, or images such as clinical photographs or radiographs displayed or projected onto a screen in the examination room (Feletti & Engel Citation1980). Recently, Rabinowitz has described the successful incorporation of audiovisual data, with use of videotaped vignettes of patient-doctor consultations, into the MEQ (Rabinowitz Citation1985).

As information provided on a subsequent page could inadvertently provide clues to the questions on the preceding page, it is essential for each new item of information to be set on a different page, and for candidates to be prevented from turning the page until they have answered all the questions on it (Knox Citation1989). One way of enforcing this is to physically separate the “data” from the question booklet, with information provided piecemeal and sequentially, projected on overhead projectors only after completed answer scripts from previous sections have been surrendered (e.g. placed on the floor beside the desk) or collected by invigilators (The Board of Censors of the Royal College of General Practitioners Citation1971). Though effective as a security measure, this method is time-consuming, labour-intensive, and disruptive to candidates sitting the examination. Moreover, variability in viewing angle and distance means that candidates seated far from, or at awkward angles to, the screen may justifiably complain of being disadvantaged by poor image quality.

In this article, we share our experience in conducting computer-based testing of the MEQ, and detail the changes the examination has undergone in the last 3 years at our university.

The MEQ: applicability, reliability and validity

The MEQ was developed by Hodgkin and Knox for the examination of the Royal College of General Practitioners (Hodgkin & Knox Citation1975), and has since been used to assess general practitioner trainees at various points in their training (Knox Citation1989). It has been successfully adopted to evaluate the communication skills of pre-clinical undergraduates (Weinman Citation1984), as part of the clinical clerkship for third-year undergraduates (Rabinowitz Citation1987), and for final-year undergraduate examinations (Irwin & Bamber Citation1982). Irwin and Bamber found that the MEQ is able to measure the three levels of cognitive ability described by Buckwalter (recall or recognition of isolated information, data interpretation and problem solving) (Buckwalter et al. Citation1981), as well as the five levels of cognitive processing specified by Bloom (knowledge, comprehension, analysis, synthesis and evaluation) (Bloom Citation1956), with the caveat that examiners constructing the MEQ be mindful of the cognitive levels being assessed (Irwin & Bamber Citation1982). The MEQ has been shown to be a reliable assessment tool (Feletti Citation1980; Newble et al. Citation1981; Feletti & Gillies Citation1982), with Cronbach's reliability coefficients (alpha-60) ranging from 0.43 for psychiatric disorders to 0.90 for haematopoietic and endocrine disorders (Feletti Citation1980). A reliability coefficient of ≥0.70 is usually considered acceptable (Haher et al. Citation1999). It has likewise been shown to have content (Feletti Citation1980; Newble et al. Citation1981) and construct (Newble et al. Citation1979, Citation1981) validity, and to correlate with subsequent clinical performance (Irwin & Bamber Citation1982; Stratford & Pierce-Fenn Citation1985).

The MEQ is not without problems, however. Students and academic staff at the Newcastle Medical School in Australia, whilst acknowledging the strengths of the MEQ as an assessment tool, were dissatisfied that the institution used it as its main assessment modality (Feletti & Smith Citation1986). These criticisms of single-modality assessments were echoed in a recent study by Wilkinson & Frampton who concluded that comprehensive medical undergraduate assessments (in the form of MEQ, multiple choice and essay questions, as well as the OSCE, or objective structured clinical examination) were better predictors of clinical performance than assessments utilizing a single method of assessment alone (Wilkinson & Frampton Citation2004).

Experience at the National University of Singapore

In addition to the long and short clinical cases of the final-year Medicine examination, our university utilizes the OSCE and a multi-modality written assessment format, consisting of short and long essay questions, multiple-choice questions, as well as a clinical vignette-type question paper (Paper 3), similar to that used by the Royal College of Physicians. Each of the 30 clinical vignettes consists of a clinical image (a photograph, electrocardiogram or radiograph) and a brief clinical scenario, followed by a multiple-choice question. To add a reasoning and clinical decision-making dimension to the final-year Medicine examination, we incorporated the MEQ in 2003.

Integrated virtual learning environment, IVLE

The IVLE was introduced in 1998 as an in-house, online courseware management system, to allow academic staff to distribute course materials, schedule and administer lessons and assessments, and to interact online with undergraduate and postgraduate students. Since its inception, the IVLE has proved popular amongst staff and students. This is consistent with reports from other universities, which describe the successful application of virtual learning environments to problem-based learning (Shyu et al. Citation2004), virtual clinical rounds (Schultze-Mosgau et al. Citation2004) and distance learning (Rick et al. Citation2003).

We opted to conduct the MEQ examination using IVLE, rather than as a pen-and-paper (PNP) examination. Computer-based testing (CBT) is a popular testing modality, with large-scale professional examinations such as the United States Medical Licensing Examination (USMLE) adopting a CBT format and replacing the written PNP format in 1999 (Dillon et al. Citation2004). Conversion from PNP to CBT has not been found to affect candidate performance (Bugbee Citation1996), although those unfamiliar with computer use have reported increased anxiety during the examination (Hedl et al. Citation1973; Vrabel Citation2004). CBT offers several advantages over PNP testing, namely convenience of scheduling, incorporation of high-resolution images (Figure 2) and multimedia, the ability to score examination papers instantly (when applied to multiple-choice questions) and enhanced security (Vrabel Citation2004). Furthermore, surveys of candidates have indicated a preference for CBT over PNP (Vrabel Citation2004).

Figure 2. Excerpt of a page from MEQ on IVLE. Note the high-resolution image, which takes up from ½ to ¾ of the screen (depending on the settings). Upon clicking on the “submit” icon, a message pops up, reminding the candidate that it will not be possible to return to the previous page after submission of the answer. Note the timer on the top right-hand corner of the screen.


IVLE is eminently suitable as an online examination tool. The requirement for students to log in with their matriculation numbers and passwords ensures that only eligible candidates sit the examination. The inbuilt calendar and clock allow us to pre-set start and stop times for publication of the examination, and to set time limits for individual questions. We restricted candidates to a single attempt at the examination, though IVLE allows an unlimited number of attempts when used for teaching or formative assessment purposes.
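These settings map naturally onto a small configuration object. The sketch below is our own illustration; IVLE's actual administration interface is not described in the paper, so every field name and value here is an assumption.

```python
# Hypothetical sketch of the IVLE examination settings described above.
# IVLE's real configuration interface is not documented in the paper;
# all field names and values here are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class ExamSettings:
    start: datetime                   # pre-set publication (start) time
    stop: datetime                    # pre-set stop time
    max_attempts: int = 1             # a single attempt for summative use
    minutes_per_question: int = 30    # stipulated 30-minute limit (see text)
    allow_backtracking: bool = False  # no return to a submitted section

settings = ExamSettings(
    start=datetime(2006, 5, 2, 9, 0),  # illustrative date and time
    stop=datetime(2006, 5, 2, 13, 0),
)
```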

Setting modified essay questions

We formed an MEQ committee with specialists from the major disciplines of Paediatrics and Adult Medicine. Our committee met weekly, crafting 4 MEQs, one in each core system. Only one question was required for the examination; the final choice was made by the independent chief examiner after all essay questions had been selected, to ensure that the essay-type questions as a whole (traditional short and long essays, as well as the MEQ) covered all core subjects involving the major systems.

Potential problems with CBT: Information technology (IT) support

We scrutinized the IT infrastructure in tandem with the formulation of the MEQs, setting up meetings with the IVLE IT support team to confirm that it could, indeed, support an online examination for 230 final-year medical students, all on the same day. At these meetings, we sought to pre-empt potential errors “on the ground” by going over all practical aspects of running the examination on IVLE. Several issues became apparent: hardware (large numbers of personal computers, housed in secure facilities, to accommodate our candidates), examination security, the capacity of IVLE to support the examination, and the need for contingency plans in the event of an electrical or IT network failure.

Hardware issues

The large number of candidates in each cohort (between 200 and 250) necessitated the use of an examination centre with the requisite number of personal computers for all candidates, including back-up machines. Although our university boasts 5 large computer clusters, none housed more than 120 computers in one location. Fortunately, our main computer centre features 2 laboratories, with capacities of 100 and 80 personal computers respectively, sited in adjoining rooms. We thus divided the candidates into 2 shifts, each with between 110 and 120 candidates. We created 40 guest accounts, in the event that candidates failed to log in with their stipulated accounts, or inadvertently logged themselves out of the system during the examination.

Security issues

Security issues thus became paramount. We were well aware that our IT-savvy students had the capability to use their high-tech accoutrements to capture images from the computer screen onto their mobile phones and disseminate them via multimedia messaging service (MMS) or email. In addition, they could use mobile phones, BlackBerry or other wireless devices to email their friends, or surf the internet to trawl for answers or “leak” questions, rendering our examination an “open book” one, and insecure to boot.

During our brainstorming sessions, we resolved to pre-empt security leaks by plugging potential loopholes. We made arrangements for supervised quarantine of the second batch of candidates in nearby facilities, 1 hour before their scheduled examination, barring all digital image-capture devices such as cameras and mobile phones, digital storage media such as thumb drives and hard disks, and communication devices, including alphanumeric pagers and BlackBerry devices, from the examination hall. To prevent our candidates from utilizing the internet for email and surfing purposes, we disabled direct access to the internet, requiring that they log on to the MEQ on IVLE using a proxy server (Virtual Private Network, VPN) in each computer laboratory. Thus, whilst the IT service needs of the rest of the university were met (i.e. staff and students were able to fully utilize IVLE unhindered), we managed to ensure the security of our MEQ examination. We were heartened and reassured by the work of Kreiter et al. (Citation2003) showing that conducting a computer-based examination in shifts did not compromise the security of the examination.
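In effect, the lockdown reduced each laboratory machine's reachable network to a single destination. As a toy illustration (our construction; the university's actual proxy and VPN configuration is not published), the policy amounts to a one-host allowlist:

```python
# Toy illustration of the single-destination network policy described
# above (our sketch; the actual VPN/proxy configuration is not published).

from urllib.parse import urlparse

ALLOWED_HOSTS = {"ivle.nus.edu.sg"}  # only the examination platform

def is_permitted(url: str) -> bool:
    """Permit traffic to IVLE only; email and general surfing are blocked."""
    return urlparse(url).hostname in ALLOWED_HOSTS

assert is_permitted("https://ivle.nus.edu.sg/exam/meq")
assert not is_permitted("https://webmail.example.com/inbox")
```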

Capacity of IVLE to handle the CBT load

Concerns were raised about the capacity of IVLE to handle a computer-based examination featuring high-resolution images, in which at least 120 students would be simultaneously logged in. To compound our worries, it was not possible to prevent staff and students from other faculties from utilizing IVLE, in view of scheduling conflicts. We co-opted staff from the Centre for Instructional Technology (CIT), who managed IVLE, to ensure that our IT needs were met, and that the demands of our online examination did not conflict with the IT service needs of the rest of the university.

It was determined that multimedia files (still images of clinical slides, radiographs and electrocardiograms) should not exceed 5 megabytes each, to avoid overloading the system. We envisaged that exceeding the capacity of IVLE could either slow the system down or, worse, cause network failure. The file-size limit did not pose a problem initially, as we had not intended to use video (e.g. of eye movements or movement disorders) or audio (e.g. of heart or breath sounds) snippets in our first few runs of the examination.
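The 5-megabyte ceiling is straightforward to enforce before a file ever reaches the system. A minimal sketch follows (ours; the paper does not describe how IVLE itself validated uploads):

```python
# Enforce the 5 MB ceiling on MEQ media files (still images of clinical
# slides, radiographs, electrocardiograms). Our sketch; the paper does
# not describe IVLE's own upload validation.

import os

MAX_BYTES = 5 * 1024 * 1024  # 5 megabytes

def check_media_file(path: str) -> None:
    size = os.path.getsize(path)
    if size > MAX_BYTES:
        raise ValueError(
            f"{path}: {size / 1_048_576:.1f} MB exceeds the 5 MB limit; "
            "oversized files risk slowing or overloading the system"
        )
```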

Dealing with the unexpected: contingency plans

It was imperative that we create, model and test contingency plans for the unexpected. We devised back-up plans for 3 major disaster scenarios, and had IT support staff on hand in each examination hall to troubleshoot both major and minor events. The issue that was paramount to us was the integrity of candidate answers. We worked out fail-safe mechanisms within IVLE to ensure that answers keyed in were automatically saved and backed up, even in the event of time-outs (at the end of the stipulated 30-minute duration for each question, after which it became impossible for candidates to continue keying in answers) or electrical/network failures.
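The fail-safe amounts to continuous autosave plus a hard cut-off: every draft is persisted as it is typed, and the time-out merely stops further input without discarding what has been saved. A sketch under those assumptions (function names are ours, not IVLE's):

```python
# Sketch of the answer fail-safe described above: drafts are persisted on
# every change, and the 30-minute cut-off locks input without losing the
# last saved draft. Names are illustrative; IVLE's internals are not public.

import json
import time

QUESTION_SECONDS = 30 * 60  # stipulated 30-minute duration per question

def autosave(candidate_id: str, question: int, draft: str) -> None:
    """Persist the current draft; called on every change, not on submit."""
    with open(f"backup_{candidate_id}_q{question}.json", "w") as f:
        json.dump({"draft": draft, "saved_at": time.time()}, f)

def input_locked(question_started_at: float) -> bool:
    """After the cut-off, no further keying is accepted; the last
    autosaved draft stands as the candidate's answer."""
    return time.time() - question_started_at >= QUESTION_SECONDS
```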

Major disasters we envisaged included electrical and network failure, either prior to (“no-go”) or during (“interrupted”) the examination. In either event, we opted to revert to traditional PNP testing. For a “no-go” event, we booked a lecture theatre capable of accommodating all candidates and prepared pre-printed question and answer booklets, planning to project the questions and data image slides onto a large screen at the front of the lecture theatre. Although we were aware of the disadvantages in terms of viewing distances and angles, which could limit visibility for those seated far away from, or at awkward angles to, the screen, we were certain that this contingency plan would work, as we had been conducting our Paper 3 examination in this way for the past 2 decades.

An “interrupted” examination would, we realized, prove more complicated, as the candidates would have seen some part, if not most, of the examination at the point of interruption. Our contingency plans, we felt, would have to vary depending on the stage at which the interruption occurred. We had already put systems in place to ensure that all submitted answers would be saved, as would any answers as yet unsubmitted but on-screen at the time of the interruption. Two options were possible: (1) stop the examination at whatever point the interruption occurred and continue with a PNP examination from that point on, or (2) conduct a PNP examination featuring 1 or 2 back-up questions that the candidates had not seen.

We recognized an important difference between the two formats: a CBT examination allows students to proceed at their own pace, a luxury impossible in a PNP examination. We were heartened, however, that each candidate's answers would automatically be captured, no matter the stage at which the interruption occurred. As such, even if a candidate attempted to re-answer already-submitted (and saved) questions in PNP format, we would accept only the hard-copy answers written after the interruption, merging them with the already-saved answers from the interrupted examination. Whilst this would create a multitude of problems for the invigilators and those marking examination scripts, we were confident that the security and confidentiality of the examination would not be breached. The second option (a PNP examination with entirely new questions) addressed a residual concern about security and confidentiality, since we could not prevent our candidates from discussing the questions they had already seen on their way to the new examination centre.

Mounting MEQs on IVLE

We uploaded our completed MEQs online, although only the final choice would be published on the day of the examination. To minimize security breaches, only one of us was tasked with mounting the questions on IVLE, with the aid of a single IT officer. Though time-consuming, uploading the images onto the data bank and selecting the options for conducting the examination on IVLE (a single attempt; preventing candidates from returning to a previous section after submitting their answers and proceeding to the next; setting a time limit) were fairly intuitive.

We uploaded images of sufficient size and clarity (Figure 2) to fully utilize the liquid crystal display (LCD) screens, in a bid to address the concerns of staff and students about image quality in this high-stakes examination. The allocation of marks was clearly indicated to candidates, and a countdown timer and clear pagination (Page _ of _ pages) were incorporated to give candidates a clear indication of their progress during the examination (Figure 2). Readers may attempt a “trial run” of the MEQ examination by accessing https://ivle.nus.edu.sg, logging on as “gstjournal” with the password “9zd491” in the “guest” domain, and clicking on the assessment icon under “MEQTRIALRUN/MS”. Finally, we worked out how to download candidate answers as an Excel file and to sort and format them reliably into individual answer sheets, using a database management programme, Microsoft Access®.
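The authors did this last step with an Excel download and Microsoft Access; the equivalent grouping is easy to sketch in Python, assuming (our assumption) a CSV export with one row per candidate-question pair and columns matric_no, question and answer.

```python
# Equivalent of the authors' Excel/Access step, sketched in Python: group
# a bulk export of answers into one answer sheet per candidate. The CSV
# column names (matric_no, question, answer) are our assumption.

import csv
from collections import defaultdict

def split_answer_sheets(export_csv: str) -> None:
    sheets: dict[str, list[tuple[str, str]]] = defaultdict(list)
    with open(export_csv, newline="") as f:
        for row in csv.DictReader(f):
            sheets[row["matric_no"]].append((row["question"], row["answer"]))
    for matric_no, rows in sheets.items():
        with open(f"answersheet_{matric_no}.txt", "w") as out:
            for question, answer in sorted(rows):
                out.write(f"Q{question}: {answer}\n")
```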

Preparations for D-day

We reconnoitred the computer laboratories, to ensure that the physical set-up was conducive to conducting the examination. Our IT support staff combed through computer and systems specifications, to ensure that our modifications to IVLE and our requirement to log in via the VPN proxy server did not affect the running of IVLE for any of its users (including our candidates). Using a “dummy” MEQ, we conducted mock examinations using IVLE and our VPN proxy server.

As the computer laboratories were in regular use until the afternoon of the day before the examination, we were unable to gain access to the personal computers in our designated laboratories until then. As a result, our IT support staff spent a good part of the night before the examination going through 180 computers, ensuring that they were free of viruses, in good working order, and set to access IVLE through the designated VPN proxy server.

D-day: conducting the MEQ examination using CBT

On the designated day, we assigned academics and staff officers as invigilators, as well as representatives from IT support, to be on hand to ensure the smooth running of the examination.

Predictably, Murphy's law was obeyed: some candidates forgot their account names or passwords in the heat of the moment, or logged out midway through the examination, requiring the services of our IT support staff and the use of our guest accounts. To ensure equity and fairness, we double-marked all scripts, providing a marking template to script markers, who were allowed to discuss ambiguities in marking at the end of the marking session.

Lessons learned from conducting the MEQ examination using CBT

Despite strict instructions to keep answers short and to the point, lest they be penalized, our candidates managed to cram an amazing amount of information into the allotted 160 characters per short answer. This tendency, for candidates to digress and supply more than the requisite number of answers, has been highlighted by Feletti & Engel (Citation1980). We had foreseen this possibility and warned our candidates that we would mark only the first relevant answer in each line, disregarding superfluous answers in the same line, a policy we assiduously complied with. In subsequent examinations, we allowed candidates to key in answers of no more than 50 characters, which helped to resolve the problem.
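Both countermeasures, the input cap and the first-answer-only rule, are mechanical enough to express in a few lines. The sketch below is our illustration; the actual marking was done by examiners against a template.

```python
# Sketch of the two countermeasures described above (our illustration;
# actual marking was done by examiners against a marking template).

MAX_CHARS = 50  # later examinations; the first run allowed 160 characters

def cap_answer(raw: str) -> str:
    """Reject over-length input rather than silently truncating it."""
    if len(raw) > MAX_CHARS:
        raise ValueError(f"answer exceeds {MAX_CHARS} characters")
    return raw

def first_answer_only(line: str) -> str:
    """Mark only the first answer in a line; superfluous answers
    separated by semicolons or commas are disregarded."""
    for sep in (";", ","):
        line = line.split(sep, 1)[0]
    return line.strip()

assert first_answer_only("myasthenia gravis; botulism, GBS") == "myasthenia gravis"
```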

Evolution of the MEQ examination

Since its inception in 2003, our MEQ has evolved. In addition to restricting candidate answers to 50 characters, which decreased the time and effort spent by academic staff in interpreting answers and awarding marks, we have incorporated multiple-choice and extended matching questions into the MEQ examination, providing our candidates with a clearer indication of what is expected of them and rendering the marking of examination scripts easier and more objective. We have also increased the number of questions to 2, and plans are afoot to increase the total number of MEQs to 5 and to increase the weightage of the paper in our next final-year examination.

As incorporation of audiovisual vignettes will add verisimilitude to the MEQ examination (Rabinowitz Citation1985), we are working towards expanding the capacity of IVLE to allow this.

MEQ examination using CBT: is it worth the effort?

Conducting the MEQ examination using CBT is, doubtless, an arduous effort. Table 1 summarizes the manpower estimates (in man-hours) required to host the examination using CBT and using PNP testing. Although CBT is more labour-intensive than PNP testing, informal feedback from academic staff involved in setting and marking the MEQ using CBT has generally been positive. A total of 114 of our 213 final-year medical students (53.5%) completed a survey at the end of their examinations, which revealed that 55% favoured the CBT format for the MEQ examination, citing better clarity of images and neater (typed) answer scripts as the reasons for their preference. Interestingly, 80% preferred the CBT format for the Paper 3 examination (Lim et al. Citation2006).

Table 1A.  Calculation of man-hours required to run MEQ examination using IVLE

Table 1B.  Calculation of man-hours required to run MEQ examination using pen-and-paper testing

Conclusions

The MEQ is an effective modality by which medical schools can assess the reasoning and decision-making abilities of candidates. Although it has traditionally been conducted using PNP testing, our university has managed to effectively conduct it using CBT. With recent changes to the MEQ format, mixing short answers with multiple choice and extended matching questions, negative feedback from staff about the labour-intensive nature of conducting and marking the MEQ can now be countered. Admittedly, the conduct of the MEQ examination using CBT is labour-intensive, requiring intensive IT support. As our MEQ examination continues to evolve, we are confident that both staff and students will express support for and confidence in the computer-based testing method.

Acknowledgements

The authors would like to acknowledge the invaluable contributions of Ms Hui-Min Lim, Mr Za’aba A Rahman, Ms Fern-Eke Leo, the late Assoc Prof Rajiv Suri and the staff at CIT.

All authors are employed by the Yong Loo Lin School of Medicine, National University of Singapore.

Additional information

Notes on contributors

Erle Chuen-Hian Lim

ERLE CHUEN-HIAN LIM is consultant neurologist and Associate Professor of Medicine and Assistant Dean (Clinical, Admissions and Student Affairs).

Raymond Chee-Seong Seet

RAYMOND CHEE-SEONG SEET is Associate Consultant in Neurology and Assistant Professor of Medicine.

Vernon M. S. Oh

VERNON MIN-SEN OH is Senior Consultant in General Medicine and Clinical Pharmacology and Professor of Medicine.

Boon-Lock Chia

BOON-LOCK CHIA is Senior Consultant in Cardiology and Professor of Medicine.

Marion Aw

MARION AW is Consultant Gastroenterologist and Assistant Professor of Paediatrics.

Seng-Hock Quak

SENG-HOCK QUAK is Senior Consultant Gastroenterologist and Professor of Paediatrics.

Benjamin K. C. Ong

BENJAMIN KIAN-CHUNG ONG is Senior Consultant Neurologist and Associate Professor of Medicine.

References

  • Bloom BS. A Taxonomy of Educational Objectives, Handbook 1. Longman, London 1956
  • Buckwalter JA, Schumacher R, Albright JP, Cooper RR. Use of an educational taxonomy for evaluation of cognitive performance. J Med Educ 1981; 56: 115–121
  • Bugbee AC. The equivalence of paper-and-pencil and computer-based testing. J Res Comput Educ 1996; 28: 282–299
  • Dillon GF, Boulet JR, Hawkins RE, Swanson DB. Simulations in the United States Medical Licensing Examination (USMLE). Qual Saf Health Care 2004; 13: i41–i45
  • Feletti GI. Reliability and validity studies on modified essay questions. J Med Educ 1980; 55: 933–941
  • Feletti GI, Engel CE. The modified essay question for testing problem-solving skills. Med J Australia 1980; 1: 79–80
  • Feletti GI, Gillies AH. Developing oral and written formats for evaluating clinical problem-solving by medical undergraduates. J Med Educ 1982; 57: 874–876
  • Feletti GI, Smith EK. Modified essay questions: are they worth the effort?. Med Educ 1986; 20: 126–132
  • Haher TR, Gorup JM, Shin TM, Homel P, Merola AA, Grogan DP, Pugh L, Lowe TG, Murray M. Results of the Scoliosis Research Society instrument for evaluation of surgical outcome in adolescent idiopathic scoliosis. A multicenter study of 244 patients. Spine 1999; 24: 1435–1440
  • Hedl JJ, Jr, O'Neil HF, Jr, Hansen DN. Affective reactions toward computer-based intelligence testing. J Consult Clin Psychol 1973; 40: 217–222
  • Hodgkin K, Knox JD. Problem Centred Learning. Churchill Livingstone, London 1975
  • Irwin WG, Bamber JH. The cognitive structure of the modified essay question. Med Educ 1982; 16: 326–331
  • Knox JD. What is … a modified essay question?. Med Teach 1989; 11: 51–57
  • Kreiter C, Peterson MW, Ferguson K, Elliott S. The effects of testing in shifts on a clinical in-course computerized exam. Med Educ 2003; 37: 202–204
  • Lim EC, Ong BK, Wilder-Smith EP, Seet RC. Computer-based versus pen-and-paper testing: Students’ perceptions. Ann Acad Med, Singapore 2006; 35: 599–603
  • Newble DI, Baxter A, Elmslie RG. A comparison of multiple-choice tests and free-response tests in examinations of clinical competence. Med Educ 1979; 13: 263–268
  • Newble DI, Hoare J, Elmslie RG. The validity and reliability of a new examination of the clinical competence of medical students. Med Educ 1981; 15: 46–52
  • Rabinowitz HK. Expansion of the modified essay question into an audiovisual format. J Med Educ 1985; 60: 883–885
  • Rabinowitz HK. The modified essay question: an evaluation of its use in a family medicine clerkship. Med Educ 1987; 21: 114–118
  • Rick C, Kearns MA, Thompson NA. The reality of virtual learning for nurses in the largest integrated health care system in the nation. Nurs Admin Quart 2003; 27: 41–57
  • Schultze-Mosgau S, Thorwarth WM, Grabenbauer GG, Amann K, Zielinski T, Lochner J, Zenk J. The concept of a clinical round as a virtual, interactive web-based, e-learning model for interdisciplinary teaching. Int J Comput Dentistry 2004; 7: 253–262
  • Shyu FM, Liang YF, Hsu WT, Luh JJ, Chen HS. A problem-based e-Learning prototype system for clinical medical education. Medinfo 2004; 11: 983–987
  • Stratford P, Pierce-Fenn H. Modified essay question. Phys Therapist 1985; 65: 1075–1079
  • The Board of Censors of the Royal College of General Practitioners. The modified essay question. Proc Roy College Gen Practitioners 1971; 21: 373–376
  • Vrabel M. Computerized versus paper-and-pencil testing methods for a nursing certification examination: a review of the literature. Comput Inform Nurs 2004; 22: 94–98
  • Weinman J. A modified essay question evaluation of pre-clinical teaching of communication skills. Med Educ 1984; 18: 164–167
  • Wilkinson TJ, Frampton CM. Comprehensive undergraduate medical assessments improve prediction of clinical performance. Med Educ 2004; 38: 1111–1116
