Original Articles

Plus ça change, plus c’est pareil: Making a continued case for the use of MCQs in medical education

References

  • Albanese MA. 1993. Type K and other complex multiple-choice items: an analysis of research and item properties. Educ Meas Issues Pract. 12:28–33.
  • Albanese M, Case SM. 2016. Progress testing: critical analysis and suggested practices. Adv Health Sci Educ Theory Pract. 21:221–234.
  • Anderson J. 1979. For multiple choice questions. Med Teach. 1:37–42.
  • Bandaranayake R, Payne J, White S. 1999. Using multiple response true-false multiple choice questions. Aust N Z J Surg. 69:311–315.
  • Beullens J, Damme BV, Jaspaert H, Janssen PJ. 2002. Are extended-matching multiple-choice items appropriate for a final test in medical education? Med Teach. 24:390–395.
  • Beullens J, Struyf E, Van Damme B. 2005. Do extended matching multiple-choice questions measure clinical reasoning? Med Educ. 39:410–417.
  • Blake JM, Norman GR, Keane DR, Mueller CB, Cunnington J, Didyk N. 1996. Introducing progress testing in McMaster University’s problem-based medical curriculum: psychometric properties and effect on learning. Acad Med. 71:1002–1007.
  • Bloom B, Engelhart M, Furst E, Hill W, Krathwohl D. 1956. Taxonomy of educational objectives: the classification of educational goals. Handbook I: Cognitive domain. New York (NY): David McKay Company.
  • Case SM, Swanson DB. 2002. Constructing written test questions for the basic and clinical sciences. 3rd ed. Philadelphia (PA): National Board of Medical Examiners (NBME). [accessed 2018 May 25]. http://www.medbev.umontreal.ca/docimo/DocSource/NBME_MCQ.pdf
  • Case SM, Swanson DB, Ripkey DR. 1994. Comparison of items in five-option and extended-matching formats for assessment of diagnostic skills. Acad Med. 69:S1–S3.
  • Coderre SP, Harasym P, Mandin H, Fick G. 2004. The impact of two multiple-choice question formats on the problem-solving strategies used by novices and experts. BMC Med Educ. 4:23.
  • Cuddy MM, Young A, Gelman A, Swanson DB, Johnson DA, Dillon GF, Clauser BE. 2017. Exploring the relationships between USMLE performance and disciplinary action in practice: a validity study of score inferences from a licensure examination. Acad Med. 92:1780–1785.
  • De Champlain A, Roy M, Tian F, Qin S, Brailovsky C. 2014. Predicting family medicine specialty certification status using standardized measures for a sample of international medical graduates engaged in a practice-ready assessment pathway to provisional licensure. J Med Regul. 100:8–16.
  • De Champlain AF, Boulais AP, Dallas A. 2016. Calibrating the Medical Council of Canada’s qualifying examination part I using an integrated item response theory framework: a comparison of models and designs. J Educ Eval Health Prof. 13:6.
  • Dennick R, Wilkinson S, Purcell N. 2009. Online eAssessment: AMEE Guide no. 39. Med Teach. 31:192–206.
  • Dijksterhuis MGK, Scheele F, Schuwirth LWT, Essed GGM, Nijhuis JG, Braat DDM. 2009. Progress testing in postgraduate medical education. Med Teach. 31:e464–e468.
  • Downing SM. 1992. True-false, alternate-choice, and multiple-choice items. Educ Meas Issues Pract. 11:27–30.
  • Downing SM. 2005. The effects of violating standard item writing principles on tests and students: the consequences of using flawed test items on achievement examinations in medical education. Adv Health Sci Educ Theory Pract. 10:133–143.
  • Downing SM, Baranowski RA, Grosso LJ, Norcini JJ. 1995. Item type and cognitive ability measured: the validity evidence for multiple true-false items in medical specialty certification. Appl Meas Educ. 8:187–207.
  • Ebel RL. 1983. The practical validation of tests of ability. Educ Meas Issues Pract. 2:7–10.
  • Eijsvogels TMH, van den Brand TL, Hopman MTE. 2013. Multiple choice questions are superior to extended matching questions to identify medicine and biomedical sciences students who perform poorly. Perspect Med Educ. 2:252–263.
  • Farmer EA, Page G. 2005. A practical guide to assessing clinical decision-making skills using the key features approach. Med Educ. 39:1188–1194.
  • Frisbie DA, Ebel RL. 1972. Comparative reliabilities and validities of true-false and multiple choice tests. Paper presented at the 56th annual meeting of AERA; April 3–7; Chicago, IL.
  • Gandhi IS, Ghosh MN. 1978. A trial of multiple choice examination in pharmacology. Med Educ. 12:287–289.
  • Gierl MJ, Lai H. 2013. Evaluating the quality of medical multiple-choice items created with automated processes. Med Educ. 47:726–733.
  • Gierl MJ, Lai H, Pugh D, Touchie C, Boulais AP, De Champlain A. 2016. Evaluating the psychometric properties of generated test items. Appl Meas Educ. 29:196–210.
  • Gierl MJ, Lai H, Turner SR. 2012. Using automatic item generation to create multiple-choice test items. Med Educ. 46:757–765.
  • Haladyna TM, Downing SM. 1989. Validity of a taxonomy of multiple-choice item-writing rules. Appl Meas Educ. 2:51–78.
  • Haladyna TM, Downing SM, Rodriguez MC. 2002. A review of multiple-choice item-writing guidelines for classroom assessment. Appl Meas Educ. 15:309–333.
  • Heist BS, Gonzalo JD, Durning S, Torre D, Elnicki DM. 2014. Exploring clinical reasoning strategies and test-taking behaviors during clinical vignette style multiple-choice examinations: A mixed methods study. J Grad Med Educ. 6:709–714.
  • Hodges B. 2013. Assessment in the post-psychometric era: learning to love the subjective and collective. Med Teach. 35:564–568.
  • Holmboe E, Sherbino J, Long D, Swing S, Frank J. 2010. The role of assessment in competency-based medical education. Med Teach. 32:676–682.
  • Holmboe ES, Wang Y, Meehan TP, Tate JP, Ho SY, Starkey KS, Lipner RS. 2008. Association between maintenance of certification examination scores and quality of care for medicare beneficiaries. Arch Intern Med. 168:1396–1403.
  • Huxham GJ, Naeraa N. 1980. Is Bloom’s Taxonomy reflected in the response pattern to MCQ items? Med Educ. 14:23–26.
  • Julian E. 2005. Validity of the medical college admission test for predicting medical school performance. Acad Med. 80:910–917.
  • Kimura T. 2017. The impacts of computer adaptive testing from a variety of perspectives. J Educ Eval Health Prof. 14:12.
  • Lai H, Gierl MJ, Pugh D, Touchie C, Boulais AP, De Champlain A. 2016. Using automatic item generation to improve the quality of MCQ distractors. Teach Learn Med. 28:166–173.
  • Lievens F, Buyse T, Sackett PR. 2005. The operational validity of a video-based situational judgment test for medical college admissions: illustrating the importance of matching predictor and criterion construct domains. J Appl Psychol. 90:442–452.
  • Luckin R. 2017. Towards artificial intelligence-based assessment systems. Nat Hum Behav. 1:0028.
  • McCoubrie P. 2004. Improving the fairness of multiple-choice questions: a literature review. Med Teach. 26:709–712.
  • Miller GE. 1990. The assessment of clinical skills/competence/performance. Acad Med. 65:S63–S67.
  • Newble DI, Baxter A, Elmslie RG. 1979. A comparison of multiple-choice tests and free-response tests in examinations of clinical competence. Med Educ. 13:263–268.
  • Newble DI, Hoare J, Baxter A. 1982. Patient management problems: issues of validity. Med Educ. 16:137–142.
  • Norcini JJ, Swanson DB, Grosso LJ, Webster GD. 1985. Reliability, validity and efficiency of multiple choice question and patient management problem item formats in assessment of clinical competence. Med Educ. 19:238–247.
  • Norman G, Neville A, Blake J, Mueller B. 2010. Assessment steers learning down the right road: impact of progress testing on licensing examination performance. Med Teach. 32:496–499.
  • Page G, Bordage G, Allen T. 1995. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad Med. 70:194–201.
  • Pugh D, De Champlain A, Gierl M, Lai H, Touchie C. 2016. Using cognitive models to develop quality multiple-choice questions. Med Teach. 38:838–843.
  • Pugh D, Regehr G. 2016. Taking the sting out of assessment: is there a role for progress testing? Med Educ. 50:721–729.
  • Pugh D, De Champlain A, Gierl M, Lai H, Touchie C. Can automated item generation be used to develop high quality MCQs that assess application of knowledge? In preparation.
  • Rodriguez M. 2005. Three options are optimal for multiple-choice items: a meta-analysis of 80 years of research. Educ Meas Issues Pract. 24:3–13.
  • Roy B, Ripstein I, Perry K, Cohen B. 2016. Predictive value of grade point average (GPA), Medical College Admission Test (MCAT), internal examinations (Block) and National Board of Medical Examiners (NBME) scores on Medical Council of Canada qualifying examination part I (MCCQE-1) scores. Can Med Educ J. 7:e47–e56.
  • Schneid SD, Armour C, Park YS, Yudkowsky R, Bordage G. 2014. Reducing the number of options on multiple-choice questions: response time, psychometrics and standard setting. Med Educ. 48:1020–1027.
  • Schuwirth LW, van der Vleuten CP, Donkers HH. 1996. A closer look at cueing effects in multiple-choice questions. Med Educ. 30:44–49.
  • Schuwirth LW, Verheggen MM, van der Vleuten CP, Boshuizen HP, Dinant GJ. 2001. Do short cases elicit different thinking processes than factual knowledge questions do? Med Educ. 35:348–356.
  • Schuwirth LW, van der Vleuten CP. 2004. Different written assessment methods: what can be said about their strengths and weaknesses? Med Educ. 38:974–979.
  • Schuwirth LW, van der Vleuten CP. 2011. Programmatic assessment: from assessment of learning to assessment for learning. Med Teach. 33:478–485.
  • Skakun E, Maguire T, Cook D. 1994. Strategy choices in multiple-choice items. Acad Med. 69:S7–S9.
  • Surry LT, Torre D, Durning SJ. 2017. Exploring examinee behaviours as validity evidence for multiple-choice question examinations. Med Educ. 51:1075–1085.
  • Swanson DB, Case SM, Melnick DE, Volle RL. 1992. Impact of the USMLE step 1 on teaching and learning of the basic biomedical sciences. United States Medical Licensing Examination. Acad Med. 67:553–556.
  • Swanson DB, Ripkey DR, Case SM. 1996. Relationship between achievement in basic science coursework and performance on 1994 USMLE Step 1. 1994-95 Validity Study Group for USMLE Step 1/2 Pass/Fail Standards. Acad Med. 71:S28–S30.
  • Swanson DB, Holtzman KZ, Allbee K, Clauser BE. 2006. Psychometric characteristics and response times for content-parallel extended-matching and one-best-answer items in relation to number of options. Acad Med. 81:S52–S55.
  • Tamblyn R, Abrahamowicz M, Brailovsky C, Grand’Maison P, Lescop J, Norcini J, Girard N, Haggerty J. 1998. Association between licensing examination scores and resource use and quality of care in primary care practice. JAMA. 280:989–996.
  • Tamblyn R, Abrahamowicz M, Dauphinee W, Hanley JA, Norcini J, Girard N, Grand’Maison P, Brailovsky C. 2002. Association between licensure examination scores and practice in primary care. JAMA. 288:3019–3026.
  • Touchie C. 2010. Guidelines for the development of multiple-choice questions. Ottawa (Canada): Medical Council of Canada (MCC). [accessed 2018 May 25]. https://mcc.ca/media/Multiple-choice-question-guidelines.pdf
  • Veloski JJ, Rabinowitz HK, Robeson MR. 1993. A solution to the cueing effects of multiple choice questions: the Un-Q format. Med Educ. 27:371–375.
  • Weiss DJ, Kingsbury GG. 1984. Application of computerized adaptive testing to educational problems. J Educ Meas. 21:361–375.
  • Wendt A, Harmes JC. 2009a. Developing and evaluating innovative items for the NCLEX: part 2, item characteristics and cognitive processing. Nurse Educ. 34:109–113.
  • Wendt A, Harmes JC. 2009b. Evaluating innovative items for the NCLEX, part I: usability and pilot testing. Nurse Educ. 34:56–59.
  • Wenghofer E, Klass D, Abrahamowicz M, Dauphinee D, Jacques A, Smee S, Blackmore D, Winslade N, Reidel K, Bartman I, Tamblyn R. 2009. Doctor scores on national qualifying examinations predict quality of care in future practice. Med Educ. 43:1166–1173.
  • Zhao X, Oppler S, Dunleavy D, Kroopnick M. 2010. Validity of four approaches of using repeaters’ MCAT scores in medical school admissions to predict USMLE Step 1 total scores. Acad Med. 85:S64–S67.
