Original Research

Do medical students’ scores using different assessment instruments predict their scores in clinical reasoning using a computer-based simulation?

Pages 135-141 | Published online: 20 Feb 2015
 

Abstract

Purpose

The development of clinical problem-solving skills evolves over time and requires structured training and background knowledge. Computer-based case simulations (CCS) have been used for both teaching and assessment of clinical reasoning skills. However, previous studies examining the psychometric properties of CCS as an assessment tool have yielded conflicting results, and few studies have reported the integration of CCS into problem-based medical curricula.

Methods

This study examined the psychometric properties of CCS software (DxR Clinician) used to assess medical students (n=130) in a problem-based, integrated multisystem module (Unit IX) during the academic year 2011–2012. Internal consistency reliability of the CCS scores was estimated with Cronbach’s alpha. The relationships between students’ scores on the CCS components (clinical reasoning, diagnostic performance, and patient management) and their scores on the other end-of-unit examinations, including multiple-choice questions, short-answer questions, the objective structured clinical examination (OSCE), and real patient encounters, were analyzed using stepwise hierarchical linear regression.
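
For illustration only, the following Python sketch (not the authors’ code) shows how Cronbach’s alpha and the block-wise change in R² of a hierarchical regression might be computed with pandas and statsmodels; the file name and all column names (clinical_reasoning, osce, mcq, and so on) are hypothetical.

# Illustrative sketch only; hypothetical data layout, not the authors' analysis code.
import pandas as pd
import statsmodels.api as sm

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha for a DataFrame whose columns are item (component) scores.
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# One row per student; column names are assumptions for this sketch.
df = pd.read_csv("unit_ix_scores.csv")

# Internal consistency of the three CCS component scores.
alpha = cronbach_alpha(df[["clinical_reasoning", "diagnostic_performance",
                           "patient_management"]])
print(f"Cronbach's alpha = {alpha:.3f}")

# Hierarchical regression: enter predictor blocks one at a time and
# report the increment in explained variance (delta R-squared).
blocks = [["osce"], ["osce", "mcq"], ["osce", "mcq", "saq", "real_patient"]]
prev_r2 = 0.0
for predictors in blocks:
    X = sm.add_constant(df[predictors])
    model = sm.OLS(df["clinical_reasoning"], X).fit()
    print(predictors, f"R2 = {model.rsquared:.3f}",
          f"delta R2 = {model.rsquared - prev_r2:.3f}")
    prev_r2 = model.rsquared

In such an analysis, each block’s delta R² indicates how much additional variance in the CCS score the newly entered predictors explain beyond the blocks already in the model.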

Results

Internal consistency reliability of the CCS scores was high (α=0.862). Correlations between students’ scores on the different CCS components, and between their CCS scores and their scores on the other test instruments, were statistically significant. Regression analysis indicated that OSCE scores predicted 32.7% and 35.1% of the variance in clinical reasoning and patient management scores, respectively (P<0.01). Multiple-choice question scores predicted only 15.4% of the variance in diagnostic performance scores (P<0.01), while students’ scores in real patient encounters did not predict any of the CCS scores.

Conclusion

Students’ OSCE scores are the most important predictors of their clinical reasoning and patient management scores on CCS. Real patient encounter assessment, however, does not appear to test the same construct as CCS.

Author contributions

MF coordinated the computer-based clinical reasoning examination for the students, collected the data, conducted the statistical analysis, and drafted the first version of the manuscript. SEK initiated the study idea, designed the study protocol, and revised the manuscript critically for important intellectual content. Both authors approved the final version of the manuscript.

Disclosure

The authors report no conflicts of interest in this work and no external funding source for this study. The study was certified as exempt from IRB review.