Letter

Reliability and benefits of medical student peers in rating complex clinical skills: a common mistake

, MD, MSc, DSc, PhD, Postdoc

Dear Sir

I was interested to read the paper by Basehore and colleagues, published in the March 2014 issue of Medical Teacher, in which the authors investigated the reliability of student peers at the same level of training in rating complex clinical skills in a geriatric medicine-based objective structured clinical examination (OSCE). They reported that the reliability of the OSCE was moderately strong (G-coefficient = 0.70), with strong correlations between peer and faculty ratings for the overall OSCE (r = 0.78, p = 0.001) and for each case (r = 0.70–0.85, p = 0.001; Basehore et al. 2014). This correlational result, however, has nothing to do with reliability; relying on correlation coefficients is in fact one of the common mistakes in reliability analysis (Rothman et al. 2008). Reliability (repeatability or reproducibility) is often, but inappropriately, assessed with statistical tests such as Pearson's r, least-squares regression and the paired t-test; 'mistakes in reliability analysis are common' (Lin 1989; Rothman et al. 2008).
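As a minimal numerical sketch of why a high correlation does not demonstrate agreement (Python, with invented scores; the concordance correlation coefficient follows Lin 1989): if one rater systematically scores 10 points higher than the other, Pearson's r is a perfect 1.0 while Lin's coefficient is only 0.5.

```python
import numpy as np

# Hypothetical OSCE scores: the peer rater systematically scores
# 10 points higher than the faculty rater for the same students.
faculty = np.array([60.0, 65.0, 70.0, 75.0, 80.0])
peer = faculty + 10.0

def concordance_cc(x, y):
    """Lin's concordance correlation coefficient (Lin 1989)."""
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

r = np.corrcoef(faculty, peer)[0, 1]
print(f"Pearson r  = {r:.2f}")                              # 1.00 -> 'perfect' correlation
print(f"Lin's CCC  = {concordance_cc(faculty, peer):.2f}")  # 0.50 -> poor agreement
```

An absolute-agreement intraclass correlation would likewise be reduced by the systematic offset that the correlation coefficient ignores.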

For quantitative variables the intraclass correlation coefficient (ICC) should be used; for qualitative variables the weighted kappa is appropriate, although it must be used with caution because kappa has its own limitations (Lin 1989; Rothman et al. 2008). It is crucial to know that there is no kappa value that can be regarded universally as an indication of good agreement. An important weakness of kappa as a measure of agreement for a qualitative variable is that it depends on the prevalence in each category: it is possible to obtain different kappa values from tables with the same percentage of concordant and discordant cells, as the sketch below illustrates.
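The prevalence effect can be shown with a minimal sketch (Python, using invented 2×2 agreement tables and the unweighted Cohen's kappa for simplicity rather than the weighted form): both tables show 80% observed agreement between two raters, yet the kappa values differ markedly.

```python
import numpy as np

def cohens_kappa(table):
    """Unweighted Cohen's kappa for a square agreement table of counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n
    p_expected = np.sum(table.sum(axis=0) * table.sum(axis=1)) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Both tables have 80% observed agreement, but category prevalence differs.
balanced = [[40, 10],
            [10, 40]]   # kappa ~ 0.60
skewed   = [[75, 10],
            [10,  5]]   # kappa ~ 0.22, despite identical observed agreement
print(cohens_kappa(balanced), cohens_kappa(skewed))
```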

The authors conclude that peer raters at the same level of training can provide accurate ratings of complex clinical tasks and can serve as an important resource in assessing student performance in an OSCE, yet they have not investigated the concordance of pass/fail decisions with respect to individual candidates.

Reliability (precision) and validity (accuracy) are two distinct and important methodological issues in all fields of research. To assess accuracy (validity), the following measures are used:

  • sensitivity (the percentage of those with the disease who test positive, True Positives / (True Positives + False Negatives)),

  • specificity (the percentage of healthy people who test negative, True Negatives / (True Negatives + False Positives)),

  • positive predictive value (PPV) (the percentage of positive tests that are truly diseased, True Positives / (True Positives + False Positives)),

  • negative predictive value (NPV) (the percentage of negative tests that are truly healthy, True Negatives / (True Negatives + False Negatives)),

  • likelihood ratio positive and likelihood ratio negative, as well as diagnostic accuracy [(true positive + true negative results) / total × 100], and

  • odds ratio (true results / false results), preferably more than 50.

These are the measures used to evaluate the validity (accuracy) of a test against a gold standard (Rothman et al. 2008); a worked sketch follows.
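A minimal worked example of these measures (Python; all counts are hypothetical, with faculty pass/fail decisions treated as the gold standard and peer decisions as the test under evaluation):

```python
# Hypothetical 2x2 table: peer pass/fail decisions versus faculty
# decisions taken as the gold standard (all counts are invented).
tp, fp = 70, 5    # peer 'fail' & faculty 'fail'; peer 'fail' & faculty 'pass'
fn, tn = 10, 115  # peer 'pass' & faculty 'fail'; peer 'pass' & faculty 'pass'
total = tp + fp + fn + tn

sensitivity = tp / (tp + fn)                    # 0.875
specificity = tn / (tn + fp)                    # ~0.958
ppv = tp / (tp + fp)                            # ~0.933
npv = tn / (tn + fn)                            # 0.920
lr_positive = sensitivity / (1 - specificity)   # ~21.0
lr_negative = (1 - sensitivity) / specificity   # ~0.13
accuracy_pct = (tp + tn) / total * 100          # 92.5%
ratio = (tp + tn) / (fp + fn)                   # correct/incorrect results, as described above

print(sensitivity, specificity, ppv, npv,
      lr_positive, lr_negative, accuracy_pct, ratio)
```

Reporting figures of this kind for pass/fail decisions would address the concordance question raised above.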

The authors' conclusion thus rests on a confusion of reliability (precision) with validity (accuracy) and is therefore misleading.

Declaration of interest: The author reports no conflicts of interest.

References

  • Basehore PM, Pomerantz SC, Gentile M. 2014. Reliability and benefits of medical student peers in rating complex clinical skills. Med Teach 36(5):409–414
  • Lin LI. 1989. A concordance correlation coefficient to evaluate reproducibility. Biometrics 45:255–268
  • Rothman KJ, Greenland S, Lash TL. 2008. Modern epidemiology. 3rd ed. Philadelphia, PA: Lippincott Williams & Wilkins
