Original Article

Agreement or Association: Choosing a Measure of Reliability for Nominal Data in the 2 × 2 Case—A Comparison of Phi, Kappa, and G

Pages 915-920 | Published online: 03 Jul 2009

Abstract

This research note compares the ϕ, kappa, and G measures of association and agreement used to estimate the reliability of drug and alcohol self-report data in the 2 × 2 case. Attention is focused upon the problems encountered in using these measures when skewed marginal distributions are found in a test–retest situation. The G index of agreement was found to be a stable estimator, unaffected by skewed marginals, and equivalent to kappa and ϕ when marginal distributions are equal.
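The three statistics can be illustrated with a short sketch. The code below is not from the article; it is a minimal Python illustration using the standard textbook formulas for ϕ, Cohen's kappa, and the Holley–Guilford G index on a 2 × 2 test–retest table, with the cell labels (a = yes/yes, b = yes/no, c = no/yes, d = no/no) and example counts chosen here purely for demonstration.

```python
# Minimal sketch (not the authors' code) of the three 2x2 reliability measures
# compared in the paper. Cell layout: a = yes/yes, b = yes/no, c = no/yes, d = no/no.
from math import sqrt

def phi(a, b, c, d):
    """Phi coefficient (Pearson correlation for a 2x2 table)."""
    denom = sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom

def kappa(a, b, c, d):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = a + b + c + d
    p_o = (a + d) / n                                      # observed agreement
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance-expected agreement
    return (p_o - p_e) / (1 - p_e)

def g_index(a, b, c, d):
    """Holley-Guilford G: agreements minus disagreements, as a proportion (2*p_o - 1)."""
    n = a + b + c + d
    return (a + d - b - c) / n

# Equal marginals: all three measures coincide (each gives 0.60 here).
print(phi(40, 10, 10, 40), kappa(40, 10, 10, 40), g_index(40, 10, 10, 40))

# Skewed marginals (hypothetical counts): phi and kappa fall to about 0.29
# while G remains high at 0.84, illustrating the note's point about G's stability.
print(phi(90, 4, 4, 2), kappa(90, 4, 4, 2), g_index(90, 4, 4, 2))
```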
