Statistical inference of Gwet’s AC1 coefficient for multiple raters and binary outcomes

Pages 3564-3572 | Received 26 Jun 2019, Accepted 18 Dec 2019, Published online: 16 Jan 2020
Abstract

Cohen’s kappa and the intraclass kappa are widely used for assessing the degree of agreement between two raters with binary outcomes. However, many authors have pointed out their paradoxical behavior, which arises from their dependence on the prevalence of the trait under study. To overcome this limitation, Gwet (Citation2008) proposed an alternative and more stable agreement coefficient referred to as the AC1. In this paper, we discuss likelihood-based inference for the AC1 in the case of multiple raters and binary outcomes. Construction of confidence intervals is the main focus. In addition, hypothesis testing, sample size estimation, and a method for assessing the effect of subject covariates on agreement are presented. The performance of the AC1 estimator and its confidence intervals is investigated in a simulation study, and an illustrative example is presented.

