Every week, we select a recently published Open Access article to feature. This week’s article is from Statistics in Medicine and assesses agreement for multiple raters and covariates.
The article’s abstract is given below, with the full article available to read here.
Simulating and estimating agreement in the presence of multiple raters and covariates. Statistics in Medicine. 2023; 1–12. doi: 10.1002/sim.9694.
Cohen’s and Fleiss’s kappa are popular estimators for assessing agreement among two and multiple raters, respectively, for a binary response. While additional methods have been developed to account for multiple raters and covariates, they are not always applicable, rarely used, and none simplify to Cohen’s kappa. Furthermore, there are no methods to simulate Bernoulli observations under the kappa agreement structure such that the developed methods could be adequately assessed. This manuscript overcomes these shortfalls. First, we developed a model-based estimator for kappa that accommodates multiple raters and covariates through a generalized linear mixed model and encompasses Cohen’s kappa as a special case. Second, we created a framework to simulate dependent Bernoulli observations that upholds the kappa agreement structure for every pair of raters and includes covariates. We used this framework to assess our method when kappa was nonzero. Simulations showed that Cohen’s and Fleiss’s kappa estimates were inflated, unlike our model-based kappa. We analyzed an Alzheimer’s disease neuroimaging study and the classic cervical cancer pathology study. The proposed model-based kappa and the advancement in simulation methodology demonstrate that the popular approaches of Cohen’s and Fleiss’s kappa are poised to yield invalid conclusions, while our work overcomes these shortfalls, leading to improved inferences.
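To give a flavor of the two-rater case the abstract builds on, here is a minimal sketch (not the paper's multi-rater, covariate-adjusted framework) of simulating dependent Bernoulli ratings with a target Cohen's kappa and then estimating kappa from the simulated data. The construction assumes both raters share a common marginal probability `p`, in which case kappa coincides with the correlation between the two binary ratings; the function names are illustrative, not from the article.

```python
import numpy as np

def simulate_pair(n, p, kappa, rng):
    """Simulate n (rater1, rater2) binary ratings with common marginal p
    and pairwise agreement kappa (valid for kappa >= 0 here)."""
    q = 1 - p
    # With equal margins, kappa equals the correlation, so the 2x2 joint
    # cell probabilities are:
    probs = [p * p + kappa * p * q,   # (1, 1)
             p * q * (1 - kappa),     # (1, 0)
             p * q * (1 - kappa),     # (0, 1)
             q * q + kappa * p * q]   # (0, 0)
    cells = np.array([(1, 1), (1, 0), (0, 1), (0, 0)])
    idx = rng.choice(4, size=n, p=probs)
    return cells[idx]

def cohens_kappa(a, b):
    """Cohen's kappa for two binary rating vectors."""
    po = np.mean(a == b)                       # observed agreement
    pa, pb = np.mean(a), np.mean(b)            # marginal rates of "1"
    pe = pa * pb + (1 - pa) * (1 - pb)         # chance agreement
    return (po - pe) / (1 - pe)

rng = np.random.default_rng(0)
ratings = simulate_pair(200_000, p=0.3, kappa=0.4, rng=rng)
print(cohens_kappa(ratings[:, 0], ratings[:, 1]))  # close to 0.4
```

With a large sample, the estimated kappa recovers the target value, which is the kind of check the paper's simulation framework enables for its more general multi-rater, covariate-adjusted estimator.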