
Abstract Details

Activity Number: 385 - Biomarkers, Endpoint Validation and Other Topics
Type: Contributed
Date/Time: Thursday, August 12, 2021, 12:00 PM to 1:50 PM EDT
Sponsor: Biopharmaceutical Section
Abstract #318137
Title: A Generalized Linear Mixed Model Framework for Calculating Inter-Rater Reliability
Author(s): Jonathan D Mahnken and Katelyn A McKenzie*
Companies: The University of Kansas Medical Center (both authors)
Keywords: Cohen's kappa; agreement; diagnostic tests
Abstract:

Cohen’s kappa is used across many fields, such as medicine and the social sciences, to assess inter-rater reliability when the truth of a diagnostic test result is unknown. Recent work has demonstrated shortcomings of inter-rater reliability studies (PMID: 28693497), stemming primarily from the practice of extrapolating conclusions from only two raters. The primary goal of this project is to generalize Cohen’s kappa to allow for multiple raters and to account for the correlations among them. A generalized linear mixed model (GLMM) provides a flexible framework for achieving this goal: the expected probabilities of agreement for each subject-rater pair can be calculated from the fitted GLMM. In silico simulation studies were conducted to assess the proposed method. Our approach, which is easily implemented in standard statistical software, yields an estimate of Cohen’s kappa that simultaneously accounts for correlations among raters and allows for categorical and continuous covariates.
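
For reference, the classical two-rater Cohen’s kappa that this work generalizes is kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from the raters' marginal category frequencies. The Python sketch below computes that classical quantity; the model_based_kappa helper is only a hypothetical illustration of how model-derived expected agreement probabilities for each subject-rater pair might be pooled into a kappa-type estimate, and is not the authors' GLMM estimator.

import numpy as np

def cohens_kappa(r1, r2):
    # Classical two-rater Cohen's kappa for categorical ratings.
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    p_o = np.mean(r1 == r2)  # observed proportion of agreement
    p1 = np.array([np.mean(r1 == c) for c in cats])  # rater 1 marginals
    p2 = np.array([np.mean(r2 == c) for c in cats])  # rater 2 marginals
    p_e = np.sum(p1 * p2)  # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

def model_based_kappa(observed_agree, expected_agree):
    # Hypothetical pooling (an assumption, not the abstract's estimator):
    # observed_agree holds agreement indicators for each subject/rater-pair,
    # expected_agree holds model-based chance-agreement probabilities.
    p_o = np.mean(observed_agree)
    p_e = np.mean(expected_agree)
    return (p_o - p_e) / (1.0 - p_e)

if __name__ == "__main__":
    rater1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    rater2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
    print(round(cohens_kappa(rater1, rater2), 3))  # 0.583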


Authors who are presenting talks have a * after their name.
