
All Times EDT

Abstract Details

Activity Number: 75 - Invited EPoster Session II
Type: Invited
Date/Time: Sunday, August 7, 2022: 9:35 PM to 10:30 PM
Sponsor: Section on Statistics in Epidemiology
Abstract #322781
Title: DPQL: A Lossless Distributed Algorithm for Generalized Linear Mixed Model with Application to Privacy-Preserving Hospital Profiling
Author(s): Chongliang Luo* and Md Nazmul Islam and Natalie E. Sheils and John Buresh and Martijn J. Schuemie and Jalpa Doshi and Rachel Werner and David Asch and Yong Chen
Companies: Washington University in St Louis and UnitedHealth Group and OptumLabs at UnitedHealth Group and OptumLabs at UnitedHealth Group and Janssen Research and Development and University of Pennsylvania and University of Pennsylvania and University of Pennsylvania and University of Pennsylvania
Keywords: Distributed Penalized Quasi Likelihood Algorithm; Federated Learning; Generalized Linear Mixed Model; Hospital Profiling; Privacy-preserving
Abstract:

Hospital profiling, the process of determining to what extent patient outcomes depend on the hospital, provides a quantitative comparison of healthcare providers based on their quality of care. To implement hospital profiling, the generalized linear mixed model (GLMM) is used to fit outcome models on clinical or administrative claims data. For better generalizability, data across multiple hospitals, databases, or networks are desired. However, privacy regulations and the computational complexity of GLMM call for a distributed algorithm for hospital profiling. Here, we develop a novel distributed penalized quasi-likelihood (dPQL) algorithm to fit GLMM when only aggregated data, rather than individual patient data, can be shared across hospitals. The proposed algorithm is lossless, i.e., it obtains results identical to those obtained by pooling individual patient data from all hospitals. We apply the dPQL algorithm to rank 929 previously studied hospitals on COVID-19 mortality or referral to hospice.
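The core idea of distributing PQL can be sketched as follows: each PQL iteration reduces to fitting a weighted linear mixed model on a working response, and the linear algebra for that fit decomposes into site-level sums. The sketch below is a minimal illustration, not the authors' implementation; the site sizes, the fixed random-intercept variance `tau2`, and all variable names are illustrative assumptions (in particular, the variance component would normally be estimated rather than held fixed).

```python
# Sketch of dPQL-style iterations for a logistic GLMM with a random
# intercept per hospital. Hypothetical setup: 5 sites, tau2 fixed.
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_per, p = 5, 200, 3
tau2 = 0.5  # random-intercept variance, held fixed for brevity

# Simulate individual patient data that stays local at each site.
beta_true = np.array([-0.5, 0.8, -0.3])
sites = []
for i in range(n_sites):
    X = np.column_stack([np.ones(n_per), rng.normal(size=(n_per, p - 1))])
    b_i = rng.normal(scale=np.sqrt(tau2))
    pr = 1.0 / (1.0 + np.exp(-(X @ beta_true + b_i)))
    sites.append((X, rng.binomial(1, pr)))

beta = np.zeros(p)
b = np.zeros(n_sites)

for _ in range(25):  # PQL iterations
    # Each site shares only aggregated statistics of its working
    # linear model (p x p matrices and p-vectors), never patient rows.
    XtWX = np.zeros((p, p)); XtWz = np.zeros(p)
    XtWZ = np.zeros((p, n_sites))
    ZtWZ = np.zeros(n_sites); ZtWz = np.zeros(n_sites)
    for i, (X, y) in enumerate(sites):
        eta = X @ beta + b[i]
        mu = 1.0 / (1.0 + np.exp(-eta))
        w = mu * (1.0 - mu)
        z = eta + (y - mu) / w  # working response
        XtWX += X.T @ (w[:, None] * X)
        XtWz += X.T @ (w * z)
        XtWZ[:, i] = X.T @ w
        ZtWZ[i] = w.sum()
        ZtWz[i] = (w * z).sum()
    # The server solves Henderson's mixed-model equations assembled
    # from the site-level sums; by construction these equal the
    # pooled-data matrices, which is what makes the fit lossless.
    top = np.hstack([XtWX, XtWZ])
    bot = np.hstack([XtWZ.T, np.diag(ZtWZ + 1.0 / tau2)])
    sol = np.linalg.solve(np.vstack([top, bot]),
                          np.concatenate([XtWz, ZtWz]))
    beta, b = sol[:p], sol[p:]

print(np.round(beta, 2))
```

The estimated random intercepts `b` play the role of hospital effects in profiling: ranking hospitals by these effects (or by derived standardized mortality measures) gives the comparison the abstract describes.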


Authors who are presenting talks have a * after their name.
