
Abstract Details

Activity Number: 243
Type: Contributed
Date/Time: Monday, August 1, 2016, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Abstract #318915
Title: MM Algorithms for Variance Components Models
Author(s): Liuyi Hu* and Hua Zhou and Jin Zhou and Kenneth Lange
Companies: North Carolina State University and University of California at Los Angeles and University of Arizona and University of California at Los Angeles
Keywords: minorization-maximization (MM); linear mixed model (LMM); maximum likelihood estimation (MLE); matrix convexity; multivariate response; penalized estimation
Abstract:

Variance components estimation and mixed model analysis are central themes in statistics with applications in numerous scientific disciplines. Despite the best efforts of generations of statisticians and numerical analysts, maximum likelihood estimation and restricted maximum likelihood estimation of variance components models remain numerically challenging. Building on the minorization-maximization (MM) principle, this paper presents a novel iterative algorithm for variance components estimation. The MM algorithm is trivial to implement and competitive on large-data problems. The algorithm readily extends to more complicated problems such as linear mixed models, multivariate response models possibly with missing data, maximum a posteriori estimation, penalized estimation, and generalized estimating equations (GEE). We establish the global convergence of the MM algorithm to a KKT point and demonstrate, both numerically and theoretically, that it converges faster than the classical EM algorithm when the number of variance components is greater than two and all covariance matrices are positive definite.
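
To make the flavor of such an MM iteration concrete, here is a minimal sketch in Python for a two-component Gaussian model y ~ N(0, sigma1^2 V1 + sigma2^2 V2), using the standard multiplicative MM-type update that rescales each variance component by sqrt(y' Omega^{-1} V_i Omega^{-1} y / tr(Omega^{-1} V_i)). The matrices V1, V2, the simulated data, and the stopping rule are illustrative assumptions; this is not necessarily the exact algorithm or setting of the paper.

# Sketch of an MM-type update for a Gaussian variance components model
# y ~ N(0, sigma1^2 * V1 + sigma2^2 * V2) (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two positive semidefinite covariance basis matrices: a random kernel and the identity.
A = rng.standard_normal((n, n))
V1 = A @ A.T / n
V2 = np.eye(n)

# Simulate a response with hypothetical true variance components (2.0, 1.0).
true_sigma2 = np.array([2.0, 1.0])
Omega_true = true_sigma2[0] * V1 + true_sigma2[1] * V2
y = rng.multivariate_normal(np.zeros(n), Omega_true)

def loglik(sigma2):
    # Gaussian log-likelihood (up to a constant) at the given variance components.
    Omega = sigma2[0] * V1 + sigma2[1] * V2
    _, logdet = np.linalg.slogdet(Omega)
    return -0.5 * (logdet + y @ np.linalg.solve(Omega, y))

# MM-type iteration: rescale each component by
# sqrt( y' Omega^{-1} V_i Omega^{-1} y / tr(Omega^{-1} V_i) ).
sigma2 = np.array([1.0, 1.0])
for it in range(200):
    Omega = sigma2[0] * V1 + sigma2[1] * V2
    Oinv_y = np.linalg.solve(Omega, y)
    new = np.empty(2)
    for i, Vi in enumerate((V1, V2)):
        num = Oinv_y @ Vi @ Oinv_y                  # quadratic form y' Omega^{-1} V_i Omega^{-1} y
        den = np.trace(np.linalg.solve(Omega, Vi))  # tr(Omega^{-1} V_i)
        new[i] = sigma2[i] * np.sqrt(num / den)
    if np.max(np.abs(new - sigma2)) < 1e-8:
        sigma2 = new
        break
    sigma2 = new

print("estimated variance components:", sigma2)
print("log-likelihood at estimate:", loglik(sigma2))

Each update keeps the variance components nonnegative by construction and increases the log-likelihood, which is the defining property of an MM (minorize-maximize) step.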


Authors who are presenting talks have a * after their name.
