

Abstract Details

Activity Number: 354 - Multivariate Analysis and Graphical Models
Type: Contributed
Date/Time: Wednesday, August 5, 2020, 10:00 AM to 2:00 PM EDT
Sponsor: Section on Statistical Learning and Data Science
Abstract #309738
Title: Fused-Lasso Regularized Cholesky Factors of Large Nonstationary Covariance Matrices of Longitudinal Data
Author(s): Aramayis Dallakyan* and Mohsen Pourahmadi
Companies: Texas A&M University and Texas A&M University
Keywords: Covariance Estimation; Smoothed Cholesky; High-dimensional data; Penalized Likelihood
Abstract:

Smoothness of the subdiagonals of the Cholesky factor of large covariance matrices is closely related to the degree of nonstationarity of autoregressive models for time series and longitudinal data. Heuristically, for a nearly stationary (Toeplitz) covariance matrix, one expects the entries in each subdiagonal of the Cholesky factor of its inverse to be nearly the same, in the sense that the sum of the absolute values of the differences between successive entries is small. Statistically, such smoothness is achieved by regularizing each subdiagonal using fused-type lasso penalties. We rely on the standard Cholesky factor (in place of the more favored modified Cholesky factor) as the new parameter within a regularized normal likelihood setup, which guarantees: (1) joint convexity of the likelihood function; (2) strict convexity of the likelihood function restricted to each subdiagonal, even when $n < p$; and (3) positive-definiteness of the estimated covariance matrix. A block coordinate descent algorithm, where each block is a subdiagonal, is proposed, and its convergence is established under mild conditions. Lack of decoupling of the penalized likelihood function into a sum of likelihoods involving individual subdiagonals
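To make the setup concrete, the following is a minimal sketch of the kind of penalized objective the abstract describes, based only on the abstract: here $S$ denotes the sample covariance matrix, $L$ the lower-triangular standard Cholesky factor of the precision matrix ($\Sigma^{-1} = LL^\top$), and $\lambda_k$ subdiagonal-specific tuning parameters; the exact parameterization and penalty weights are as defined in the paper.

\[
\hat{L} \;=\; \underset{L \ \text{lower triangular},\ L_{jj} > 0}{\arg\min}\;
-2\sum_{j=1}^{p}\log L_{jj}
\;+\; \operatorname{tr}\!\left(S\,LL^{\top}\right)
\;+\; \sum_{k=1}^{p-1}\lambda_{k}\sum_{j=1}^{p-k-1}\bigl|\,L_{j+k,\,j}-L_{j+k+1,\,j+1}\bigr|
\]

In this sketch the $k$-th inner sum runs along the $k$-th subdiagonal of $L$ and acts as a fused-lasso (total-variation) penalty on the differences between its successive entries; a block coordinate descent step would then minimize this objective over one subdiagonal at a time with the others held fixed.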


Authors who are presenting talks have a * after their name.
