
Abstract Details

Activity Number: 178 - Novel Applications and Extensions of Dimension Reduction Methods
Type: Contributed
Date/Time: Monday, July 29, 2019 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #307015
Title: Sparse Generalized Principal Component Analysis: Algorithms and Their Applications
Author(s): Jianhao Zhang* and Yoonkyung Lee
Companies: Ohio State University and Ohio State University
Keywords: Exponential Family Data; Generalized PCA; Manifold; Nonconvex Optimization; Sparsity

Generalized principal component analysis (GPCA) is an extension of standard PCA for exponential family data. Using the generalized linear model framework, it allows an effective representation of a low-dimensional structure underlying discrete data such as binary features and counts. As with PCA, interpretability and stability are desired for GPCA in high-dimensional settings. We propose sparse GPCA, which imposes sparsity on the loadings for enhanced interpretability and stability. The orthogonality and sparsity constraints on the loadings result in a non-smooth manifold optimization, which remains a challenging computational problem. Inspired by recent advances in non-smooth manifold optimization, such as manifold ADMM and the manifold proximal gradient method, we develop computational algorithms for sparse GPCA and demonstrate their utility through a series of numerical experiments and applications to real data.
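To make the optimization setup concrete, the sketch below illustrates the general idea of a proximal-gradient-style iteration for sparse (standard, Gaussian) PCA on the Stiefel manifold: a gradient ascent step on the trace objective, a soft-thresholding step (the proximal map of the l1 penalty), and a retraction back onto the manifold. This is an illustrative toy, not the authors' algorithm; the function name, step sizes, and the simple retraction-after-prox scheme are all assumptions for exposition (in particular, the final retraction only preserves sparsity approximately, which is part of why non-smooth manifold methods such as ManPG are needed).

```python
import numpy as np

def sparse_pca_manifold(S, k, lam=0.1, step=0.01, iters=200, seed=0):
    """Illustrative sketch only (not the authors' method):
    approximately maximize tr(V' S V) - lam * ||V||_1 over p x k
    matrices with orthonormal columns (the Stiefel manifold), by
    alternating a gradient step, soft-thresholding, and a polar
    retraction back onto the manifold."""
    rng = np.random.default_rng(seed)
    p = S.shape[0]
    # Random orthonormal starting point on the Stiefel manifold
    V, _ = np.linalg.qr(rng.standard_normal((p, k)))
    for _ in range(iters):
        G = 2.0 * S @ V                    # Euclidean gradient of tr(V' S V)
        V = V + step * G                   # ascent step
        # Soft-threshold: proximal map of the l1 sparsity penalty
        V = np.sign(V) * np.maximum(np.abs(V) - step * lam, 0.0)
        # Polar retraction: nearest matrix with orthonormal columns
        U, _, Wt = np.linalg.svd(V, full_matrices=False)
        V = U @ Wt
    return V
```

The returned loadings are exactly orthonormal (the last operation is the retraction) and approximately sparse; handling the interaction between the non-smooth penalty and the manifold constraint more carefully is exactly the computational difficulty the abstract refers to.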

Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program