
All Times EDT

Abstract Details

Activity Number: 560 - Latent Space Modeling and Dimensionality Reduction
Type: Contributed
Date/Time: Thursday, August 11, 2022 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #323592
Title: Direction Penalized Principal Component Analysis
Author(s): Youhong Lee* and Alex Shkolnik
Companies: University of California, Santa Barbara and University of California, Santa Barbara
Keywords: direction penalized principal component analysis; high-dimensional statistics; shrinkage estimation; spiked covariance model
Abstract:

We propose a regularization method called direction penalized principal component analysis (dPCA). This approach penalizes the first principal component, i.e., the direction of maximum variance of the data, for deviations away from some target direction. While the latter vector has an obvious interpretation in terms of a Bayesian prior, our main contributions lie elsewhere. In particular, we derive an optimal penalty parameter that, for any target, always reduces the asymptotic L2-loss relative to that of the raw principal component. The optimal penalty parameter is determined solely from the data, and an iterative algorithm efficiently computes the dPCA estimator. We prove our results by adopting a high-dimension, low-sample-size framework that is increasingly relevant for modern applications. To provide insight into the dPCA estimator, we develop connections to Ledoit-Wolf constant correlation shrinkage as well as to a recently proposed James-Stein estimator for the first principal component. We demonstrate the performance of dPCA by benchmarking against both of these estimators.
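The abstract does not spell out the algorithm, but the core idea, penalizing the leading principal component toward a target direction, can be illustrated with a minimal sketch. The objective, the fixed-point iteration, and the function name below are illustrative assumptions, not the authors' dPCA estimator; in particular, the paper's data-driven optimal penalty parameter is not reproduced here, and gamma is simply taken as a user input.

```python
import numpy as np

def dpca_sketch(X, d, gamma, n_iter=500, tol=1e-10):
    """Hypothetical sketch of a direction-penalized first principal component.

    Maximizes  v' S v - gamma * ||v - d||^2  over unit vectors v,
    which, up to constants, equals  v' S v + 2 * gamma * d'v.
    The stationarity condition  S v + gamma d = lambda v  suggests a
    power-iteration-style fixed point:  v <- normalize(S v + gamma d).
    With gamma = 0 this reduces to ordinary power iteration for the
    raw first principal component.
    """
    S = np.cov(X, rowvar=False)          # sample covariance
    d = d / np.linalg.norm(d)            # unit-norm target direction
    v = d.copy()                         # initialize at the target
    for _ in range(n_iter):
        w = S @ v + gamma * d            # shifted power step
        w /= np.linalg.norm(w)           # project back to the sphere
        if np.linalg.norm(w - v) < tol:  # stop when the iterate settles
            return w
        v = w
    return v
```

Larger gamma pulls the estimate toward the target d, while gamma = 0 recovers the raw principal component, mirroring the shrinkage trade-off the abstract describes.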


Authors who are presenting talks have a * after their name.

Back to the full JSM 2022 program