Abstract:
|
Classical Principal Component Analysis (PCA) is a dimension reduction technique that implicitly assumes Gaussian data and produces inconsistent estimators when the dimension p is moderate to high. We are interested in generalizing Sparse PCA, which has proven to be a remedy for this inconsistency, to non-Gaussian data types. In high-dimensional applications, model parsimony is often desired; Sparse Generalized PCA can also perform variable selection via group-sparsity regularization. Missing values, prevalent in many big-data applications, are incorporated into the model setup. With the challenges of dimensionality, non-Gaussianity, and missing values come the tasks of handling an unconventional nonquadratic objective function, nonconvex rank and sparsity regularization, and more. To perform simultaneous rank and variable selection, the proposed SG-PCA methods comprise three algorithms: one based on Stiefel manifold optimization, an alternating gradient approach with a quadratic surrogate function, and a two-step optimization approach. Numerical experiments compare the proposed methods with competing algorithms.
|
ASA Meetings Department
732 North Washington Street, Alexandria, VA 22314
(703) 684-1221 • meetings@amstat.org
Copyright © American Statistical Association.