Abstract:
|
In this talk, we revisit the classic sparse principal component analysis (SPCA) proposed by Zou, Hastie, and Tibshirani (2006). More than a decade later, it remains an open question how to efficiently solve the challenging nonsmooth manifold optimization problem arising in SPCA with provable guarantees. To close this gap, we introduce a novel algorithm that alternates between manifold gradient descent and proximal gradient descent steps and solves SPCA with provable convergence guarantees. Finally, we will demonstrate the numerical properties of the proposed nonconvex algorithm in both simulation studies and real applications.
|