
Abstract Details

Activity Number: 412 - Data Science and Machine Learning Topics
Type: Contributed
Date/Time: Tuesday, July 30, 2019, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Abstract #307076
Title: Gradient-Based Sparse Principal Component Analysis with Extensions to Online Learning
Author(s): Yixuan Qiu* and Jing Lei and Kathryn Roeder
Companies: Carnegie Mellon University
Keywords: sparse principal component analysis; dimensionality reduction; convex optimization; gradient method; online learning
Abstract:

Sparse principal component analysis (SPCA) is an important technique for dimensionality reduction of high-dimensional data. However, most existing SPCA algorithms are based on non-convex optimization, which provides little guarantee of global convergence. SPCA algorithms based on a convex formulation, such as the Fantope projection and selection (FPS) model, overcome this difficulty but are computationally expensive. In this work we study SPCA based on the convex FPS formulation and propose a new algorithm that is computationally efficient and applicable to large, high-dimensional data sets. We also extend our algorithm to online learning problems, where data arrive in a streaming fashion. The proposed algorithm is applied to high-dimensional genetic data for the detection of functional gene groups.
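For intuition, the FPS model relaxes the rank constraint of ordinary PCA to the convex Fantope {H : 0 <= H <= I, trace(H) = d} (the inequalities taken in the positive semidefinite order) and maximizes <S, H> - lambda * ||H||_1 over it, where S is the sample covariance matrix and ||.||_1 is the entrywise l1 norm. The sketch below, in Python with NumPy, illustrates this convex formulation with a simple alternating scheme (gradient step, entrywise soft-thresholding, Fantope projection); it is an illustrative sketch under these assumptions, not the authors' gradient algorithm or its online extension.

    # A minimal sketch of SPCA via the convex FPS formulation, using a
    # simple alternating scheme: gradient step on <S, H>, entrywise
    # soft-thresholding (prox of the l1 penalty), Fantope projection.
    # Illustration of the formulation only, not the authors' method.
    import numpy as np

    def fantope_projection(A, d):
        # Project symmetric A onto {H : 0 <= H <= I (psd order), tr(H) = d}
        # by projecting its eigenvalues onto the capped simplex
        # {0 <= g_i <= 1, sum_i g_i = d}; the shift theta is found by
        # bisection, since the clipped sum is non-increasing in theta.
        w, U = np.linalg.eigh(A)
        lo, hi = w.min() - 1.0, w.max()
        for _ in range(100):
            theta = 0.5 * (lo + hi)
            if np.clip(w - theta, 0.0, 1.0).sum() > d:
                lo = theta
            else:
                hi = theta
        g = np.clip(w - 0.5 * (lo + hi), 0.0, 1.0)
        return (U * g) @ U.T

    def soft_threshold(A, t):
        # Entrywise soft-thresholding, the proximal operator of t * ||.||_1.
        return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

    def sparse_pca_fps(S, d, lam, step=0.01, n_iter=500):
        # Approximately maximize <S, H> - lam * ||H||_1 over the Fantope.
        p = S.shape[0]
        H = np.eye(p) * (d / p)                 # feasible starting point
        for _ in range(n_iter):
            H = H + step * S                    # gradient ascent on <S, H>
            H = soft_threshold(H, step * lam)   # shrink small entries
            H = fantope_projection(H, d)        # restore feasibility
        return H

    # Toy usage: a covariance whose leading eigenvector is sparse.
    v = np.zeros(50)
    v[:5] = 1.0 / np.sqrt(5.0)
    S = 5.0 * np.outer(v, v) + 0.1 * np.eye(50)
    H = sparse_pca_fps(S, d=1, lam=0.5)
    print(np.round(np.diag(H)[:10], 3))  # mass concentrates on the true support

The sparse principal subspace estimate is then read off from the top d eigenvectors of the returned H, as is standard for Fantope-based relaxations.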


Authors who are presenting talks have a * after their name.
