Abstract Details

Activity Number: 697
Type: Contributed
Date/Time: Thursday, August 4, 2016 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #320417
Title: Pathwise Coordinate Optimization for Nonconvex Sparse Learning: Algorithm and Theory
Author(s): Tuo Zhao* and Han Liu and Tong Zhang
Companies: The Johns Hopkins University and Princeton University and Rutgers University
Keywords: Pathwise Coordinate Optimization; Variable Selection; Nonconvex Optimization; High Dimensions; Model-based Optimization; Large-scale Optimization

Pathwise coordinate optimization is one of the most important computational frameworks for solving high-dimensional nonconvex sparse learning problems. It differs from classical coordinate optimization algorithms in three salient features: warm-start initialization, active-set updating, and a strong rule for coordinate preselection. These three features grant superior empirical performance but also pose significant challenges to theoretical analysis. To tackle this long-standing problem, we develop a new theory showing that these features play pivotal roles in guaranteeing the statistical and computational performance of the pathwise coordinate optimization framework. In particular, we analyze existing methods for pathwise coordinate optimization and provide new theoretical insights into them. The resulting theory motivates several modifications that improve the pathwise coordinate optimization framework and guarantee linear convergence to a unique sparse local optimum with optimal statistical properties. This is the first result establishing computational and statistical guarantees for pathwise coordinate optimization in high dimensions.
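To make the three features concrete, the sketch below shows pathwise coordinate descent on the convex lasso surrogate (not the authors' nonconvex method or code): solutions are computed along a decreasing regularization path, each problem warm-started at the previous solution, with an active-set inner loop between full coordinate sweeps. The strong rule for coordinate preselection is omitted for brevity. All function names (`lasso_path`, `soft_threshold`, `cycle`) are hypothetical, and columns of `X` are assumed standardized.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the closed-form lasso coordinate update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cycle(X, r, beta, col_sq, lam, coords, n):
    """One pass of coordinate updates over `coords`, updating the residual
    r = y - X @ beta in place; returns the largest coefficient change."""
    max_delta = 0.0
    for j in coords:
        old = beta[j]
        r += X[:, j] * old                      # remove j's contribution
        rho = X[:, j] @ r / n
        beta[j] = soft_threshold(rho, lam) / col_sq[j]
        r -= X[:, j] * beta[j]                  # restore residual
        max_delta = max(max_delta, abs(beta[j] - old))
    return max_delta

def lasso_path(X, y, lambdas, tol=1e-8, max_iter=500):
    """Illustrative pathwise coordinate descent for the lasso.
    Assumes standardized (nonzero-variance) columns of X."""
    n, p = X.shape
    col_sq = (X * X).sum(axis=0) / n            # per-coordinate curvature
    beta = np.zeros(p)                          # warm start carries over
    r = y.astype(float).copy()                  # residual y - X @ beta
    path = []
    for lam in lambdas:                         # decreasing lambda path
        for _ in range(max_iter):
            # Full sweep refreshes the active set of nonzero coordinates.
            full_delta = cycle(X, r, beta, col_sq, lam, range(p), n)
            active = np.flatnonzero(beta)
            # Active-set updating: iterate only over nonzeros until stable.
            for _ in range(max_iter):
                if cycle(X, r, beta, col_sq, lam, active, n) < tol:
                    break
            if full_delta < tol:                # converged at this lambda
                break
        path.append(beta.copy())
    return path
```

The warm start means each new lambda begins from a solution that is already nearly optimal, so only a few sweeps are needed per path point, and the active-set loop keeps most updates confined to the small set of nonzero coordinates.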

Authors who are presenting talks have a * after their name.

Back to the full JSM 2016 program
