
Abstract Details

Activity Number: 16 - Recent Advances and Challenges in High-Dimensional Data Analysis
Type: Topic Contributed
Date/Time: Sunday, July 30, 2017 : 2:00 PM to 3:50 PM
Sponsor: IMS
Abstract #324202
Title: On the Asymptotic Performance of Bridge Estimators
Author(s): Arian Maleki* and Haolei Weng and Shuaiwen Wang
Companies: Columbia Univ and Columbia University and Columbia University
Keywords: linear regression; high-dimensional; asymptotic; lasso

We study the performance of l_q-regularized least squares, also known as bridge estimators, for 0 <= q <= 2 in the asymptotic setting where the number of observations grows proportionally with the number of predictors. In our analysis, the vector of regression coefficients is assumed to be sparse. Despite the non-convexity of the bridge optimization problem for 0 < q < 1, these estimators remain appealing because of their closer proximity to the "ideal" l_0-regularized least squares. In this talk, we analyze the properties of the global minimizer of the bridge estimators under optimal tuning of the regularization parameter. Our goal is to answer the following questions: (i) Do non-convex regularizers outperform convex regularizers? (ii) Does q = 1 outperform the other convex regularizers when the regression coefficient vector is sparse? We discuss both the predictive power and the variable selection accuracy of these estimators. If time permits, we also discuss algorithms that can provably reach the global minima of the non-convex problems in certain regimes.
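As a minimal illustration of the convex end of this estimator family (q = 1, the lasso), the bridge objective 0.5 ||y - Xb||^2 + lambda * sum_j |b_j|^q can be minimized by proximal gradient descent (ISTA). This is only an illustrative sketch, not the authors' analysis: the function names, data, and tuning value below are hypothetical, and the talk itself concerns the asymptotic behavior of the optimally tuned global minimizer, not this particular solver.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (the q = 1 bridge penalty)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def bridge_q1_ista(X, y, lam, n_iter=500):
    """ISTA for the q = 1 bridge (lasso) objective
    0.5 * ||y - X b||^2 + lam * ||b||_1."""
    _, p = X.shape
    b = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)          # gradient of the least-squares term
        b = soft_threshold(b - grad / L, lam / L)
    return b

# Sparse ground truth; n grows proportionally with p (here n = 2p),
# matching the proportional-growth asymptotic regime of the abstract.
rng = np.random.default_rng(0)
p, n = 50, 100
beta = np.zeros(p)
beta[:5] = 3.0                            # 5 nonzero coefficients
X = rng.standard_normal((n, p))
y = X @ beta + 0.1 * rng.standard_normal(n)
b_hat = bridge_q1_ista(X, y, lam=1.0)
```

For 0 < q < 1 the penalty is non-convex and a proximal step like this only reaches a stationary point in general, which is exactly why the abstract's focus on the global minimizer (and on algorithms that provably reach it) is nontrivial.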

This talk is based on a joint work with Haolei Weng, Shuaiwen Wang, and Le Zheng.

Authors who are presenting talks have a * after their name.

