
Abstract Details

Activity Number: 497 - Variable Selection
Type: Contributed
Date/Time: Thursday, August 6, 2020, 10:00 AM to 2:00 PM (EDT)
Sponsor: Section on Statistical Learning and Data Science
Abstract #313962
Title: Structure Adaptive Lasso
Author(s): Sandipan Pramanik* and Xianyang Zhang
Companies: Texas A&M University and Texas A&M University
Keywords: Adaptive Lasso; External covariates; Penalized regression; Feature selection
Abstract:

Lasso is of fundamental importance in high-dimensional statistics and has been routinely used to regress a response on a high-dimensional set of predictors. In many scientific applications, there exists external information that encodes the predictive power and sparsity structure of the predictors. We develop a new method, called the Structure Adaptive Lasso (SA-Lasso), to incorporate this potentially useful side information into a penalized regression. The basic idea is to translate the external information into different penalization strengths for the regression coefficients. We study the risk properties of the resulting estimator. In particular, we generalize the state evolution framework, recently introduced for the analysis of the approximate message-passing algorithm, to the SA-Lasso setting. We show that the finite-sample risk of the SA-Lasso estimator is consistent with the theoretical risk predicted by the state evolution equation. Our theory suggests that the SA-Lasso with an informative group or covariate structure can significantly outperform the Lasso, the Adaptive Lasso, and the Sparse Group Lasso. This is further confirmed through synthetic and real data analyses.
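To illustrate the core idea of coefficient-specific penalization driven by external covariates, here is a minimal sketch (not the authors' implementation). It assumes a hypothetical exponential mapping from an external covariate z_j to a penalty weight w_j, and solves the resulting weighted Lasso by the standard column-rescaling reparameterization; how SA-Lasso actually chooses the penalization strengths is described in the paper, not here.

```python
# Sketch of a weighted Lasso driven by an external covariate.
# The mapping z -> w (exponential link) is purely illustrative and is
# NOT the SA-Lasso rule; it only shows how side information can be
# turned into coefficient-specific penalty weights.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 500

# Simulated sparse signal; the external covariate z hints at which
# predictors are likely to be active (larger z ~ more predictive).
beta = np.zeros(p)
beta[:10] = 2.0
z = np.where(beta != 0, 1.0, 0.0) + 0.3 * rng.standard_normal(p)
X = rng.standard_normal((n, p))
y = X @ beta + rng.standard_normal(n)

# Hypothetical weight map: predictors believed to be important receive
# smaller weights, i.e. weaker shrinkage.
w = np.exp(-z)

# Penalizing sum_j w_j * |beta_j| is equivalent to an ordinary Lasso on
# the rescaled design X_j / w_j, followed by rescaling the coefficients.
X_scaled = X / w
fit = Lasso(alpha=0.1).fit(X_scaled, y)
beta_hat = fit.coef_ / w

print("selected predictors:", np.flatnonzero(beta_hat != 0)[:20])
```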


Authors who are presenting talks have a * after their name.
