
Abstract Details

Activity Number: 519 - Sparse Statistical Learning
Type: Contributed
Date/Time: Wednesday, August 2, 2017 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #322457
Title: Decorrelation of Covariates for High-Dimensional Sparse Regression
Author(s): Yuan Ke* and Jianqing Fan and Kaizheng Wang
Companies: Princeton University and Princeton University and Princeton University
Keywords: High dimension ; Model selection consistency ; Correlated covariates ; Factor model ; Regularized M estimator
Abstract:

This paper studies model selection consistency for high-dimensional sparse regression with correlated covariates for a general class of regularized $M$-estimators. Commonly used model selection methods fail to consistently recover the true model when the covariates are not weakly correlated. This paper proposes a consistent model selection strategy named Factor Adjusted Decorrelation (FAD) for high-dimensional sparse regression when the covariate dependence can be reduced through factor models. By separating the latent factors from the idiosyncratic components, we transform the problem from model selection with highly correlated covariates to one with weakly correlated variables. We show that FAD can achieve model selection consistency as well as optimal rates of convergence under mild conditions. Numerical studies show that FAD has good finite-sample performance in terms of both model selection and out-of-sample prediction. Moreover, FAD is a flexible method in the sense that it pays no price in the weakly correlated and uncorrelated cases. The proposed method is applicable to a wide range of high-dimensional sparse regression problems.
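To illustrate the decorrelation idea described above, the sketch below assumes an approximate factor structure X = F B' + U, estimates the latent factors F by principal components, projects them out of both the covariates and the response, and then fits a Lasso (one member of the regularized M-estimator class) on the weakly correlated idiosyncratic components. This is only a minimal sketch; the function name factor_adjusted_lasso, the PCA-based factor estimates, and the tuning values are illustrative assumptions, not the authors' exact FAD procedure.

# Minimal illustrative sketch (assumptions noted above), not the authors' FAD implementation.
import numpy as np
from sklearn.linear_model import Lasso

def factor_adjusted_lasso(X, y, n_factors=3, alpha=0.1):
    """Remove leading principal-component factors from X and y, then run Lasso."""
    # Center the design matrix.
    Xc = X - X.mean(axis=0)
    # Estimate latent factors by the leading principal components (via SVD).
    U_svd, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    F = U_svd[:, :n_factors] * s[:n_factors]   # estimated factors, n x K
    B = Vt[:n_factors, :]                      # estimated loadings, K x p
    # Idiosyncratic components: residuals after removing the common factors.
    U_idio = Xc - F @ B
    # Project the common factors out of the (centered) response as well.
    gamma, *_ = np.linalg.lstsq(F, y - y.mean(), rcond=None)
    y_resid = y - y.mean() - F @ gamma
    # Sparse regression on the weakly correlated idiosyncratic components.
    model = Lasso(alpha=alpha).fit(U_idio, y_resid)
    return model.coef_

# Toy example: highly correlated covariates generated from a one-factor model.
rng = np.random.default_rng(0)
n, p = 200, 500
f = rng.normal(size=(n, 1))
X = f @ rng.normal(size=(1, p)) + rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + rng.normal(size=n)
print(np.nonzero(factor_adjusted_lasso(X, y, n_factors=1))[0][:10])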


Authors who are presenting talks have a * after their name.


 
 