Abstract Details

Activity Number: 305 - Bayesian Modeling and Variable Selection Methods
Type: Contributed
Date/Time: Tuesday, July 30, 2019, 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Computing
Abstract #304607
Title: Implicit Regularization via Hadamard Product Parametrization in Linear Regression
Author(s): Peng Zhao* and Yun Yang and Qiao-chu He
Companies: Florida State University and University of Illinois Urbana-Champaign and Southern University of Science and Technology
Keywords: Implicit regularization; High dimensional regression; Early stopping; Gradient descent; Over-parametrization; Variable selection
Abstract:

We consider Hadamard product parametrization as a change-of-variable (over-parametrization) technique for solving least squares problems in the context of linear regression. Despite the non-convexity and exponentially many saddle points induced by the change of variables, we show that under certain conditions this over-parametrization leads to implicit regularization: if we directly apply gradient descent to the residual sum of squares with sufficiently small initial values, then under a proper early stopping rule the iterates converge to a nearly sparse, rate-optimal solution with better accuracy than explicitly regularized approaches. In particular, the resulting estimator does not suffer from the extra bias introduced by explicit penalties, and can achieve the parametric root-n rate (independent of the dimension) under proper conditions on the signal-to-noise ratio. We perform simulations comparing our method with high-dimensional linear regression methods based on explicit regularization. Our results illustrate the advantages of implicit regularization via gradient descent after over-parametrization for sparse vector estimation.
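For concreteness, the Python sketch below illustrates the procedure described above under stated assumptions: the coefficient vector is over-parametrized as beta = u * v (elementwise Hadamard product), plain gradient descent is run on the residual sum of squares from a small constant initialization, and iterations are stopped early using a held-out validation set. The initialization scale, step size, and stopping rule here are illustrative choices, not the authors' exact settings.

import numpy as np

def hadamard_gd(X, y, alpha=1e-4, lr=0.1, max_iter=5000, X_val=None, y_val=None):
    """Gradient descent on RSS/(2n) after the change of variables beta = u * v.
    alpha: small initialization scale; lr: step size (both assumed values)."""
    n, p = X.shape
    u = np.full(p, alpha)              # sufficiently small initial values
    v = np.full(p, alpha)
    best_beta, best_err = u * v, np.inf
    for _ in range(max_iter):
        beta = u * v
        grad = X.T @ (X @ beta - y) / n   # gradient of RSS/(2n) w.r.t. beta
        grad_u, grad_v = grad * v, grad * u   # chain rule through beta = u * v
        u = u - lr * grad_u
        v = v - lr * grad_v
        if X_val is not None:
            # early stopping: keep the iterate with smallest held-out error
            # (one possible stopping rule, assumed for illustration)
            err = np.mean((X_val @ (u * v) - y_val) ** 2)
            if err < best_err:
                best_err, best_beta = err, (u * v).copy()
    return best_beta if X_val is not None else u * v

# Purely illustrative usage on synthetic sparse data:
rng = np.random.default_rng(0)
n, p, s = 200, 500, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p); beta_true[:s] = 3.0
y = X @ beta_true + rng.standard_normal(n)
beta_hat = hadamard_gd(X[:150], y[:150], X_val=X[150:], y_val=y[150:])

With a small initialization, entries of u * v outside the true support stay near zero for many iterations while signal coordinates grow, which is why early stopping yields a nearly sparse solution without an explicit penalty.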


Authors who are presenting talks have a * after their name.
