
Abstract Details

Activity Number: 261 - High-Dimensional Statistical Inference Meets Large-Scale Optimization
Type: Topic Contributed
Date/Time: Tuesday, August 4, 2020, 1:00 PM to 2:50 PM EDT
Sponsor: IMS
Abstract #314113
Title: Statistical Learning with Stochastic Gradient Flow
Author(s): Edgar Dobriban*
Companies: University of Pennsylvania
Abstract:

We study the implicit regularization of mini-batch stochastic gradient descent when applied to the fundamental problem of least squares regression. We leverage a continuous-time stochastic differential equation with the same moments as stochastic gradient descent, which we call stochastic gradient flow. We give a bound on the excess risk of stochastic gradient flow at time t over ridge regression with tuning parameter lambda = 1/t. The bound can be computed from explicit constants (e.g., the mini-batch size, step size, and number of iterations), revealing precisely how these quantities drive the excess risk. Numerical examples show the bound can be small, indicating a tight relationship between the two estimators. We give a similar result relating the coefficients of stochastic gradient flow and ridge regression. These results hold with no conditions on the data matrix X and across the entire optimization path (not just at convergence).
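
The calibration between stochastic gradient flow at time t and ridge regression with lambda = 1/t can be checked numerically. Below is a minimal simulation sketch (not the authors' code): it runs mini-batch SGD on a synthetic least squares problem and compares the resulting coefficients to the ridge estimator (X'X/n + lambda I)^{-1} X'y/n with lambda = 1/t, taking t = (step size) x (number of iterations) as the elapsed continuous time. The problem sizes, noise level, step size, and batch size below are all illustrative assumptions, not values from the paper.

# Minimal sketch: mini-batch SGD vs. ridge with lambda = 1/t.
# All sizes, seeds, and tuning values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 20
X = rng.standard_normal((n, p))
beta_star = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta_star + 0.5 * rng.standard_normal(n)

eta = 0.01     # step size (assumed)
batch = 25     # mini-batch size (assumed)
iters = 2000   # number of SGD iterations (assumed)

# Mini-batch SGD on the objective (1/(2n)) * ||y - X beta||^2,
# using unbiased mini-batch gradients.
beta = np.zeros(p)
for _ in range(iters):
    idx = rng.choice(n, size=batch, replace=False)
    beta -= eta * (X[idx].T @ (X[idx] @ beta - y[idx]) / batch)

# Ridge regression calibrated to the elapsed continuous time t = eta * iters,
# i.e. lambda = 1/t, with estimator (X'X/n + lambda I)^{-1} X'y/n.
t = eta * iters
lam = 1.0 / t
beta_ridge = np.linalg.solve(X.T @ X / n + lam * np.eye(p), X.T @ y / n)

print("||beta_sgd - beta_ridge|| =", np.linalg.norm(beta - beta_ridge))
print("||beta_sgd - beta_star||  =", np.linalg.norm(beta - beta_star))

In this setup the SGD and ridge coefficient vectors should be close, in line with the abstract's claim; sweeping the iteration count traces out the ridge path lambda = 1/t, rather than agreement only at convergence.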


Authors who are presenting talks have a * after their name.
