Abstract Details

Activity Number: 559 - Randomized Algorithms for Optimization Problems in Statistics
Type: Topic Contributed
Date/Time: Wednesday, July 31, 2019, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #: 301803
Title: Statistical Properties of Stochastic Gradient Descent
Author(s): Panagiotis Toulis and Jerry Chee*
Companies: University of Chicago Booth School of Business and University of Chicago
Keywords: stochastic gradients; iterative estimation; optimization; inference
Abstract:

Stochastic gradient descent (SGD) is remarkably multi-faceted: for machine learners it is a powerful optimization method, while for statisticians it is a method for iterative estimation. Much is known about SGD as an optimization procedure, but little is known about its statistical properties. We will review recent results that include analytic formulas for the asymptotic covariance matrix of SGD-based estimators and a numerically stable variant of SGD with implicit updates. Together these results open up the possibility of principled statistical analysis with SGD, including classical inference and hypothesis testing. On inference specifically, we present current work showing that with an appropriate choice of learning rate the resulting confidence intervals are simple and parameter-free. This is a unique and remarkable property of SGD, even compared to popular estimation methods favored by statisticians, such as maximum likelihood, and it highlights the untapped potential of SGD for fast and principled estimation with large data sets.
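As a minimal sketch of the two update rules contrasted in the abstract (not the authors' code, and with an assumed learning-rate schedule gamma0/(1+i)), the snippet below compares standard (explicit) SGD with the implicit-update variant for least-squares estimation on simulated data; for the linear model the implicit update has the closed form used here.

```python
# Hypothetical illustration: explicit vs. implicit SGD for least squares.
import numpy as np

rng = np.random.default_rng(0)
n, p = 10_000, 5
theta_true = rng.normal(size=p)
X = rng.normal(size=(n, p))
y = X @ theta_true + rng.normal(size=n)

def sgd(X, y, implicit=False, gamma0=1.0):
    theta = np.zeros(X.shape[1])
    for i in range(len(y)):
        gamma = gamma0 / (1 + i)                 # decaying learning rate (assumed schedule)
        x_i = X[i]
        resid = y[i] - x_i @ theta
        if implicit:
            # Implicit update solves theta_n = theta_{n-1} + gamma * x_i * (y_i - x_i' theta_n);
            # for the linear model this reduces to a shrunken explicit step, which is
            # numerically stable even for large gamma0.
            theta = theta + (gamma / (1 + gamma * x_i @ x_i)) * resid * x_i
        else:
            # Standard (explicit) SGD update; can be unstable if gamma0 is too large.
            theta = theta + gamma * resid * x_i
    return theta

print("explicit SGD error:", np.linalg.norm(sgd(X, y) - theta_true))
print("implicit SGD error:", np.linalg.norm(sgd(X, y, implicit=True) - theta_true))
```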


Authors who are presenting talks have a * after their name.