
Abstract Details

Activity Number: 220 - Uncertainty Quantification for Stochastic Optimization Methods in Machine Learning
Type: Invited
Date/Time: Monday, July 29, 2019 : 2:00 PM to 3:50 PM
Sponsor: IMS
Abstract #300067
Title: Data-Adaptive Learning Rate Selection for Stochastic Gradient Descent Using Convergence Diagnostic
Author(s): Matteo Sordello* and Weijie Su
Companies: University of Pennsylvania and University of Pennsylvania
Keywords: learning rate; adaptive; phase transition

We propose a novel diagnostic method to detect the phase transition, from transient to stationary, in Stochastic Gradient Descent (SGD) with a constant learning rate. It combines ideas from the Pflug diagnostic introduced by Chee and Toulis (2018) and the splitting strategy proposed by Su and Zhu (2018). We use this diagnostic in an SGD loop where, each time stationarity is detected, the learning rate is decreased by a factor gamma. We prove theoretical guarantees for the asymptotic validity of this procedure and show through simulations that it improves on several existing optimization techniques. In particular, it tolerates less precise tuning of the learning rate, which is usually of critical importance in iterative methods.
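A minimal sketch of the loop described above, assuming a Pflug-type stationarity diagnostic (a running sum of inner products of successive stochastic gradients, which has positive expectation during the transient phase and negative expectation once the iterates oscillate around a stationary point). The function names, burn-in length, and stopping threshold below are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def sgd_with_stationarity_diagnostic(grad_fn, x0, lr=0.5, gamma=0.5,
                                     burn_in=20, max_iters=5000,
                                     lr_min=1e-4, rng=None):
    """Constant-rate SGD; when a Pflug-type diagnostic signals
    stationarity, shrink the learning rate by `gamma` and reset
    the diagnostic. Illustrative sketch only."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    s, count, g_prev = 0.0, 0, None
    for _ in range(max_iters):
        g = grad_fn(x, rng)
        x = x - lr * g
        if g_prev is not None:
            s += g @ g_prev  # running sum of successive-gradient inner products
            count += 1
        g_prev = g
        # After a burn-in, a negative sum suggests the iterates are
        # bouncing around a stationary point: decay the learning rate.
        if count >= burn_in and s < 0:
            lr *= gamma
            s, count, g_prev = 0.0, 0, None
            if lr < lr_min:
                break
    return x, lr

def noisy_quadratic_grad(x, rng):
    # Stochastic gradient of 0.5 * ||x - 1||^2 with Gaussian noise,
    # standing in for a single-sample gradient estimate.
    return (x - np.ones_like(x)) + rng.normal(scale=0.5, size=x.shape)
```

On this toy quadratic, the diagnostic's sum stays positive while the iterates descend toward the minimizer and turns negative once they reach the noise floor for the current rate, so the schedule decays geometrically without hand-tuned decay times.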

Authors who are presenting talks have a * after their name.
