Abstract Details

Activity Number: 559 - Randomized Algorithms for Optimization Problems in Statistics
Type: Topic Contributed
Date/Time: Wednesday, July 31, 2019, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #305136
Title: Understanding the Acceleration Phenomenon via High-Resolution Differential Equations
Author(s): Weijie Su*
Companies: University of Pennsylvania
Keywords: Convex optimization; first-order method; Nesterov’s accelerated gradient methods; ordinary differential equation; Lyapunov function
Abstract:

Gradient-based optimization algorithms can be studied from the perspective of limiting ordinary differential equations (ODEs). Motivated by the fact that existing ODEs do not distinguish between two fundamentally different algorithms, Nesterov's accelerated gradient method for strongly convex functions (NAG-SC) and Polyak's heavy-ball method, we study an alternative limiting process that yields high-resolution ODEs. We show that these ODEs permit a general Lyapunov function framework for the analysis of convergence in both continuous and discrete time. We also show that these ODEs are more accurate surrogates for the underlying algorithms; in particular, they not only distinguish between NAG-SC and Polyak's heavy-ball method, but they also allow the identification of a term, which we refer to as the "gradient correction," that is present in NAG-SC but not in the heavy-ball method and is responsible for the qualitative difference in convergence between the two methods. Finally, by modifying the high-resolution ODE of Nesterov's accelerated gradient method for (general) convex functions (NAG-C), we obtain a family of new optimization methods that are shown to maintain the accelerated convergence rates of NAG-C for smooth convex functions.
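For concreteness, under one standard parametrization with step size s and strong-convexity parameter mu, NAG-SC can be written in single-sequence form as x_{k+1} = x_k + alpha (x_k - x_{k-1}) - s grad f(x_k) - alpha s (grad f(x_k) - grad f(x_{k-1})), with alpha = (1 - sqrt(mu s)) / (1 + sqrt(mu s)); the final term is the gradient correction, and dropping it recovers the heavy-ball update. The following Python sketch contrasts the two iterations; the test quadratic and all parameter values are illustrative assumptions, not anything specified in the abstract.

import numpy as np

def run(grad, x0, s, mu, n_iters, gradient_correction):
    # Single-sequence iteration shared by both methods; the optional
    # final term is the gradient correction that distinguishes NAG-SC
    # from Polyak's heavy-ball method.
    alpha = (1 - np.sqrt(mu * s)) / (1 + np.sqrt(mu * s))
    x_prev, x = x0.copy(), x0.copy()
    g_prev = grad(x_prev)
    for _ in range(n_iters):
        g = grad(x)
        x_next = x + alpha * (x - x_prev) - s * g
        if gradient_correction:
            # Discrete time-derivative of the gradient (NAG-SC only).
            x_next = x_next - alpha * s * (g - g_prev)
        x_prev, x, g_prev = x, x_next, g
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 * x^T A x,
# with mu = 1 (smallest eigenvalue of A) and L = 100.
A = np.diag([1.0, 100.0])
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])
s = 1.0 / 400.0  # a conservative step size, below 1/L

x_hb = run(grad, x0, s, mu=1.0, n_iters=500, gradient_correction=False)
x_nag = run(grad, x0, s, mu=1.0, n_iters=500, gradient_correction=True)
print("heavy-ball distance to optimum:", np.linalg.norm(x_hb))
print("NAG-SC     distance to optimum:", np.linalg.norm(x_nag))

On this example both iterations converge, but the run with the gradient correction contracts faster, consistent with the qualitative difference the abstract describes.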


Authors who are presenting talks have a * after their name.
