Abstract Details

Activity Number: 177 - Big Data and Computationally Intensive Methods
Type: Contributed
Date/Time: Monday, July 29, 2019, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Computing
Abstract #303061
Title: Damped Anderson Acceleration with Restarts and Monotonicity Control for Accelerating EM and EM-Like Algorithms
Author(s): Nicholas Henderson* and Ravi Varadhan
Companies: Johns Hopkins University
Keywords: algorithm restarts; quasi-Newton; acceleration; MM algorithm
Abstract:

The expectation-maximization (EM) algorithm is a well-known iterative method for computing maximum likelihood estimates in a variety of statistical problems. Despite its numerous advantages, a main drawback of the EM algorithm is its frequently slow convergence, which often hinders its application in high-dimensional problems or other complex settings. To address the need for more rapidly convergent EM algorithms, we describe a new class of acceleration schemes that build on the Anderson acceleration technique for speeding up fixed-point iterations. Our approach greatly accelerates the convergence of EM algorithms and scales automatically to high-dimensional settings. Through the introduction of periodic algorithm restarts and a damping factor, our acceleration scheme converges faster and more robustly than unmodified Anderson acceleration, while also improving global convergence. Crucially, our method works "off the shelf": it can be applied directly to accelerate any EM algorithm without relying on any model-specific features or insights.
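
To make the ingredients named in the abstract concrete, here is a minimal sketch of damped Anderson acceleration with periodic restarts and a monotonicity fallback, wrapped around a generic fixed-point (EM-style) update. This is an illustration of the general technique under stated assumptions, not the authors' exact algorithm; the function anderson_em and its parameters (m, restart_every, damping) are hypothetical names chosen for the sketch.

```python
import numpy as np

def anderson_em(fixed_point, x0, objective=None, m=5, restart_every=20,
                damping=0.5, max_iter=500, tol=1e-8):
    """Sketch: damped, restarted (type-II) Anderson acceleration of a
    fixed-point map G (e.g., an EM parameter update), with an optional
    monotonicity fallback to the plain EM step. Illustrative only."""
    x = np.asarray(x0, dtype=float)
    X_hist, F_hist = [], []                        # iterate / residual history
    for k in range(max_iter):
        gx = fixed_point(x)                        # one plain EM step G(x)
        f = gx - x                                 # fixed-point residual
        if np.linalg.norm(f) < tol:
            return gx, k
        X_hist.append(x.copy())
        F_hist.append(f.copy())
        if len(F_hist) > m + 1:                    # cap history at m differences
            X_hist.pop(0)
            F_hist.pop(0)
        if k > 0 and k % restart_every == 0:       # periodic restart: drop history
            X_hist, F_hist = [x.copy()], [f.copy()]
        if len(F_hist) < 2:
            x = gx                                 # too little history: plain EM step
            continue
        # least-squares coefficients against the residual differences
        dF = np.column_stack([F_hist[i + 1] - F_hist[i]
                              for i in range(len(F_hist) - 1)])
        dX = np.column_stack([X_hist[i + 1] - X_hist[i]
                              for i in range(len(X_hist) - 1)])
        gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
        # damped (type-II) Anderson update
        x_acc = x + damping * f - (dX + damping * dF) @ gamma
        # monotonicity control: reject the accelerated step if it lowers
        # the objective (e.g., log-likelihood) relative to the EM step
        if objective is not None and objective(x_acc) < objective(gx):
            x = gx
        else:
            x = x_acc
    return x, max_iter

# Toy usage: accelerate the linear contraction x -> A x + b.
rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = 0.95 * Q / np.linalg.norm(Q, 2)                # spectral norm < 1: G contracts
b = rng.standard_normal(50)
x_star, iters = anderson_em(lambda x: A @ x + b, np.zeros(50))
print(iters, np.linalg.norm(x_star - (A @ x_star + b)))
```

Note that with no history, or immediately after a restart, the scheme reduces to the plain EM map, so both the restarts and the monotonicity fallback cost at most one extra objective evaluation per iteration.
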


Authors who are presenting talks have a * after their name.
