

Abstract Details

Activity Number: 306 - SPEED: SPAAC SESSION II
Type: Topic-Contributed
Date/Time: Wednesday, August 11, 2021, 3:30 PM to 5:20 PM EDT
Sponsor: Section on Bayesian Statistical Science
Abstract #318214
Title: Pathfinder: A parallel quasi-Newton algorithm for reaching regions of high probability mass
Author(s): Lu Zhang* and Bob Carpenter and Aki Vehtari and Andrew Gelman
Companies: Columbia University and Flatiron Institute and Aalto University and Columbia University
Keywords: Quasi-Newton Optimization; Laplace Approximation; Variational Inference; Markov chain Monte Carlo; Importance Resampling; Wasserstein Distance
Abstract:

We introduce Pathfinder, an approximate method for sampling from distributions with differentiable log densities. Starting from a random initialization, Pathfinder locates normal approximations to the target density along a quasi-Newton optimization path, with local covariance estimated using the inverse Hessian estimates produced by the optimizer. The approximation with the lowest Kullback-Leibler divergence to the true posterior is selected. Importance resampling over multiple runs of Pathfinder improves the diversity of approximate draws, providing a measure of robustness to optimization failures on plateaus or in minor modes. Experiments on a wide range of posterior distributions demonstrate that the approximate draws from Pathfinder range from slightly worse to much better than those from automatic differentiation variational inference or from the adaptation phase of dynamic Hamiltonian Monte Carlo, as measured by 1-Wasserstein distance. At the same time, Pathfinder requires one to three orders of magnitude fewer log density and gradient evaluations, and its implementation can be further accelerated through parallelization.
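
To make the construction concrete, the following is a minimal Python (NumPy/SciPy) sketch of the single-path and multi-path ideas described above, not the authors' implementation. The names log_density, grad, single_path, multi_path, and finite_difference_hessian are invented for this sketch. Two simplifications are labeled in the comments: the local covariance comes from a dense finite-difference Hessian at each iterate rather than the low-rank inverse-Hessian estimates maintained by L-BFGS, and the KL criterion is replaced by a Monte Carlo ELBO estimate (selecting the highest ELBO is equivalent to selecting the lowest KL divergence to the target up to the unknown normalizing constant). The importance resampling over pooled draws uses plain normalized weights.

import numpy as np
from scipy.optimize import minimize


def finite_difference_hessian(grad, x, eps=1e-5):
    # Hessian of the log density via central differences of its gradient
    # (simplification: the paper uses the optimizer's own inverse-Hessian estimates).
    d = x.size
    H = np.zeros((d, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        H[:, i] = (grad(x + e) - grad(x - e)) / (2.0 * eps)
    return 0.5 * (H + H.T)  # symmetrize


def single_path(log_density, grad, x0, n_draws=100, rng=None):
    # One Pathfinder-style run: follow an L-BFGS path, form a normal approximation
    # at each iterate, and keep the one with the highest Monte Carlo ELBO.
    rng = rng or np.random.default_rng()
    x0 = np.asarray(x0, dtype=float)
    path = [x0]
    minimize(lambda x: -log_density(x), x0, jac=lambda x: -grad(x),
             method="L-BFGS-B", callback=lambda xk: path.append(xk.copy()))

    best = None
    for mu in path:
        # Local approximation N(mu, Sigma) with Sigma = (-Hessian)^(-1).
        neg_hess = -finite_difference_hessian(grad, mu)
        try:
            L = np.linalg.cholesky(np.linalg.inv(neg_hess))
        except np.linalg.LinAlgError:
            continue  # not positive definite at this point on the path; skip it
        z = rng.standard_normal((n_draws, mu.size))
        draws = mu + z @ L.T
        log_q = (-0.5 * np.sum(z**2, axis=1)
                 - np.sum(np.log(np.diag(L)))
                 - 0.5 * mu.size * np.log(2.0 * np.pi))
        log_p = np.array([log_density(d) for d in draws])
        elbo = np.mean(log_p - log_q)  # Monte Carlo ELBO estimate
        if best is None or elbo > best[0]:
            best = (elbo, draws, log_p - log_q)
    return best  # (elbo, draws, log importance ratios), or None if no valid point


def multi_path(log_density, grad, inits, n_final=100, seed=0):
    # Pool draws from several independent runs and importance-resample them.
    rng = np.random.default_rng(seed)
    draws, log_ratios = [], []
    for x0 in inits:
        result = single_path(log_density, grad, x0, rng=rng)
        if result is not None:
            draws.append(result[1])
            log_ratios.append(result[2])
    draws = np.vstack(draws)
    logw = np.concatenate(log_ratios)
    w = np.exp(logw - logw.max())
    idx = rng.choice(len(w), size=n_final, replace=True, p=w / w.sum())
    return draws[idx]

Called as multi_path(log_density, grad, inits=list_of_random_initializations), the sketch returns approximate posterior draws after importance resampling. The single_path calls are independent of one another, which is where the parallel acceleration mentioned in the abstract comes from.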


Authors who are presenting talks have a * after their name.
