Abstract Details

Activity Number: 74 - Invited E-Poster Session I
Type: Invited
Date/Time: Sunday, August 7, 2022, 8:30 PM to 9:25 PM EDT
Sponsor: Section for Statistical Programmers and Analysts
Abstract #322731
Title: Pathfinder: Parallel Quasi-Newton Variational Inference
Author(s): Lu Zhang* and Bob Carpenter and Aki Vehtari and Andrew Gelman
Companies: Columbia University and Flatiron Institute and Aalto University and Columbia University
Keywords: variational inference; quasi-Newton optimization; Laplace approximation; importance resampling
Abstract:

We introduce Pathfinder, a variational method for approximately sampling from differentiable log densities. Starting from a random initialization, Pathfinder locates normal approximations to the target density along a quasi-Newton optimization path, with local covariance estimated using the inverse Hessian estimates produced by the optimizer. Pathfinder returns draws from the approximation with the lowest estimated Kullback-Leibler (KL) divergence to the true posterior. We evaluate Pathfinder on a wide range of posterior distributions, demonstrating that its approximate draws are better than those from automatic differentiation variational inference and comparable to those produced by short chains of dynamic Hamiltonian Monte Carlo, as measured by 1-Wasserstein distance. Importance resampling over multiple runs of Pathfinder improves the diversity of approximate draws. The Monte Carlo KL-divergence estimates are embarrassingly parallelizable in the core Pathfinder algorithm, as are multiple runs in the resampling version, further increasing Pathfinder's speed advantage with multiple cores.
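
Since the abstract compresses the whole algorithm into a few sentences, a small sketch may help make the steps concrete. The Python/NumPy code below is a minimal illustrative sketch, not the authors' implementation: the names pathfinder_sketch and multipath_sketch and all parameters are hypothetical, it substitutes an explicit finite-difference Hessian at each path point for the paper's low-rank L-BFGS inverse-Hessian estimates, and it applies plain importance resampling over the pooled runs.

import numpy as np
from scipy.optimize import minimize

def pathfinder_sketch(logp, grad_logp, x0, n_kl=30, n_draws=100, rng=None, eps=1e-4):
    # Single-path Pathfinder sketch: returns draws and their log-density under q.
    rng = rng if rng is not None else np.random.default_rng()
    path = [np.array(x0, dtype=float)]
    # Step 1: follow a quasi-Newton (L-BFGS) optimization path toward the mode,
    # recording every iterate with a callback.
    minimize(lambda x: -logp(x), path[0], jac=lambda x: -grad_logp(x),
             method="L-BFGS-B", callback=lambda xk: path.append(np.copy(xk)))
    best = None
    for mu in path:
        d = mu.size
        # Step 2: local normal approximation N(mu, Sigma), with Sigma the inverse
        # Hessian of -logp at mu (finite differences of the gradient; the paper
        # instead reuses the optimizer's low-rank inverse-Hessian estimates).
        H = np.empty((d, d))
        for i in range(d):
            e = np.zeros(d); e[i] = eps
            H[:, i] = (grad_logp(mu - e) - grad_logp(mu + e)) / (2.0 * eps)
        H = 0.5 * (H + H.T)
        try:
            L = np.linalg.cholesky(np.linalg.inv(H))
        except np.linalg.LinAlgError:
            continue  # curvature not positive definite here; skip this point
        # Step 3: Monte Carlo ELBO = E_q[log p - log q]. Maximizing the ELBO
        # minimizes KL(q || p), and it needs logp only up to a constant.
        z = rng.standard_normal((n_kl, d))
        xs = mu + z @ L.T
        log_q = (-0.5 * np.sum(z * z, axis=1) - np.sum(np.log(np.diag(L)))
                 - 0.5 * d * np.log(2.0 * np.pi))
        elbo = np.mean(np.array([logp(x) for x in xs]) - log_q)
        if best is None or elbo > best[0]:
            best = (elbo, mu, L)
    # Step 4: sample from the approximation with the lowest estimated KL.
    _, mu, L = best
    z = rng.standard_normal((n_draws, mu.size))
    draws = mu + z @ L.T
    log_q = (-0.5 * np.sum(z * z, axis=1) - np.sum(np.log(np.diag(L)))
             - 0.5 * mu.size * np.log(2.0 * np.pi))
    return draws, log_q

def multipath_sketch(logp, grad_logp, inits, n_final=100, rng=None):
    # Multi-path variant: independent runs (embarrassingly parallel) are pooled
    # and importance-resampled toward the target.
    rng = rng if rng is not None else np.random.default_rng()
    pieces = [pathfinder_sketch(logp, grad_logp, x0, rng=rng) for x0 in inits]
    draws = np.concatenate([d for d, _ in pieces])
    log_q = np.concatenate([q for _, q in pieces])
    log_w = np.array([logp(x) for x in draws]) - log_q  # importance weights
    w = np.exp(log_w - log_w.max())
    idx = rng.choice(len(draws), size=n_final, replace=True, p=w / w.sum())
    return draws[idx]

Note that the ELBO comparison in step 3 uses the log density only up to an additive constant, which is why the method can target unnormalized posteriors, and that each run and each Monte Carlo KL estimate is independent of the others, which is the parallelism the abstract highlights.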


Authors who are presenting talks have a * after their name.
