
Abstract Details

Activity Number: 126 - Topics at the Frontier of Statistical Computing and Machine Learning
Type: Invited
Date/Time: Monday, August 8, 2022, 10:30 AM to 12:20 PM EDT
Sponsor: Section on Bayesian Statistical Science
Abstract #319294
Title: Sparse Hamiltonian Flows (Or Bayesian Coresets Without All the Fuss)
Author(s): Trevor Campbell* and Naitong Chen and Zuheng Xu
Companies: University of British Columbia, Vancouver, BC
Keywords: Bayesian; coresets; Hamiltonian flows; variational inference
Abstract:

Bayesian inference provides a coherent approach to learning from data in complex models. However, algorithms for performing inference have not yet caught up to the deluge of data in modern applications. One approach---Bayesian coresets---replaces the large dataset with a small, weighted subset of data (a coreset) during inference. Although the methodology is sound in principle, efficiently constructing a coreset remains a significant challenge: current methods tend to be complicated to implement and slow, and they require a secondary inference step. In this talk, I will introduce a new method---sparse Hamiltonian flows---that addresses all of these challenges. The method first subsamples the data uniformly and then optimizes a Hamiltonian flow that is parametrized by the coreset weights and includes periodic momentum quasi-refreshment steps. I will present results showing that the method enables exponential compression of the dataset in representative models, along with experiments demonstrating that sparse Hamiltonian flows provide accurate posterior approximations with significantly reduced runtime compared with competing dynamical-system-based inference methods.
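
To make the description above concrete, here is a minimal sketch (in JAX) of the three ingredients the abstract names: uniform subsampling of the data, a Hamiltonian (leapfrog) flow whose potential is weighted by learnable coreset weights, and periodic momentum quasi-refreshment steps with tractable log-Jacobians, trained by maximizing an ELBO against the full-data posterior. The toy logistic regression model, the coreset size, the diagonal-scaling form of the quasi-refreshment, and the plain gradient-descent training loop are illustrative assumptions made here, not the authors' implementation.

```python
# Illustrative sketch of a sparse-Hamiltonian-flow-style coreset construction.
# Model, sizes, and quasi-refreshment form are assumptions for this toy example.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)

# Toy data: Bayesian logistic regression with a standard normal prior (assumed).
N, D, M, K = 10_000, 5, 30, 3          # full data size, dimension, coreset size, flow blocks
key, k1, k2 = jax.random.split(key, 3)
X = jax.random.normal(k1, (N, D))
y = (jax.random.uniform(k2, (N,)) < jax.nn.sigmoid(X @ jnp.ones(D))).astype(jnp.float32)

def log_lik(theta, x, t):              # per-datum log-likelihood
    z = x @ theta
    return t * jax.nn.log_sigmoid(z) + (1.0 - t) * jax.nn.log_sigmoid(-z)

def log_prior(theta):
    return -0.5 * jnp.sum(theta ** 2)

def full_log_post(theta):              # full-data log-posterior (the training target)
    return log_prior(theta) + jnp.sum(jax.vmap(log_lik, (None, 0, 0))(theta, X, y))

# Step 1: uniformly subsample M points; only these ever enter the flow dynamics.
key, k3 = jax.random.split(key)
idx = jax.random.choice(k3, N, (M,), replace=False)
Xc, yc = X[idx], y[idx]

def coreset_log_post(theta, w):        # weighted coreset potential
    return log_prior(theta) + jnp.sum(w * jax.vmap(log_lik, (None, 0, 0))(theta, Xc, yc))

grad_U = jax.grad(lambda th, w: -coreset_log_post(th, w))

def leapfrog(theta, rho, w, eps, L=5): # volume-preserving, so it adds nothing to the log-det
    rho = rho - 0.5 * eps * grad_U(theta, w)
    for _ in range(L - 1):
        theta = theta + eps * rho
        rho = rho - eps * grad_U(theta, w)
    theta = theta + eps * rho
    rho = rho - 0.5 * eps * grad_U(theta, w)
    return theta, rho

# Step 2: the flow = K blocks of leapfrog steps, each followed by a momentum
# quasi-refreshment (here a learnable diagonal scaling with a tractable log-Jacobian).
def flow(theta, rho, params):
    w, eps, scales = params
    logdet = 0.0
    for s in scales:
        theta, rho = leapfrog(theta, rho, w, eps)
        rho = s * rho
        logdet += jnp.sum(jnp.log(jnp.abs(s)))
    return theta, rho, logdet

# Step 3: train weights, step size, and refreshment scales by maximizing an ELBO
# against the momentum-augmented full-data posterior.
def neg_elbo(params, key, n_samples=8):
    k1, k2 = jax.random.split(key)
    th0 = jax.random.normal(k1, (n_samples, D))    # standard normal reference draws
    rh0 = jax.random.normal(k2, (n_samples, D))
    def kl_term(th, rh):
        thK, rhK, logdet = flow(th, rh, params)
        # log-densities up to additive constants (constants do not affect gradients)
        log_target = full_log_post(thK) - 0.5 * jnp.sum(rhK ** 2)
        log_q = -0.5 * jnp.sum(th ** 2) - 0.5 * jnp.sum(rh ** 2) - logdet
        return log_q - log_target
    return jnp.mean(jax.vmap(kl_term)(th0, rh0))

params = (jnp.full(M, N / M),          # coreset weights, initialized to N/M
          jnp.array(0.01),             # leapfrog step size
          jnp.ones((K, D)))            # quasi-refreshment scales
step_fn = jax.jit(jax.value_and_grad(neg_elbo))
for i in range(200):                   # plain gradient descent, for simplicity
    key, sub = jax.random.split(key)
    loss, grads = step_fn(params, sub)
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)
```

After training, approximate posterior draws are obtained by sampling (theta0, rho0) from the standard normal reference and pushing them through flow; only the M subsampled points ever enter the dynamics, which is where the runtime savings described in the abstract come from.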


Authors who are presenting talks have a * after their name.
