

Abstract Details

Activity Number: 2 - Introductory Overview Lecture: Scalable Bayesian Inference
Type: Invited
Date/Time: Monday, August 3, 2020, 10:00 AM to 11:50 AM (EDT)
Sponsor: JSM Partner Societies
Abstract #309644
Title: Scaling and Generalizing Approximate Bayesian Inference
Author(s): David Blei*
Companies: Columbia University
Abstract:

A core problem in statistics is to approximate difficult-to-compute probability distributions. This problem is especially important in Bayesian statistics, which frames all inferences as a calculation about a conditional distribution. Here I review and discuss innovations in variational inference (VI), a method that uses optimization to approximate probability distributions. VI has been used in myriad applications of Bayesian statistics. It tends to be faster than more traditional methods, such as Markov chain Monte Carlo sampling.
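
As background (standard VI material, not text from the abstract itself): VI posits a family of approximating distributions q(z; lambda) over the latent variables z and maximizes the evidence lower bound (ELBO),

\[
\mathrm{ELBO}(\lambda)
  = \mathbb{E}_{q(z;\lambda)}\bigl[\log p(x, z) - \log q(z;\lambda)\bigr]
  = \log p(x) - \mathrm{KL}\bigl(q(z;\lambda) \,\|\, p(z \mid x)\bigr),
\]

so maximizing the ELBO over lambda is equivalent to minimizing the KL divergence from q to the exact posterior. This is what turns posterior approximation into an optimization problem.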

After reviewing the basics, I will discuss some recent research on VI. I will describe stochastic variational inference, an approximate inference algorithm for handling massive data sets, and demonstrate its application to probabilistic topic models of millions of articles. Then I will discuss black box variational inference, a generic algorithm for approximating the posterior. Black box inference applies easily to many models and requires only minimal mathematical work to derive. I will demonstrate black box inference on deep exponential families, a method for Bayesian deep learning, and describe how it enables tools for probabilistic programming.
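
To make these ideas concrete, below is a minimal sketch in Python with NumPy, applied to a toy conjugate model; the model, variable names, and hyperparameters are illustrative assumptions, not material from the talk. It combines the two ingredients the abstract names: the likelihood is estimated from a rescaled subsampled minibatch (the stochastic-gradient idea behind stochastic variational inference), and the ELBO gradient is estimated with the score-function (REINFORCE) estimator, which needs only log-density evaluations of the model (the idea behind black box variational inference).

# A minimal BBVI/SVI sketch on a toy model. All names and settings here are
# illustrative assumptions, not code from the talk.
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: z ~ Normal(0, 1); x_i | z ~ Normal(z, 1), i = 1..N.
N = 500
x = rng.normal(loc=2.0, scale=1.0, size=N)

def log_joint_estimate(z, batch):
    """Unbiased estimate of log p(x, z): prior plus rescaled minibatch likelihood."""
    log_prior = -0.5 * z ** 2
    log_lik = -0.5 * np.sum((batch - z) ** 2) * (N / len(batch))
    return log_prior + log_lik

# Variational family: q(z; mu, log_sigma) = Normal(mu, sigma^2).
def log_q(z, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

def grad_log_q(z, mu, log_sigma):
    """Score function: gradient of log q w.r.t. the parameters (mu, log_sigma)."""
    sigma = np.exp(log_sigma)
    return np.array([(z - mu) / sigma ** 2, ((z - mu) / sigma) ** 2 - 1.0])

lam = np.array([0.0, 0.0])   # variational parameters (mu, log_sigma)
hist = np.zeros(2)           # accumulator for an AdaGrad-style step size
S, B, lr = 64, 50, 0.5
for step in range(3000):
    mu, log_sigma = lam
    batch = rng.choice(x, size=B, replace=False)        # subsample the data
    zs = mu + np.exp(log_sigma) * rng.standard_normal(S)  # draw from q
    f = np.array([log_joint_estimate(z, batch) - log_q(z, mu, log_sigma) for z in zs])
    scores = np.array([grad_log_q(z, mu, log_sigma) for z in zs])
    # Score-function estimator with a mean baseline for variance reduction.
    grad = np.mean(scores * (f - f.mean())[:, None], axis=0)
    hist += grad ** 2
    lam += lr * grad / np.sqrt(hist + 1e-8)  # noisy ascent on the ELBO

# The toy model is conjugate, so the exact posterior is available to compare:
# Normal(N * xbar / (N + 1), 1 / (N + 1)).
print("BBVI :", lam[0], np.exp(lam[1]))
print("Exact:", x.sum() / (N + 1), np.sqrt(1.0 / (N + 1)))

Because the update touches the model only through evaluations of its log joint, the same loop applies to any model whose log density can be computed, which is the sense in which the inference is "black box"; the exact posterior printed at the end is available here only because the toy model is conjugate.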


Authors who are presenting talks have a * after their name.
