
Abstract Details

Activity Number: 396 - Savage Awards Session
Type: Topic Contributed
Date/Time: Tuesday, July 30, 2019, 2:00 PM to 3:50 PM
Sponsor: International Society for Bayesian Analysis (ISBA)
Abstract #: 306438
Title: Black Box Variational Inference
Author(s): Rajesh Ranganath*
Companies: NYU Courant Institute of Mathematical Sciences
Keywords: Posterior Inference; Scalable Compute
Abstract:

Probabilistic generative models posit hidden structure to describe data; they are robust to noise, uncover unseen patterns, and make predictions about the future. They have addressed problems in neuroscience, astrophysics, genetics, and medicine. The main computational challenge is computing the hidden structure given the data, that is, posterior inference. For most models of interest, computing the posterior distribution requires approximations such as variational inference. Classically, variational inference was feasible to deploy in only a small fraction of models. We develop black box variational inference, which is easy to deploy on a broad class of models and has already found use in neuroscience and healthcare. The ideas behind black box variational inference also enable new kinds of variational methods, such as hierarchical variational models, which improve the approximation quality of variational inference by building higher-fidelity approximations from coarser ones. Black box variational inference opens the door to new models and better posterior approximations.
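To make the idea concrete, below is a minimal sketch of the score-function (REINFORCE) form of black box variational inference on a toy Gaussian model. The specific model, variational family, step size, and baseline are illustrative assumptions, not code from the presentation; the point is that the only model-specific requirement is the ability to evaluate log p(x, z) at samples drawn from q.

# A minimal sketch of black box variational inference (BBVI) with the
# score-function (REINFORCE) gradient estimator. The toy model here
# (Gaussian mean with a Gaussian prior) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

# Model: z ~ Normal(0, 10), x_i ~ Normal(z, 1). BBVI only needs log p(x, z)
# evaluated at samples of z; no model-specific gradients are required.
x = rng.normal(3.0, 1.0, size=50)          # synthetic observed data

def log_joint(z):
    log_prior = -0.5 * z**2 / 10.0**2
    log_lik = -0.5 * np.sum((x[:, None] - z) ** 2, axis=0)
    return log_prior + log_lik

# Variational family: q(z; mu, log_sigma) = Normal(mu, sigma).
def sample_q(mu, log_sigma, n):
    return rng.normal(mu, np.exp(log_sigma), size=n)

def log_q(z, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

def grad_log_q(z, mu, log_sigma):
    # Score function: gradient of log q(z; mu, log_sigma) w.r.t. (mu, log_sigma).
    sigma = np.exp(log_sigma)
    d_mu = (z - mu) / sigma**2
    d_log_sigma = ((z - mu) / sigma) ** 2 - 1.0
    return np.stack([d_mu, d_log_sigma], axis=1)

# BBVI loop: Monte Carlo estimate of the ELBO gradient, then ascend.
mu, log_sigma = 0.0, 0.0
step, n_samples = 1e-3, 200
for t in range(2000):
    z = sample_q(mu, log_sigma, n_samples)
    weights = log_joint(z) - log_q(z, mu, log_sigma)   # ELBO integrand
    centered = weights - weights.mean()                # crude baseline to cut variance
    grad = np.mean(grad_log_q(z, mu, log_sigma) * centered[:, None], axis=0)
    mu += step * grad[0]
    log_sigma += step * grad[1]

print(f"q(z) = Normal({mu:.3f}, {np.exp(log_sigma):.3f})")
# For this conjugate toy model the exact posterior is available, so the fitted
# q can be checked against it; for the broad class of models targeted by BBVI
# only log_joint would change.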


Authors who are presenting talks have a * after their name.
