
Abstract Details

Activity Number: 411
Type: Topic Contributed
Date/Time: Tuesday, August 2, 2016: 2:00 PM to 3:50 PM
Sponsor: International Society for Bayesian Analysis (ISBA)
Abstract #319501
Title: Automatic Variational Inference in Stan
Author(s): Alp Kucukelbir* and Dustin Tran and Rajesh Ranganath and Andrew Gelman and David Blei
Companies: Princeton and Columbia University and Columbia University
Keywords: variational inference; approximate Bayesian inference; probabilistic programming
Abstract:

Variational inference is a scalable technique for approximate Bayesian inference. Deriving variational inference algorithms requires tedious model-specific calculations, which makes the technique difficult for non-experts to use. We propose an automatic variational inference algorithm, automatic differentiation variational inference (ADVI), and implement it in Stan (code available), a probabilistic programming system. With ADVI the user provides only a Bayesian model and a dataset, nothing else. We make no conjugacy assumptions and support a broad class of models. The algorithm automatically determines an appropriate variational family and optimizes the variational objective. We compare ADVI to MCMC sampling across hierarchical generalized linear models, nonconjugate matrix factorization, and a mixture model. We train the mixture model on a quarter million images. With ADVI we can use variational inference on any model we write in Stan.
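To make the workflow concrete, the sketch below runs ADVI on a simple Bayesian linear regression through the CmdStanPy interface to Stan. The model, simulated data, and choice of interface are illustrative assumptions on my part; the abstract only states that ADVI is implemented in Stan and that the user supplies a model and a dataset.

# Minimal sketch (assumptions: CmdStanPy interface, a toy linear-regression
# model, simulated data). Illustrates the ADVI workflow described in the
# abstract: the user writes a Stan model, passes data, and Stan's
# variational method handles the rest automatically.
import numpy as np
from cmdstanpy import CmdStanModel

stan_code = """
data {
  int<lower=0> N;
  vector[N] x;
  vector[N] y;
}
parameters {
  real alpha;
  real beta;
  real<lower=0> sigma;
}
model {
  alpha ~ normal(0, 5);
  beta ~ normal(0, 5);
  sigma ~ normal(0, 1);               // half-normal via the lower bound
  y ~ normal(alpha + beta * x, sigma);
}
"""

with open("linreg.stan", "w") as f:
    f.write(stan_code)

# Simulated data (hypothetical): y = 1 + 2x + noise
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + 0.5 * rng.normal(size=100)
data = {"N": len(x), "x": x.tolist(), "y": y.tolist()}

model = CmdStanModel(stan_file="linreg.stan")

# ADVI: no conjugacy requirements and no model-specific derivations;
# Stan chooses and optimizes a (mean-field) variational approximation.
fit = model.variational(data=data, algorithm="meanfield", seed=1)
print(fit.variational_params_dict)    # variational estimates of alpha, beta, sigma

The same model could instead be fit by MCMC (e.g., model.sample(data=data)) to compare against the variational approximation, which is the kind of comparison the abstract reports for larger models.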


Authors who are presenting talks have a * after their name.

