The central computational problem of Bayesian statistics is posterior inference: approximating the conditional distribution of latent variables given observations. Approximate posterior inference algorithms have revolutionized Bayesian statistics, revealing its potential as a general-purpose language for data analysis.
Bayesian statistics, however, has not yet reached this potential. First, statisticians regularly encounter massive data, but existing approximate inference algorithms do not scale well. Second, most approximate inference algorithms must be tailored to the specific model at hand. This requires significant model-specific analysis, which precludes us from easily exploring a variety of models.
We have addressed these limitations. First, stochastic variational inference is an approximate inference algorithm for handling massive data sets. It opens the door to scalable Bayesian computation for modern data analysis. Second, black box variational inference is a generic algorithm for approximating the posterior. We can apply it to many models with little model-specific derivation and few restrictions on their properties.
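To make the black box idea concrete, here is a minimal sketch of variational inference with score-function (REINFORCE) gradients on a toy conjugate model. The model, parameter names, and step-size schedule are illustrative assumptions, not the implementation described in the talk: the only model-specific code is the `log_joint` function, which is the sense in which the algorithm is "black box."

```python
# Hypothetical sketch of black-box variational inference, NOT the authors'
# implementation. Toy model: z ~ N(0, 1), x_i ~ N(z, 1), with variational
# family q(z) = N(mu, sigma^2). Gradients of the ELBO are estimated with
# the score-function identity, so only log_joint is model-specific.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=100)   # synthetic observations

def log_joint(z):
    """log p(z) + sum_i log p(x_i | z), up to a constant; z may be a vector."""
    return -0.5 * z**2 - 0.5 * ((x[:, None] - z) ** 2).sum(axis=0)

mu, log_sigma = 0.0, 0.0             # variational parameters
S, lr = 64, 0.005                    # Monte Carlo samples per step, base step size
for t in range(1, 2001):
    sigma = np.exp(log_sigma)
    z = mu + sigma * rng.standard_normal(S)                  # draws from q
    log_q = -np.log(sigma) - 0.5 * ((z - mu) / sigma) ** 2   # up to a constant
    w = log_joint(z) - log_q          # the ELBO integrand
    w = w - w.mean()                  # simple baseline as a variance-reduction control variate
    score_mu = (z - mu) / sigma**2                 # d log q / d mu
    score_ls = ((z - mu) / sigma) ** 2 - 1.0       # d log q / d log_sigma
    g_mu = np.mean(score_mu * w)                   # score-function gradient estimates
    g_ls = np.mean(score_ls * w)
    step = lr * t ** -0.7                          # Robbins-Monro step-size schedule
    mu += step * g_mu
    log_sigma += step * g_ls

# For this conjugate model the exact posterior is N(sum(x)/(N+1), 1/(N+1)),
# so mu should end up near x.mean() and sigma should shrink well below 1.
```

Stochastic variational inference uses the same stochastic-optimization template but gets its noisy gradients from subsampling the data rather than (or in addition to) Monte Carlo sampling, which is what makes it scale to massive data sets.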
ASA Meetings Department