Abstract:
Variational inference is an approximate Bayesian estimation procedure in which recovering the full posterior distribution is traded for finding the Kullback-Leibler projection of the posterior onto a family of tractable approximations. This optimization problem frequently scales well with data size and model complexity, but the resulting approximation does not have the appealing interpretation of a posterior distribution, nor do standard Bayesian asymptotics such as posterior consistency or asymptotic normality apply. There is a growing body of research on the properties of variational inference in particular models, but general asymptotic results remain undeveloped. We will present a general approach, based on M-estimation theory, for understanding the asymptotics of variational point estimators, including consistency and weak convergence. Our approach has the advantage that it can be applied to a variety of models, variational families, and "local" variational approximations that do not minimize KL divergence. We demonstrate this broad applicability by constructing valid large-sample confidence sets for variational estimators in several diverse settings.
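
For readers unfamiliar with the objective the abstract refers to, a minimal sketch of the Kullback-Leibler projection follows; the notation (the family Q, the approximation q, and the posterior p(theta | x)) is an illustrative assumption, not the authors' own.

% Sketch of the KL projection described in the abstract (assumed notation).
% Minimizing KL(q || posterior) over a tractable family Q is equivalent to
% maximizing the evidence lower bound (ELBO).
\[
  \hat{q} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}}
    \mathrm{KL}\!\left( q(\theta) \,\middle\|\, p(\theta \mid x) \right)
  \;=\; \operatorname*{arg\,max}_{q \in \mathcal{Q}}
    \mathbb{E}_{q}\!\left[ \log p(x, \theta) - \log q(\theta) \right].
\]

The variational point estimator studied in the abstract can then be viewed as a functional of \(\hat{q}\) (for example, its mean), which is what places it within the scope of M-estimation theory.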