Abstract:
|
We propose a novel Riemannian geometric framework for variational inference in Bayesian models based on the nonparametric Fisher-Rao metric on the manifold of probability density functions. Under the square-root transform representation, the manifold with the Fisher-Rao metric reduces to the unit Hilbert hypersphere equipped with the standard L^2 metric. In contrast to existing approaches based on the Kullback-Leibler divergence, we approximate the posterior by the member of an appropriate class of densities that is closest to it with respect to the alpha-divergence. As a consequence, in comparison with existing methods, our procedure leads to a tighter lower bound on the marginal density of the data. Our procedure also yields an upper bound on the marginal density, which cannot be obtained from approaches based on the Kullback-Leibler divergence. We provide several examples that validate the proposed framework. In particular, we consider classification via Bayesian logistic regression on a few data sets and show that the performance of our method is comparable to that of other classification approaches.
|