
Abstract Details

Activity Number: 183 - SPEED: Bayesian Methods Student Awards
Type: Contributed
Date/Time: Monday, July 31, 2017 : 10:30 AM to 11:15 AM
Sponsor: Section on Bayesian Statistical Science
Abstract #325148
Title: A Geometric Variational Approach to Bayesian Inference
Author(s): Abhijoy Saha* and Karthik Bharath and Sebastian Kurtek
Companies: Department of Statistics, The Ohio State University and University of Nottingham and The Ohio State University
Keywords: Variational Bayes ; Alpha-divergence ; Fisher-Rao metric ; Bayesian logistic regression ; Riemannian geometry
Abstract:

We propose a novel Riemannian geometric framework for variational inference in Bayesian models based on the nonparametric Fisher-Rao metric on the manifold of probability density functions. Under the square-root transform representation, the manifold with the Fisher-Rao metric reduces to the unit Hilbert hypersphere with the standard L^2 metric. In contrast to existing approaches based on the Kullback-Leibler divergence, we approximate the posterior by the member of an appropriate class that is closest to the posterior with respect to the alpha-divergence. As a consequence, in comparison with existing methods, our procedure leads to a tighter lower bound on the marginal density of the data. Our procedure also leads to an upper bound on the marginal density, which cannot be obtained from approaches based on the Kullback-Leibler divergence. We provide several examples that validate the proposed framework. In particular, we consider classification via Bayesian logistic regression on several data sets and show that the performance of our method is comparable to other classification approaches.
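The two geometric ingredients named above are standard and can be illustrated numerically. This is a minimal sketch (not the authors' code): on a discretized grid, the square-root transform psi = sqrt(p) maps a density to the unit Hilbert sphere in L^2, the Fisher-Rao geodesic distance becomes the arc length arccos(<sqrt(p), sqrt(q)>), and one common parameterization of the alpha-divergence is D_alpha(p||q) = (1 - integral p^alpha q^(1-alpha)) / (alpha(1-alpha)), which recovers KL as alpha -> 1. Grid bounds and the example Gaussians are arbitrary choices for illustration.

```python
import numpy as np

# Discretize the real line finely enough that the Gaussians below
# integrate to ~1 on the grid.
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

p = normal_pdf(x, 0.0, 1.0)   # example density
q = normal_pdf(x, 1.0, 1.5)   # example density

# Square-root transform: representations lie on the unit sphere in L^2.
psi_p, psi_q = np.sqrt(p), np.sqrt(q)
norm_p = np.sqrt(np.trapz(psi_p ** 2, dx=dx))   # ~ 1.0

# Fisher-Rao distance = spherical arc length between the representations.
inner = np.trapz(psi_p * psi_q, dx=dx)
d_fr = np.arccos(np.clip(inner, -1.0, 1.0))     # in [0, pi/2] for densities

# Alpha-divergence (Amari-type parameterization, an assumption here;
# the paper's exact convention may differ).
def alpha_divergence(p, q, alpha, dx):
    return (1.0 - np.trapz(p ** alpha * q ** (1.0 - alpha), dx=dx)) / (
        alpha * (1.0 - alpha)
    )

d_alpha = alpha_divergence(p, q, alpha=0.5, dx=dx)
```

Note that at alpha = 0.5 the divergence is symmetric in p and q and is a monotone function of the same inner product <sqrt(p), sqrt(q)> that drives the Fisher-Rao distance, which is one reason the spherical geometry is a natural fit.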


Authors who are presenting talks have a * after their name.


Copyright © American Statistical Association