
Abstract Details

Activity Number: 169 - Advanced Bayesian Topics (Part 2)
Type: Contributed
Date/Time: Tuesday, August 10, 2021, 10:00 AM to 11:50 AM (EDT)
Sponsor: Section on Bayesian Statistical Science
Abstract #318030
Title: Approximate Bayesian Computation via Classification
Author(s): Yuexi Wang* and Veronika Rockova and Tetsuya Kaji
Companies: University of Chicago
Keywords: Approximate Bayesian computation; Asymptotics; Likelihood-free inference; Posterior concentration
Abstract:

Approximate Bayesian Computation (ABC) enables statistical inference in complex models whose likelihoods are difficult to calculate but easy to simulate from. ABC yields a kernel approximation to the marginal likelihood through an accept/reject mechanism that compares summary statistics of real and simulated data. Gutmann et al. (2018) obviate the need for summary statistics by deploying classification inside ABC. We elaborate on their proposal by automating the choice of the discrepancy measure, basing it on the Kullback-Leibler (KL) divergence between the distributions of real and fake data. Theoretical results show that the rate at which the resulting ABC posterior distribution concentrates on sets containing the true parameter depends on the approximation error of the classifier. With a properly scaled exponential kernel, asymptotic normality of the posterior mean, as well as of the posterior itself, can be justified. We demonstrate the usefulness of our approach on simulated and real data.
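To make the classification-based discrepancy concrete, the following is a minimal Python sketch on a toy Gaussian location model; the simulator, the N(0, 9) prior, the use of logistic regression as the classifier, and the exp(-n * KL) kernel scaling are all illustrative assumptions, not the authors' implementation. The classifier's average log-odds on the real sample serves as an estimate of the KL divergence between real and fake data, which then weights each prior draw.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n = 200                                          # sample size (assumed)
theta_true = 1.0                                 # toy ground truth (assumed)
x_real = rng.normal(theta_true, 1.0, size=n)     # stand-in for the observed data

def simulate(theta, size):
    # Toy simulator: N(theta, 1). In practice this is the intractable model.
    return rng.normal(theta, 1.0, size=size)

def kl_hat(x_obs, x_fake):
    # Estimate KL(real || fake) via the log-odds of a binary classifier
    # trained to separate real (label 1) from simulated (label 0) data.
    X = np.concatenate([x_obs, x_fake]).reshape(-1, 1)
    y = np.concatenate([np.ones(len(x_obs)), np.zeros(len(x_fake))])
    clf = LogisticRegression().fit(X, y)
    p = clf.predict_proba(x_obs.reshape(-1, 1))[:, 1]
    return np.mean(np.log(p / (1.0 - p)))        # average log-odds on real data

# Exponential-kernel ABC: weight each prior draw by exp(-n * estimated KL).
thetas = rng.normal(0.0, 3.0, size=1000)         # draws from an assumed N(0, 9) prior
weights = np.array([np.exp(-n * max(kl_hat(x_real, simulate(t, n)), 0.0))
                    for t in thetas])
weights /= weights.sum()

post_mean = np.sum(weights * thetas)
print(f"approximate ABC posterior mean: {post_mean:.3f}")

In this sketch the accept/reject step is replaced by importance weights from the exponential kernel; a hard-threshold version would simply keep the draws whose estimated KL falls below a tolerance.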


Authors who are presenting talks have a * after their name.
