
Abstract Details

Activity Number: 427
Type: Contributed
Date/Time: Tuesday, August 2, 2016, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Abstract #319437
Title: MCMC Diagnostics Based on Kullback Leibler Divergence and Smoothing Methods
Author(s): Anand Dixit* and Vivekananda Roy
Companies: Iowa State University and Iowa State University
Keywords: Adaptive kernel density estimation; Kullback-Leibler divergence; Markov chain Monte Carlo
Abstract:

In order to simulate observations from an analytically intractable probability distribution (the target distribution), researchers commonly use Markov chain Monte Carlo (MCMC) samplers. Several MCMC convergence diagnostic tools have been proposed in the literature to check whether the sample drawn by an MCMC algorithm is indeed from the target distribution. In this article we propose two MCMC convergence diagnostic tools based on Kullback-Leibler divergence and smoothing methods. When the target distribution is multimodal and the MCMC sampler gets stuck in one of the modes, existing convergence diagnostic tools may falsely detect convergence. Because one of our tools also utilizes the target distribution, it can correctly indicate divergence in such situations. We also propose a graphical visualization tool that complements the application of one of the tools. The usefulness of the proposed tools is illustrated using a multimodal mixture of bivariate normals target distribution and a Bayesian logit model example.
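
As a rough illustration of the general idea only (not the authors' actual diagnostic, whose details are not given in the abstract), the sketch below smooths the output of a toy random-walk Metropolis sampler with a kernel density estimate and computes a Monte Carlo estimate of the Kullback-Leibler divergence between the known bimodal target and that estimate. The helper names (target_pdf, rw_metropolis) are hypothetical, and scipy's fixed-bandwidth gaussian_kde stands in for the adaptive kernel density estimation named in the keywords.

```python
# Hedged sketch: illustrates the general idea of a KL-divergence / smoothing
# diagnostic on a bimodal mixture of bivariate normals; not the paper's method.
import numpy as np
from scipy.stats import gaussian_kde, multivariate_normal

rng = np.random.default_rng(0)

# Target: equal-weight mixture of two well-separated bivariate normals.
comp1 = multivariate_normal([-3.0, -3.0], np.eye(2))
comp2 = multivariate_normal([3.0, 3.0], np.eye(2))
def target_pdf(x):
    return 0.5 * comp1.pdf(x) + 0.5 * comp2.pdf(x)

# Random-walk Metropolis with a small step, so it tends to stay in one mode.
def rw_metropolis(n_iter, start, step=0.5):
    chain = np.empty((n_iter, 2))
    x = np.asarray(start, dtype=float)
    px = target_pdf(x)
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(2)
        pp = target_pdf(prop)
        if rng.random() < pp / px:
            x, px = prop, pp
        chain[i] = x
    return chain

chain = rw_metropolis(5000, start=[-3.0, -3.0])

# Smooth the MCMC output with a kernel density estimate.
kde = gaussian_kde(chain.T)

# Monte Carlo estimate of KL(target || KDE) using draws from the target itself.
ref = np.vstack([comp1.rvs(2000, random_state=1), comp2.rvs(2000, random_state=2)])
p = target_pdf(ref)
q = np.maximum(kde(ref.T), 1e-300)   # guard against log(0)
kl_est = np.mean(np.log(p / q))

print(f"Estimated KL(target || KDE of chain) = {kl_est:.3f}")
# A chain stuck in one mode misses half the target mass, so this estimate stays
# far from zero; a chain mixing over both modes drives it toward zero.
```

Because the reference draws come from the target density itself, a sampler trapped in a single mode cannot "fool" this kind of comparison the way purely sample-based diagnostics can be fooled, which is the situation the abstract highlights.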


Authors who are presenting talks have a * after their name.

