
Abstract Details

Activity Number: 367
Type: Contributed
Date/Time: Tuesday, August 2, 2016, 10:30 AM to 12:20 PM
Sponsor: Section on Bayesian Statistical Science
Abstract #318985
Title: Bayesian Variable Selection by Cross Validation, DIC, and Marginal Likelihood: A Comparative Study
Author(s): Arnab Maity* and Sanjib Basu and Santu Ghosh
Companies: Northern Illinois University and Northern Illinois University and Georgia Regents University
Keywords: LPML; DIC; HPM; Variable Selection

Several popular Bayesian model selection criteria, such as the Log Pseudo Marginal Likelihood (LPML) based on conditional predictive ordinates (CPO), rest on the general notion of leave-one-out cross-validation. The Deviance Information Criterion (DIC) is another popular predictive model selection criterion. These criteria can often be estimated from Markov chain samples with reasonable ease. In this article, through extensive simulation studies for linear, logistic, and survival models, we show that LPML and DIC perform poorly at selecting the data-generating model. The poor performance persists even with a large number of observations. We provide a theoretical explanation of this poor performance in the context of linear regression models. We further find that the Highest Posterior Model (HPM), or highest marginal likelihood, approach to Bayesian variable selection performs substantially better.
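The criteria named above can be illustrated concretely. The following is a minimal sketch, not the authors' code, of estimating LPML (via CPO) and DIC from posterior draws for a toy normal-mean model with known variance; the model, prior, and all variable names are illustrative assumptions, and the abstract's linear, logistic, and survival settings would substitute their own likelihoods.

```python
# Hedged sketch: LPML via CPO, and DIC, from Monte Carlo posterior draws.
# Toy model (an assumption for illustration): y_i ~ N(mu, 1), conjugate
# normal prior on mu, so the posterior is available in closed form and
# "MCMC draws" can be simulated directly.
import numpy as np

rng = np.random.default_rng(0)

n = 50
y = rng.normal(2.0, 1.0, size=n)          # simulated data, true mu = 2

# Conjugate posterior for mu under a N(0, 10^2) prior with sigma^2 = 1
prior_var, sigma2 = 100.0, 1.0
post_var = 1.0 / (1.0 / prior_var + n / sigma2)
post_mean = post_var * (y.sum() / sigma2)
mu_draws = rng.normal(post_mean, np.sqrt(post_var), size=5000)  # M draws

# Pointwise likelihoods p(y_i | mu_m), shape (M, n)
lik = np.exp(-0.5 * (y[None, :] - mu_draws[:, None]) ** 2) / np.sqrt(
    2 * np.pi * sigma2
)

# CPO_i is the harmonic mean of the pointwise likelihoods over draws;
# LPML is the sum of log CPO_i (larger is better)
cpo = 1.0 / np.mean(1.0 / lik, axis=0)
lpml = np.log(cpo).sum()

# DIC = 2 * (mean deviance) - (deviance at the posterior mean),
# where deviance D = -2 log p(y | mu) (smaller is better)
dev = -2.0 * np.log(lik).sum(axis=1)      # deviance at each draw
dev_at_mean = -2.0 * np.log(
    np.exp(-0.5 * (y - mu_draws.mean()) ** 2) / np.sqrt(2 * np.pi * sigma2)
).sum()
dic = 2.0 * dev.mean() - dev_at_mean

print(f"LPML = {lpml:.2f}, DIC = {dic:.2f}")
```

Both quantities are computed from the same matrix of pointwise likelihoods, which is why they are cheap to obtain from existing Markov chain output; the abstract's point is that this convenience does not translate into reliable selection of the data-generating model.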

Authors who are presenting talks have a * after their name.


Copyright © American Statistical Association