Abstract Details

Activity Number: 572 - Sparsity and Variable Selection in Posterior Inference
Type: Contributed
Date/Time: Wednesday, July 31, 2019, 2:00 PM to 3:50 PM
Sponsor: Section on Bayesian Statistical Science
Abstract #: 306927
Title: Revisiting High-Dimensional Bayesian Model Selection for Gaussian Regression
Author(s): Zikun Yang* and Andrew Womack
Companies: Indiana University Bloomington and Indiana University
Keywords: High-dimensional; Linear regression; Model selection; Zellner-Siow; Poisson
Abstract:

Model selection for regression problems with an increasing number of covariates continues to be an important problem, both theoretically and in applications. Model selection consistency and mean structure reconstruction depend on the interplay between the Bayes factor learning rate and the penalization on model complexity. In this work, we present results for the Zellner-Siow prior for regression coefficients paired with a Poisson prior for model complexity. We show that model selection consistency restricts the dimension of the true model from increasing too quickly. Further, we show that the additional contribution to the mean structure from new covariates must be large enough to overcome the complexity penalty. The average Bayes factors for different sets of models involve random variables over the choices of columns from the design matrix. We show that a large class of these random variables have no moments asymptotically and need to be analyzed using stable laws. We derive the domain of attraction for these random variables and obtain conditions on the design matrix that provide for the control of false discoveries.
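
For context, the Zellner-Siow prior is commonly written as a scale mixture of g-priors with an inverse-gamma mixing distribution, and a Poisson prior on model complexity penalizes model size through a factorial term. The abstract does not state the exact hyperparameters used in the paper, so the following is only a minimal LaTeX sketch of the standard formulation, with the rate parameter \lambda and the mixing parameters shown for illustration:

% Sketch of the usual Zellner-Siow / Poisson-complexity prior structure;
% hyperparameter choices here are illustrative, not taken from the paper.
\begin{align*}
  y \mid \beta_\gamma, \sigma^2, \gamma
      &\sim \mathrm{N}\!\left(X_\gamma \beta_\gamma,\; \sigma^2 I_n\right), \\
  \beta_\gamma \mid g, \sigma^2, \gamma
      &\sim \mathrm{N}\!\left(0,\; g\,\sigma^2 \left(X_\gamma^\top X_\gamma\right)^{-1}\right),
      \qquad g \sim \mathrm{Inv\text{-}Gamma}\!\left(\tfrac{1}{2}, \tfrac{n}{2}\right), \\
  \pi(\gamma) &\propto \frac{\lambda^{|\gamma|}}{|\gamma|!},
      \qquad |\gamma| = \text{number of covariates included in model } \gamma .
\end{align*}

The factorial in the Poisson-type model prior is what supplies the complexity penalty that new covariates must overcome, while the inverse-gamma mixture over g gives the heavy-tailed (Cauchy-like) marginal prior on the coefficients associated with the Zellner-Siow construction.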


Authors who are presenting talks have a * after their name.
