
Abstract Details

Activity Number: 248
Type: Contributed
Date/Time: Monday, August 1, 2016, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #318795
Title: High-Dimensional Linear Regression via the R2-D2 Shrinkage Prior
Author(s): Yan Zhang* and Brian J. Reich and Howard Bondell
Companies: North Carolina State University (all authors)
Keywords: global-local shrinkage; marginal density; Kullback-Leibler efficiency; linear regression; Horseshoe; Dirichlet-Laplace

We propose a new class of priors for linear regression, the R-square induced Dirichlet Decomposition (R2-D2) prior. The prior is induced by placing a Beta prior on the coefficient of determination; the resulting total prior variance of the regression coefficients is then decomposed across coefficients through a Dirichlet prior. We demonstrate, both theoretically and empirically, the advantages of the R2-D2 prior over a number of common shrinkage priors, including the Horseshoe, Horseshoe+, and Dirichlet-Laplace priors. Among these, the R2-D2 prior has both the fastest concentration rate around zero and the heaviest tails, properties established via its marginal density, a Meijer G-function. We show that its Bayes estimator converges to the truth at a Kullback-Leibler super-efficient rate, attaining a sharper information-theoretic bound than existing common shrinkage priors, and that it yields a consistent posterior. The R2-D2 prior also admits a straightforward Gibbs sampler, making it computationally tractable. We illustrate the proposed prior in a mouse gene expression application.
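The construction described above can be sketched as a simple prior simulation: a Beta draw for R-squared induces a global variance, which a Dirichlet draw splits across coefficients. This is only an illustrative sketch based on the abstract's description; the function name, hyperparameters (`a`, `b`, `a_pi`), and the exponential local-scale mixing are assumptions, not the authors' exact hierarchy.

```python
import numpy as np

rng = np.random.default_rng(0)

def r2d2_prior_draw(p, a=1.0, b=1.0, a_pi=0.5):
    """One hypothetical draw of p regression coefficients from an
    R2-D2-style construction (illustrative only)."""
    r2 = rng.beta(a, b)                      # Beta prior on the coefficient of determination
    omega = r2 / (1.0 - r2)                  # induced global (total) prior variance scale
    phi = rng.dirichlet(np.full(p, a_pi))    # Dirichlet decomposition of the total variance
    psi = rng.exponential(scale=2.0, size=p) # assumed Exp(1/2) local mixing for heavy tails
    beta = rng.normal(0.0, np.sqrt(psi * phi * omega / 2.0))
    return beta, phi, omega

beta, phi, omega = r2d2_prior_draw(p=10)
```

Because `phi` sums to one, the Dirichlet concentration parameter `a_pi` controls how evenly the global variance `omega` is shared: small values concentrate variance on a few coefficients, producing the sparse, heavy-tailed shrinkage behavior the abstract describes.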

Authors who are presenting talks have a * after their name.


Copyright © American Statistical Association