Abstract Details

Activity Number: 346
Type: Topic Contributed
Date/Time: Tuesday, August 6, 2013, 10:30 AM to 12:20 PM
Sponsor: International Society for Bayesian Analysis (ISBA)
Abstract: #308891
Title: Bayesian Shrinkage in High Dimensions
Author(s): Anirban Bhattacharya*+
Companies: Duke University
Keywords: Bayesian; concentration; convergence rate; high-dimensional; regularization; shrinkage

Abstract:
Penalized regression methods, such as L1 regularization, are routinely used in high-dimensional applications, and there is a rich literature on their optimality properties under sparsity assumptions. In the Bayesian paradigm, sparsity is routinely induced through two-component mixture priors with a probability mass at zero, but such priors encounter daunting computational problems in high dimensions. This has motivated a wide variety of continuous shrinkage priors, which can be expressed as global-local scale mixtures of Gaussians, facilitating computation. In sharp contrast to the corresponding frequentist literature, very little is known about the properties of such priors. Focusing on a broad class of shrinkage priors, we provide precise results on prior and posterior concentration. We demonstrate that many commonly used shrinkage priors, including the Bayesian Lasso, are suboptimal in high-dimensional settings. A new class of Dirichlet-Laplace (DL) priors is proposed, which possesses optimal concentration and leads to efficient posterior computation. Operating characteristics of the proposed prior are illustrated in a variety of sparse-recovery problems.
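The global-local scale-mixture form mentioned in the abstract can be made concrete. The sketch below draws a coefficient vector from a Dirichlet-Laplace(a) prior using its commonly cited Gaussian scale-mixture representation, theta_j ~ N(0, psi_j * phi_j^2 * tau^2) with psi_j ~ Exp(1/2), phi ~ Dirichlet(a, ..., a), and tau ~ Gamma(p*a, 1/2); this is an illustrative sketch assuming that representation, not code from the talk, and the function name `sample_dl_prior` is ours.

```python
import numpy as np

def sample_dl_prior(p, a=0.5, rng=None):
    """Draw one p-dimensional coefficient vector from a
    Dirichlet-Laplace(a) prior via its global-local Gaussian
    scale-mixture representation (an assumed parameterization):
      theta_j | psi, phi, tau ~ N(0, psi_j * phi_j**2 * tau**2)
      psi_j ~ Exp(rate 1/2), phi ~ Dirichlet(a,...,a),
      tau ~ Gamma(shape p*a, rate 1/2).
    """
    rng = np.random.default_rng(rng)
    psi = rng.exponential(scale=2.0, size=p)   # Exp with rate 1/2 (mean 2)
    phi = rng.dirichlet(np.full(p, a))         # local scales on the simplex
    tau = rng.gamma(shape=p * a, scale=2.0)    # global scale
    return rng.normal(0.0, np.sqrt(psi) * phi * tau)

theta = sample_dl_prior(p=1000, a=0.5, rng=0)
```

With a small Dirichlet concentration `a`, most of the simplex weight `phi` collapses onto a few coordinates, so a typical draw has many near-zero entries and a handful of large ones, which is the sparsity-favoring behavior the abstract attributes to DL priors.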
Authors who are presenting talks have a * after their name.
Copyright © American Statistical Association.