Abstract:
|
In the past decade, many Bayesian shrinkage models have been developed for linear regression problems where the number of covariates p is large. Computing the resulting intractable posteriors is often done with three-block Gibbs samplers (3BG). An alternative computing tool is the state-of-the-art Hamiltonian Monte Carlo (HMC) method, which can be easily implemented in the Stan software. However, we found both methods to be inefficient and often impractical for large p problems. In this paper, we propose two-block Gibbs samplers (2BG) for the Bayesian group lasso, the Bayesian sparse group lasso and the Bayesian fused lasso models. We demonstrate with simulated and real data sets that the Markov chains underlying 2BG converge much faster than those of 3BG, and no slower than those of HMC. At the same time, the computing cost per iteration of 2BG is as low as that of 3BG, and can be several orders of magnitude lower than that of HMC. As a result, the newly proposed 2BG is the only practical computing solution for performing Bayesian shrinkage analysis on datasets with large p. Further, we provide theoretical justifications for the superior performance of the 2BG.
|