
Abstract Details

Activity Number: 572 - Sparsity and Variable Selection in Posterior Inference
Type: Contributed
Date/Time: Wednesday, July 31, 2019 : 2:00 PM to 3:50 PM
Sponsor: Section on Bayesian Statistical Science
Abstract #304464 Presentation
Title: A Fully-Bayesian Approach to Sparse Reduced-Rank Multivariate Regression
Author(s): Dunfu Yang* and Gyuhyeong Goh and Haiyan Wang
Companies: Kansas State University
Keywords: Fully-Bayesian; Low-rank; Multivariate regression; Rank reduction; Sparsity; Spike-and-slab prior

Sparse and low-rank matrix estimation plays a key role in modern multivariate regression analysis. The low-rank structure not only reveals hidden interrelations among the response variables but also improves prediction accuracy, while sparsity eliminates irrelevant predictors. In high-dimensional multivariate regression, sparse reduced-rank regression (SRRR) provides an effective means of imposing both the sparsity and the low-rank constraint on the coefficient matrix. Although there is extensive research on SRRR, statistical inference procedures for a sparse and low-rank coefficient matrix are still limited. To fill this research gap, we develop a fully-Bayesian approach to SRRR using spike-and-slab priors. However, because the dimension of the parameter space changes with the rank and the sparsity pattern, traditional MCMC methods such as the Gibbs sampler and the Metropolis–Hastings algorithm are inapplicable in our Bayesian framework. To address this issue, we introduce a new posterior computation procedure based on a collapsed Gibbs sampler and a Laplace approximation. A key feature of the proposed method is that the unknown rank and sparsity level are estimated automatically by the proposed MCMC computation.
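The coefficient structure the abstract describes can be sketched numerically. The sketch below is illustrative only: the dimensions, the factorization C = B A', and the noise level are assumptions chosen for the example, not values from the paper, and it shows only the data-generating model, not the authors' collapsed Gibbs / Laplace posterior computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) dimensions:
n, p, q, r = 100, 20, 8, 2   # samples, predictors, responses, true rank
s = 5                         # number of relevant predictors (row-sparsity)

# Row-sparse factor B (p x r): only the first s predictors are active.
B = np.zeros((p, r))
B[:s, :] = rng.normal(size=(s, r))
A = rng.normal(size=(q, r))   # dense factor (q x r)

# The coefficient matrix C = B A' is simultaneously low-rank and row-sparse:
# rows s..p-1 are exactly zero (irrelevant predictors), and rank(C) = r < q.
C = B @ A.T
assert np.linalg.matrix_rank(C) == r
assert np.all(C[s:, :] == 0)

# Simulate multivariate regression data Y = X C + E.
X = rng.normal(size=(n, p))
E = rng.normal(scale=0.1, size=(n, q))
Y = X @ C + E

# Ordinary least squares ignores both structures: its estimate is dense
# and generically of full rank q, which motivates SRRR-type shrinkage.
C_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
print(np.linalg.matrix_rank(C_ols))
```

In a fully-Bayesian treatment along the lines of the abstract, spike-and-slab priors on the rows of B would drive irrelevant rows to the spike (exact zero), while the factor dimension r would be inferred rather than fixed as it is here.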

Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program