Abstract:
|
Sparse and low-rank matrix estimation plays a key role in modern multivariate regression analysis. The low-rank structure not only reveals hidden interrelations among the response variables but also improves prediction accuracy, while sparsity is needed to eliminate irrelevant predictors. In high-dimensional multivariate regression, sparse reduced-rank regression (SRRR) provides an effective means of imposing both sparsity and a low-rank constraint on the coefficient matrix. Although there is extensive research on SRRR, statistical inference procedures for a sparse and low-rank coefficient matrix remain limited. To fill this gap, we develop a fully Bayesian approach to SRRR based on spike-and-slab priors. However, because the model dimension changes across the parameter space, traditional MCMC methods such as the Gibbs sampler and the Metropolis–Hastings algorithm are inapplicable in our Bayesian framework. To address this issue, we introduce a new posterior computation procedure based on a collapsed Gibbs sampler and a Laplace approximation. A key feature of the proposed method is that the unknown rank and sparsity pattern are estimated automatically by the proposed MCMC computation.
|