Abstract:
In regression models where the number of predictors far exceeds the number of samples, accurately estimating the regression coefficients is a challenge. The standard solution is to regularize the coefficients using Bayesian priors or classical penalties. Another approach is to project these high-dimensional coefficients onto a much lower-dimensional basis, reducing the number of parameters to estimate. To this end, we have developed a nonparametric Bayesian reduced rank regression (RRR) model that does not require the rank of the linear subspace to be pre-specified. Our model diverges from the canonical RRR model in the prior distributions placed on the matrices comprising the regression coefficients; in particular, one of these matrices is drawn from the Indian Buffet Process (IBP). The IBP prior enables the rank of the matrix to be estimated directly from the data. Because this matrix is binary, its elements may be interpreted as posterior probabilities of association between individual response and predictor variables. We show in simulations that our model effectively shares strength across correlated responses, improving the statistical power to detect associations.
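To make the IBP mechanism concrete, the following is a minimal numpy sketch, not the authors' implementation: sample_ibp, the concentration parameter alpha, and the factorization B = A Z' shown afterward are illustrative assumptions. The draw itself determines the number of columns of the binary matrix Z, which is how the rank can be inferred rather than pre-specified.

```python
import numpy as np

def sample_ibp(n_rows, alpha, rng=None):
    """Draw a binary matrix Z from the Indian Buffet Process.

    Rows index responses (customers); columns index latent factors
    (dishes). The number of columns -- the effective rank -- is
    determined by the draw rather than fixed in advance.
    """
    rng = np.random.default_rng() if rng is None else rng
    # First customer takes Poisson(alpha) new dishes.
    Z = np.ones((1, rng.poisson(alpha)), dtype=int)
    for i in range(2, n_rows + 1):
        # Take existing dish k with probability m_k / i, where m_k is
        # the number of previous customers who took dish k.
        m = Z.sum(axis=0)
        old = (rng.random(Z.shape[1]) < m / i).astype(int)
        Z = np.vstack([Z, old[None, :]])
        # Then take Poisson(alpha / i) brand-new dishes.
        n_new = rng.poisson(alpha / i)
        if n_new:
            Z = np.hstack([Z, np.zeros((Z.shape[0], n_new), dtype=int)])
            Z[-1, -n_new:] = 1
    return Z

# Illustrative reduced-rank coefficient construction (one plausible
# parameterization, not necessarily the paper's exact model):
rng = np.random.default_rng(0)
q, p = 10, 500                          # responses, predictors
Z = sample_ibp(q, alpha=2.0, rng=rng)   # q x K binary; K is random
A = rng.normal(size=(p, Z.shape[1]))    # p x K continuous loadings
B = A @ Z.T                             # p x q coefficients, rank <= K
```

Because Z is binary, each entry acts as an indicator that a response loads on a given latent factor; averaging such indicators over posterior samples yields the association probabilities described in the abstract.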