Abstract:
|
Most Bayesian variable selection methods and algorithms for linear regression have focused on normal errors, a venerable problem in its own right, particularly when the number of variables exceeds the sample size. Since estimates obtained under the normality assumption can be sensitive to outliers, robustifying the error distribution is of interest, especially in high dimensions, where standard model diagnostics do not work well. The Bayesian variable selection approach can handle an unknown degree of sparsity by placing a prior on the inclusion probabilities of the variables. In this work, we allow additional flexibility by letting the likelihood depend on an unknown degree of tail heaviness. In particular, we develop Bayesian models with hyperbolic error distributions that incorporate variable selection, and compare them with Bayesian models with Student-t errors.
|