Abstract:
|
In a regression problem where the number of covariates exceeds the number of observations, one typically uses penalized methods such as the lasso, which yields a sparse solution, or ridge regression. However, these methods capture only linear relationships between the response variable and the covariates. To explore a richer class of models, one can instead search for a sparse solution in a Reproducing Kernel Hilbert Space (RKHS). This can be done in a hierarchical Bayes setup using the Relevance Vector Machine (RVM). Typically, the hyperparameters in the RVM are estimated using a type-II maximum likelihood approach. In this article, we instead assign proper priors to the hyperparameters and estimate them using a Gibbs sampler. We also prove that the Gibbs sampler converges at a geometric rate by establishing drift and minorization conditions. Further, the kernel parameters of the RVM are estimated using an empirical Bayes approach. Lastly, the methodology developed in this article is illustrated on both simulated and real-life data examples.
|