Abstract:
|
Nonlinear kernel machine regression models are widely used in statistics and machine learning because they add flexibility to the estimation of regression or classification functions and often improve predictive performance. However, variable selection for kernel regression models remains challenging. Kernel machine models map the covariates into a kernel space and build the model there; unlike in the linear regression setting, there is therefore no clear notion of an effect size for the regression coefficients. In this paper, we develop a novel framework that provides an analog of the effect size of each explanatory variable for Bayesian kernel regression and classification models. Our methodology is built on a hierarchical Bayesian kernel regression model based on the random Fourier expansion. The proposed model represents a computationally efficient class of Bayesian approximate kernel regression models for nonlinear regression when the response is a correlated vector. We demonstrate the usefulness of the model through several simulation studies and two real-data applications.
|
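The random Fourier expansion mentioned in the abstract refers to the standard random-feature approximation of a shift-invariant kernel (Rahimi–Recht style). The sketch below is not the authors' implementation; it is a minimal NumPy illustration, assuming a Gaussian (RBF) kernel with bandwidth `sigma`, showing how inner products of random Fourier features approximate exact kernel evaluations:

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 3, 5000, 1.0  # input dim, number of random features, kernel bandwidth

# Sample the random frequencies and phases once; reuse them for all inputs.
W = rng.normal(scale=1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(X):
    """Map rows of X to random Fourier features z(x) = sqrt(2/D) cos(x W + b).

    Then z(x) @ z(y) is an unbiased Monte Carlo estimate of the Gaussian
    kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    """
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

x = rng.normal(size=(1, d))
y = rng.normal(size=(1, d))
approx = float(rff(x) @ rff(y).T)
exact = float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))
```

Working in this finite-dimensional feature space (rather than with the implicit kernel map) is what makes an explicit, coefficient-based notion of effect size recoverable in the approximate kernel model.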