Abstract:
|
We describe flexible versions of generalized linear and generalized linear mixed models incorporating basis functions formed by a deep feedforward neural network (DFNN). Efficient computational methods for high-dimensional Bayesian inference are developed using Gaussian variational approximation, with a parsimonious but flexible factor parametrization of the covariance matrix. We implement natural gradient methods for the optimization, exploiting the factor structure of the variational covariance matrix in the computation of the natural gradient. This leads to a regression and classification method with high predictive accuracy that is able to quantify prediction uncertainty in a principled and convenient way. We also describe how to perform variable selection in our deep learning method. The proposed methods are illustrated in a wide range of simulated and real-data examples, and the results compare favourably to a state-of-the-art flexible regression and classification method from the statistical literature, the Bayesian additive regression trees (BART) method.
|