Abstract:
|
Bayesian additive regression trees (BART) are a popular method for inducing a nonparametric prior over regression functions, and they have seen use in a wide array of applied settings. BART models are appealing due to their flexibility, modularity, and computational tractability. In this talk I will describe a number of extensions to existing "vanilla" BART models. These include log-linear models for counts, classification, and heteroscedastic regression; models where the regression function parameterizes the distribution of a latent variable; and models where latent variables appear as inputs to the regression function itself.
|