Abstract Details

Activity Number: 98 - New Developments in Bayesian Additive Regression Trees
Type: Invited
Date/Time: Monday, July 30, 2018 : 8:30 AM to 10:20 AM
Sponsor: Section on Bayesian Statistical Science
Abstract: #326489
Title: Bayesian Regression Tree Ensembles That Adapt to Smoothness and Sparsity
Author(s): Antonio Ricardo Linero* and Yun Yang
Companies: Florida State University and Florida State University
Keywords: Nonparametric regression; High dimensional; Bayesian nonparametrics; Posterior consistency; Decision trees; Boosting
Abstract:

Ensembles of decision trees are a useful tool for obtaining flexible estimates of regression functions. Examples of these methods include gradient boosted decision trees, random forests, and Bayesian CART. Two potential shortcomings of tree ensembles are their lack of smoothness and vulnerability to the curse of dimensionality. We show that these issues can be overcome by instead considering sparsity-inducing soft decision trees in which the decisions are treated as probabilistic. We implement this in the context of the Bayesian additive regression trees (BART) framework, and illustrate its promising performance through testing on benchmark datasets. We provide strong theoretical support for our methodology by showing that the posterior distribution concentrates at the minimax rate (up to a logarithmic factor) for sparse functions and functions with additive structures in the high-dimensional regime, where the dimensionality of the covariate space is allowed to grow nearly exponentially in the sample size. Our method also adapts to the unknown smoothness and sparsity levels, and can be implemented by making minimal modifications to existing BART algorithms.
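A minimal sketch of the soft-decision idea described in the abstract. The logistic gating function, the bandwidth parameter, and the dictionary-based tree structure below are illustrative assumptions, not the authors' implementation: each internal node routes an observation left with some probability rather than deterministically, and a leaf's contribution is weighted by the product of routing probabilities along its path, so the prediction varies smoothly in x.

import numpy as np

def gate(x_j, cutpoint, bandwidth):
    # Probability of routing left at a node that splits on coordinate x_j.
    # As bandwidth -> 0 this approaches a hard decision (ordinary CART split).
    return 1.0 / (1.0 + np.exp((x_j - cutpoint) / bandwidth))

def soft_tree_predict(x, node):
    # node is either {"leaf": mu} or
    # {"var": j, "cut": c, "bandwidth": tau, "left": subtree, "right": subtree}
    if "leaf" in node:
        return node["leaf"]
    p_left = gate(x[node["var"]], node["cut"], node["bandwidth"])
    return (p_left * soft_tree_predict(x, node["left"])
            + (1.0 - p_left) * soft_tree_predict(x, node["right"]))

def ensemble_predict(x, trees):
    # An additive ensemble sums the contributions of many such soft trees,
    # as in the BART framework referenced above.
    return sum(soft_tree_predict(x, t) for t in trees)

In this sketch, sparsity would enter through the choice of splitting coordinates (for example, a prior that concentrates splits on few covariates); that component is not shown here.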


Authors who are presenting talks have a * after their name.