
Abstract Details

Activity Number: 529 - Regression Trees and Random Forests
Type: Contributed
Date/Time: Wednesday, August 1, 2018 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #329476 Presentation
Title: Locally Linear Forests: Leveraging Smoothness with Random Forests
Author(s): Rina Friedberg* and Julie Tibshirani and Susan Athey and Stefan Wager
Companies: Stanford University and Palantir Technologies and Stanford University and Stanford University
Keywords: Random Forests; Heterogeneous Causal Inference; Local Linear Regression
Abstract:

Random forests (Breiman 2001) are a powerful and ubiquitous predictive method, increasingly considered as an adaptive approach for non-parametric statistical estimation. A key limitation of Breiman's forests, however, is that they cannot take advantage of smoothness properties of the underlying signal. This leads to unstable estimates and noticeably suboptimal prediction in the presence of strong linear effects. Local linear regression can excel in this setting, but traditionally employs non-adaptive kernels and is subject to an acute curse of dimensionality. Drawing on the strengths of these techniques, we introduce locally linear forests, which use random forests to motivate a data-adaptive kernel that can then be plugged into a locally weighted regression. In our experiments, we find that locally linear forests improve upon traditional regression forests in the presence of strong, smooth effects; we also show how assuming the presence of smoothness can give powerful asymptotic convergence results. Finally, we highlight our procedure's usefulness in the burgeoning field of heterogeneous causal inference, where it presents a compelling new theoretical and practical development.
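The abstract describes a two-step idea: derive a data-adaptive kernel from a fitted random forest, then plug those weights into a locally weighted linear regression. The sketch below is a hypothetical illustration of that idea, not the authors' implementation: it weights each training point by how often it shares a leaf with the test point (a common forest-kernel construction), then solves a weighted, lightly ridge-regularized local linear fit centered at the test point. All names (`forest_weights`, `llf_predict`) and the ridge term are illustrative assumptions.

```python
# Illustrative sketch of a forest-kernel local linear prediction.
# Assumptions (not from the abstract): leaf co-membership weights,
# a small ridge penalty on the slopes, and sklearn as the forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, p = 500, 5
X = rng.uniform(-1, 1, size=(n, p))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=n)  # strong linear signal

forest = RandomForestRegressor(n_estimators=100, min_samples_leaf=10,
                               random_state=0)
forest.fit(X, y)

def forest_weights(forest, X_train, x0):
    """Data-adaptive kernel: weight training point i by how often it
    falls in the same leaf as x0, normalized by leaf size per tree."""
    train_leaves = forest.apply(X_train)             # shape (n, n_trees)
    x0_leaves = forest.apply(x0.reshape(1, -1))[0]   # shape (n_trees,)
    co = (train_leaves == x0_leaves)                 # leaf co-membership
    leaf_sizes = co.sum(axis=0)                      # training points per leaf
    return (co / np.maximum(leaf_sizes, 1)).mean(axis=1)

def llf_predict(forest, X_train, y_train, x0, ridge=1e-3):
    """Weighted local linear regression with the forest kernel;
    the fitted intercept is the prediction at x0."""
    w = forest_weights(forest, X_train, x0)
    A = np.hstack([np.ones((len(X_train), 1)), X_train - x0])
    W = np.diag(w)
    reg = ridge * np.eye(A.shape[1])
    reg[0, 0] = 0.0                                  # don't penalize intercept
    theta = np.linalg.solve(A.T @ W @ A + reg, A.T @ W @ y_train)
    return theta[0]

print(llf_predict(forest, X, y, np.zeros(p)))        # true value is 0.0
```

Because the fit is linear in a neighborhood chosen adaptively by the forest, it can track strong linear effects that a plain regression forest would approximate with a staircase of constants.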


Authors who are presenting talks have a * after their name.
