
Abstract Details

Activity Number: 529 - SPEED: Machine Learning
Type: Contributed
Date/Time: Wednesday, August 2, 2017, 10:30 AM to 11:15 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #325116
Title: Regression-Enhanced Random Forests
Author(s): Haozhe Zhang* and Dan Nettleton and Zhengyuan Zhu
Companies: Iowa State University
Keywords: Machine learning; Prediction; Lasso; Random forests; Extrapolation
Abstract:

Random forest (RF) methodology is one of the most popular machine learning techniques for prediction problems. In this article, we discuss some cases where random forests may perform poorly and propose a novel generalized RF method, called regression-enhanced random forests (RERFs), that can improve on RFs by borrowing the strength of penalized parametric regression. The algorithm for constructing RERFs and selecting their tuning parameters is described. Both simulation studies and real data examples show that RERFs have better predictive performance than RFs in important situations often encountered in practice. Moreover, RERFs can incorporate known relationships between the response and the predictors, and can give reliable predictions in extrapolation problems, where predictions are required at points outside the domain of the training dataset. Strategies analogous to those described here can be used to improve other machine learning methods by combining them with penalized parametric regression techniques.
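The abstract does not spell out the RERF algorithm, but one natural reading of "borrowing the strength of penalized parametric regression" is a two-stage fit: a Lasso regression captures a parametric trend, and a random forest is fit to the Lasso residuals. The sketch below illustrates only that general idea; the class name, tuning choices, and two-stage structure are assumptions, not the authors' exact method or tuning procedure.

import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.ensemble import RandomForestRegressor

class RegressionEnhancedRF:
    """Illustrative two-stage combination of Lasso and a random forest (assumed structure)."""

    def __init__(self, n_estimators=500, cv=5, random_state=0):
        # Stage 1: penalized parametric regression (Lasso, penalty chosen by CV).
        self.lasso = LassoCV(cv=cv, random_state=random_state)
        # Stage 2: random forest fit to the stage-1 residuals.
        self.forest = RandomForestRegressor(n_estimators=n_estimators,
                                            random_state=random_state)

    def fit(self, X, y):
        self.lasso.fit(X, y)
        residuals = y - self.lasso.predict(X)
        self.forest.fit(X, residuals)
        return self

    def predict(self, X):
        # Final prediction = parametric trend + forest correction.
        return self.lasso.predict(X) + self.forest.predict(X)

# Toy usage: a linear trend plus a nonlinear term.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 5))
y = 3 * X[:, 0] + np.sin(2 * X[:, 1]) + rng.normal(scale=0.3, size=300)
model = RegressionEnhancedRF().fit(X, y)
print(model.predict(X[:5]))

One reason such a combination can help with extrapolation, as the abstract suggests, is that a random forest alone predicts essentially constant values outside the training domain, whereas the parametric component can extend a fitted trend beyond the range of the training data.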


Authors who are presenting talks have a * after their name.
