Abstract Details

Activity Number: 434 - SPEED: Classification and Data Science
Type: Contributed
Date/Time: Tuesday, July 31, 2018, 2:00 PM to 2:45 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #332990
Title: A Modified Approach to Component-Wise Gradient Boosting for High-Dimensional Regression Models
Author(s): Brandon Butcher* and Brian J. Smith
Companies: University of Iowa and University of Iowa
Keywords: Gradient Boosting; Variable Selection; Cancer; High Dimensional; Machine Learning; Data Science
Abstract:

Penalized regression methods such as the lasso are popular tools for analyzing the high-dimensional data that arise in genetics and biomedical research more generally. An alternative approach motivated by Friedman's gradient boosting machine framework, called Component-Wise Gradient Boosting (CWGB), was developed by Bühlmann and Yu (2003) and Bühlmann (2006). We propose a modification of their method, which we call Iteratively Re-estimated Gradient Boosting (IRGB). Our modification fits the linear least squares base-learner to the current (pseudo-)residual using all predictors selected so far, rather than only the predictor selected in the current boosting iteration. A high-dimensional simulation study adapted from Hastie et al. (2007) indicates that IRGB provides prediction performance similar to or better than existing methods and tends to recover the true sparse regression model while selecting fewer spurious predictors. Additionally, IRGB takes an alternative path through the parameter space and tends to require fewer boosting iterations than CWGB. IRGB also provides a unified framework for prediction problems involving mandatory and optional predictors.
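
To make the distinction concrete, the Python sketch below contrasts classical CWGB with one plausible reading of the IRGB update described above. The componentwise selection rule, the squared-error loss, the fixed step size `nu`, the function names `cwgb`/`irgb`, and the assumption of centered, standardized predictor columns are all illustrative choices, not the authors' specification.

```python
import numpy as np

def cwgb(X, y, nu=0.1, n_iter=200):
    """Component-wise gradient boosting (Buhlmann & Yu, 2003) for
    squared-error loss: each iteration fits a simple least-squares
    base-learner to the residual for every predictor separately and
    takes a shrunken step along the single best-fitting one."""
    n, p = X.shape
    beta = np.zeros(p)
    f = np.zeros(n)
    for _ in range(n_iter):
        r = y - f                                  # current (pseudo-)residual
        b = (X.T @ r) / np.sum(X ** 2, axis=0)     # univariate OLS slopes
        rss = np.sum(r ** 2) - b * (X.T @ r)       # RSS of each univariate fit
        j = int(np.argmin(rss))                    # best single component
        beta[j] += nu * b[j]
        f += nu * b[j] * X[:, j]
    return beta

def irgb(X, y, nu=0.1, n_iter=200):
    """Sketch of the IRGB idea per the abstract: the base-learner is a
    joint least-squares fit of the residual on all predictors selected
    so far, not just the component chosen in the current iteration."""
    n, p = X.shape
    beta = np.zeros(p)
    f = np.zeros(n)
    selected = []                                  # active set, grows over iterations
    for _ in range(n_iter):
        r = y - f
        b = (X.T @ r) / np.sum(X ** 2, axis=0)
        rss = np.sum(r ** 2) - b * (X.T @ r)
        j = int(np.argmin(rss))                    # assumed: same selection rule as CWGB
        if j not in selected:
            selected.append(j)
        Xs = X[:, selected]
        bs, *_ = np.linalg.lstsq(Xs, r, rcond=None)  # re-estimate over the whole active set
        beta[selected] += nu * bs
        f += nu * (Xs @ bs)
    return beta
```

Under this reading, each IRGB step corrects all active coefficients jointly rather than nudging a single one, which is consistent with the abstract's observation that IRGB follows a different path through the parameter space and tends to need fewer boosting iterations than CWGB.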


Authors who are presenting talks have a * after their name.
