Activity Number: 453
Type: Invited
Date/Time: Wednesday, August 5, 2009, 10:30 AM to 12:20 PM
Sponsor: IMS

Abstract - #303164

Title: Decision Trees and Gradient Boosting
Author(s): Jerome H. Friedman*+
Companies: Stanford University
Address: Sequoia Hall, Stanford, CA 94305
Keywords: classification; regression; data mining; boosting; decision trees

Abstract:
Boosted decision tree models have emerged as among the most useful tools for predictive data mining (classification and regression). They are fast to compute, allowing application to very large data sets. Their accuracy is competitive with the best customized, problem-specific approaches, while being fairly automatic to use (little tuning) and highly robust, especially when applied to less-than-clean data. They also offer some interpretability of the resulting predictive model. This lecture will start with a brief description of CART decision trees, followed by an introduction to the basic concepts of gradient boosting. Issues specific to boosting decision trees are then discussed. Finally, tools are presented for interpreting and visualizing these multiple additive regression tree (MART) models.
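The central idea the abstract refers to, building an additive model in which each new tree is fit to the residuals of the current prediction (the negative gradient of squared-error loss), can be sketched as follows. This is an illustrative toy, not the method as presented in the talk: the stump learner, the data, and the parameter values are all invented for demonstration.

```python
def fit_stump(x, residuals):
    """Depth-1 regression tree: find the split on x that minimizes
    the squared error of the two leaf means."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for k in range(1, len(x)):
        split = (x[order[k - 1]] + x[order[k]]) / 2
        left = [residuals[i] for i in range(len(x)) if x[i] <= split]
        right = [residuals[i] for i in range(len(x)) if x[i] > split]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, split, lm, rm)
    _, split, lm, rm = best
    return lambda v: lm if v <= split else rm

def boost(x, y, n_trees=50, learning_rate=0.1):
    """Least-squares gradient boosting: each stump is fit to the current
    residuals, and predictions are a shrunken additive sum of stumps."""
    base = sum(y) / len(y)          # start from the mean response
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_trees):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [p + learning_rate * stump(v) for p, v in zip(pred, x)]
    return lambda v: base + learning_rate * sum(s(v) for s in stumps)

# Toy one-dimensional data: a step function the ensemble should recover.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
model = boost(x, y)
```

The small learning rate is the "shrinkage" that makes boosted trees robust in practice: each stump corrects only a fraction of the remaining error, so many weak trees accumulate into an accurate additive model.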
- The address information is for the authors who have a + after their name.
- Authors who are presenting talks have a * after their name.
Back to the full JSM 2009 program