
Abstract Details

Activity Number: 245
Type: Contributed
Date/Time: Monday, August 5, 2013, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Abstract - #310060
Title: Boosting with Fully Grown Trees
Author(s): J. Brian Gray and Jie Xu*+
Companies: University of Alabama and University of Alabama
Keywords: bagging; boosting; machine learning; predictive modeling; random forests
Abstract:

Breiman (2001) pointed out that the performance of an ensemble model depends on the strengths of the individual classifiers in the ensemble and on the correlations among them (lower correlation meaning greater diversity). The AdaBoost method uses weak learners, typically stumps or trimmed trees, as classifiers. The individual trees in an AdaBoost ensemble therefore have low strength, which limits the potential for improving ensemble performance. This article proposes a method to increase the strength of the individual classifiers by using fully grown trees fit on weighted resamples of the training data, which are then combined using the AdaBoost method. The performance of the new method compares favorably to random forests and AdaBoost.
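The abstract does not give implementation details, but the described procedure can be sketched as follows. This is a hypothetical reconstruction, not the authors' code: at each round, draw a bootstrap resample with probabilities proportional to the current AdaBoost weights, fit an unpruned (fully grown) tree to the resample, and then update the weights and classifier coefficient exactly as in discrete AdaBoost. The function names (`boost_full_trees`, `predict_ensemble`) and the choice of scikit-learn's `DecisionTreeClassifier` are assumptions for illustration.

```python
# Hypothetical sketch of "boosting with fully grown trees":
# weighted resampling + unpruned trees, combined with AdaBoost weights.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost_full_trees(X, y, n_rounds=25, rng=None):
    """Labels y must be coded as -1/+1. Returns (trees, alphas)."""
    rng = np.random.default_rng(rng)
    n = len(y)
    w = np.full(n, 1.0 / n)                  # AdaBoost sample weights
    trees, alphas = [], []
    for _ in range(n_rounds):
        # Weighted bootstrap resample of the training data
        idx = rng.choice(n, size=n, replace=True, p=w)
        tree = DecisionTreeClassifier(max_depth=None)   # fully grown tree
        tree.fit(X[idx], y[idx])
        pred = tree.predict(X)
        # Weighted error on the full training set
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)           # AdaBoost coefficient
        w *= np.exp(-alpha * y * pred)                  # reweight examples
        w /= w.sum()                                    # renormalize
        trees.append(tree)
        alphas.append(alpha)
    return trees, alphas

def predict_ensemble(trees, alphas, X):
    """Weighted-vote prediction in {-1, +1}."""
    score = sum(a * t.predict(X) for a, t in zip(alphas, trees))
    return np.sign(score)
```

Because each tree is fully grown on a different weighted resample, the base learners are strong individually while the resampling and reweighting keep them diverse, which is the trade-off the abstract targets.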


Authors who are presenting talks have a * after their name.







The views expressed here are those of the individual authors and not necessarily those of the JSM sponsors, their officers, or their staff.
