Abstract:
Boosting is an approach to statistical learning based on the idea of creating a highly accurate prediction rule by combining many relatively weak and inaccurate rules. Boosting was a topic of great interest to Leo Breiman, with close connections to his earlier work on bagging and his later work on random forests. All of these ensemble methods have enjoyed considerable empirical success. Moreover, a remarkably rich theory has evolved over the years to explain how, why, and under what conditions these methods work. AdaBoost, the first practical boosting algorithm, has been a particular focus of study, and often of controversy, for the mystery and paradox it seems to present in this regard. Attempts to understand and "explain" AdaBoost as a learning method have included: bias-variance theory; Vapnik-Chervonenkis theory; the margins theory; AdaBoost as a loss-minimization algorithm (possibly implicitly regularized); and AdaBoost as a universally consistent method. This talk will give a high-level review and comparison of some of these, with particular emphasis on Breiman's many momentous contributions in this area.
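The idea of combining many weak rules into one accurate rule can be illustrated with a minimal sketch of AdaBoost. This is a hypothetical illustration for readers unfamiliar with the algorithm, not any reference implementation discussed in the talk; the weak learners here are assumed to be one-dimensional decision stumps, and the toy dataset is invented for the example.

```python
# Minimal AdaBoost sketch with decision stumps as the weak learners.
# Illustrative only: the dataset and stump family are assumptions of
# this example, not part of the original abstract.
import math

def stump_predict(threshold, sign, x):
    # A weak rule: predict `sign` on one side of the threshold, -sign on the other.
    return sign if x > threshold else -sign

def train_stump(xs, ys, weights):
    # Choose the (threshold, sign) pair with the lowest weighted error.
    best = None
    for threshold in xs:
        for sign in (+1, -1):
            err = sum(w for x, y, w in zip(xs, ys, weights)
                      if stump_predict(threshold, sign, x) != y)
            if best is None or err < best[0]:
                best = (err, threshold, sign)
    return best  # (weighted error, threshold, sign)

def adaboost(xs, ys, rounds=10):
    n = len(xs)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, sign)
    for _ in range(rounds):
        err, threshold, sign = train_stump(xs, ys, weights)
        if err >= 0.5:
            break  # weak learner is no better than random guessing
        err = max(err, 1e-12)  # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, threshold, sign))
        # Reweight: misclassified points gain weight, correct ones lose it.
        weights = [w * math.exp(-alpha * y * stump_predict(threshold, sign, x))
                   for x, y, w in zip(xs, ys, weights)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    # Final rule: a weighted majority vote of the weak rules.
    score = sum(alpha * stump_predict(threshold, sign, x)
                for alpha, threshold, sign in ensemble)
    return 1 if score >= 0 else -1

# Toy dataset whose labels no single stump can separate.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [+1, +1, -1, -1, +1, +1]
model = adaboost(xs, ys, rounds=20)
train_acc = sum(predict(model, x) == y for x, y in zip(xs, ys)) / len(xs)
```

No individual stump can fit these labels, yet the weighted vote over a few rounds classifies every training point correctly; it is precisely this kind of behavior, and its implications for generalization, that the theories named in the abstract try to explain.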
ASA Meetings Department
732 North Washington Street, Alexandria, VA 22314
(703) 684-1221 • meetings@amstat.org