JSM 2015 Preliminary Program

Abstract Details

Activity Number: 221
Type: Invited
Date/Time: Monday, August 10, 2015 : 2:00 PM to 3:50 PM
Sponsor: Memorial
Abstract #317916
Title: Explaining AdaBoost
Author(s): Robert Schapire*
Companies: Microsoft Research/Princeton University
Keywords:
Abstract:

Boosting is an approach to statistical learning based on the idea of creating a highly accurate prediction rule by combining many relatively weak and inaccurate rules. Boosting was a topic of great interest to Leo Breiman, with close connections to his earlier work on bagging and his later work on random forests. All of these ensemble methods have enjoyed considerable empirical success. Moreover, a remarkably rich theory has evolved over the years to try to understand how and why these methods work, and under what conditions. AdaBoost, the first practical boosting algorithm, has been a particular focus of study, and often of controversy, for the mystery and paradox it seems to present with regard to this question. Attempts to understand and "explain" AdaBoost as a learning method have included: bias-variance theory; Vapnik-Chervonenkis theory; the margins theory; AdaBoost as a loss-minimization algorithm (possibly implicitly regularized); and AdaBoost as a universally consistent method. This talk will give a high-level review and comparison of some of these, with particular emphasis on Breiman's many momentous contributions in this area.
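
The abstract's central idea, combining many weak rules into one accurate combined rule, and its mention of AdaBoost as a loss-minimization algorithm can be made concrete with a short sketch. The Python code below is purely illustrative: the synthetic data, the decision-stump weak learner, and the number of rounds are assumptions made here for demonstration and are not taken from the talk.

```python
# Minimal AdaBoost sketch with decision stumps as the weak rules.
# Labels y must be in {-1, +1}.
import numpy as np

def train_stump(X, y, w):
    """Return the (feature, threshold, polarity, error) stump minimizing weighted error."""
    n, d = X.shape
    best = (0, 0.0, 1, np.inf)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = pol * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def stump_predict(X, j, thr, pol):
    return pol * np.where(X[:, j] <= thr, 1, -1)

def adaboost(X, y, T=20):
    """Run T rounds of AdaBoost and return the weighted ensemble of stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # initial example weights D_1(i) = 1/n
    ensemble = []
    for _ in range(T):
        j, thr, pol, err = train_stump(X, y, w)
        err = max(err, 1e-12)                   # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)   # weight given to this weak rule
        pred = stump_predict(X, j, thr, pol)
        # Reweight: misclassified examples get heavier, correct ones lighter.
        # This update is coordinate descent on the exponential loss
        # sum_i exp(-y_i F(x_i)), the loss-minimization view cited in the abstract.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Combined rule: sign of the weighted vote of the weak rules."""
    F = sum(a * stump_predict(X, j, thr, pol) for a, j, thr, pol in ensemble)
    return np.sign(F)

# Tiny usage example on synthetic data (illustrative only): a diagonal decision
# boundary that no single axis-aligned stump matches, but a boosted vote fits well.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y, T=50)
print("training accuracy:", np.mean(predict(model, X) == y))
```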


Authors who are presenting talks have a * after their name.
