Abstract Details

Activity Number: 188
Type: Contributed
Date/Time: Monday, August 5, 2013: 10:30 AM to 12:20 PM
Sponsor: Section on Nonparametric Statistics
Abstract: #308605
Title: The Convergence Rate of Majority Vote Under Exchangeability
Author(s): Miles Lopes*+
Companies: UC Berkeley
Keywords: random forests; bagging; exchangeability; ensemble learning; majority vote
Abstract:

Majority vote plays a fundamental role in many applications of statistics, such as ensemble classifiers, crowdsourcing, and elections. When using majority vote as a prediction rule, it is of basic interest to ask "How many votes are needed to obtain a reliable prediction?" In the context of binary classification with random forests or bagging, we give a precise answer. If err_t denotes the test error achieved by the majority vote of t > 1 classifiers (conditionally on a fixed set of training data), and err* denotes its nominal limiting value, then under basic regularity conditions, we show that err_t = err* + c/t + o(1/t), where c is a constant given by a simple formula. More generally, we show that if V_1, V_2, ... is an exchangeable Bernoulli sequence with mixture distribution F, and the majority vote is written as M_t = median(V_1, ..., V_t), then 1 - E[M_t] = F(1/2) + F''(1/2)/(8t) + o(1/t) when F is sufficiently smooth.
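The expansion above can be checked numerically. The following is a minimal sketch (not from the abstract itself) assuming a specific mixture: an exchangeable Bernoulli sequence is generated by drawing p ~ Beta(2, 3) and then t conditionally i.i.d. Bernoulli(p) votes, so F is the Beta(2, 3) CDF, with F(1/2) = 0.6875 and F''(1/2) = f'(1/2) = -3. A Monte Carlo estimate of 1 - E[M_t] is then compared against F(1/2) + F''(1/2)/(8t).

```python
import random

def majority_vote_error(t, n_sims, rng):
    """Monte Carlo estimate of P(M_t = 0) = 1 - E[M_t] for odd t,
    where M_t is the majority vote of t exchangeable Bernoulli draws."""
    fails = 0
    for _ in range(n_sims):
        p = rng.betavariate(2, 3)                    # latent mixing parameter, p ~ Beta(2, 3)
        votes = sum(rng.random() < p for _ in range(t))
        if votes < t / 2:                            # majority of the t votes is 0
            fails += 1
    return fails / n_sims

rng = random.Random(0)
t = 101
estimate = majority_vote_error(t, 20_000, rng)
# First-order expansion from the abstract: F(1/2) + F''(1/2)/(8t).
# For Beta(2, 3): f(x) = 12x(1-x)^2, F(1/2) = 0.6875, F''(1/2) = f'(1/2) = -3.
theory = 0.6875 + (-3.0) / (8 * t)
print(f"simulated: {estimate:.4f}, predicted: {theory:.4f}")
```

With t = 101 the o(1/t) remainder and the Monte Carlo noise are both small, so the two numbers should agree to roughly two decimal places; the Beta(2, 3) choice is illustrative only (a symmetric mixture such as Beta(2, 2) would give F''(1/2) = 0 and hide the 1/t term).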
Authors who are presenting talks have a * after their name.