
All Times EDT

Abstract Details

Activity Number: 162 - Recent Development in Data Fusion
Type: Topic Contributed
Date/Time: Tuesday, August 4, 2020 : 10:00 AM to 11:50 AM
Sponsor: Section on Bayesian Statistical Science
Abstract #312338
Title: A Bayes Perspective on Model Average Predictors
Author(s): Tri Le* and Bertrand Clarke
Companies: Mercer University, Atlanta, and University of Nebraska-Lincoln
Keywords: Bagging; Bayes Model Average; Boosting; Prediction; Random forests; Stacking
Abstract:

We study the Bayesian aspects of five popular model averages for prediction in complex problems -- Bayes model averaging (BMA), stacking, bagging, random forests, and boosting. In all five cases we provide conditions under which the model average predictor performs as well as, or better than, any of its components. This is well known empirically, especially for complex problems, although few theoretical results seem to be available. In addition, we show that all five of the model averages can be regarded as Bayesian, at least in a limiting sense. BMA is Bayesian by construction. Stacking is the Bayes optimal action in an asymptotic sense under several loss functions. We show that bagging is asymptotically equivalent to a pseudo-BMA. Random forests are a special case of bagging and hence likewise asymptotically Bayes. Boosting is a limit of Bayes optimal boosting classifiers. Our work is mostly in the regression context because that is where model averaging techniques most often differ from current practice.
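To illustrate the kind of result the abstract describes, the sketch below builds a stacking-style combination of two component predictors and checks that the combined predictor's error is no worse than that of either component. This is a minimal toy example, not the authors' construction: the components (a linear and a cubic polynomial fit), the synthetic data, and the unconstrained least-squares weights are all assumptions made for illustration; a proper stacking procedure would use cross-validated component predictions and typically constrain the weights.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(-2, 2, n)
y = np.sin(x) + 0.1 * rng.normal(size=n)  # hypothetical regression data

# Two imperfect component predictors standing in for fitted models:
# a degree-1 and a degree-3 polynomial fit.
f1 = np.polyval(np.polyfit(x, y, 1), x)
f2 = np.polyval(np.polyfit(x, y, 3), x)

# Stacking-style weights: least-squares combination of the component
# predictions (unconstrained here for simplicity; real stacking would
# use held-out predictions and often nonnegative, normalized weights).
F = np.column_stack([f1, f2])
w, *_ = np.linalg.lstsq(F, y, rcond=None)
avg = F @ w

def mse(p):
    return float(np.mean((y - p) ** 2))

# Since each single component corresponds to a feasible weight vector
# ((1, 0) or (0, 1)), the least-squares combination cannot do worse
# in-sample than the best component.
assert mse(avg) <= min(mse(f1), mse(f2)) + 1e-12
```

The inequality in the final assertion is the in-sample analogue of the abstract's claim that the model average performs as well as or better than any of its components; the paper's conditions concern when this carries over to out-of-sample prediction.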


Authors who are presenting talks have a * after their name.
