JSM Preliminary Online Program
This is the preliminary program for the 2009 Joint Statistical Meetings in Washington, DC.

The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.


Activity Number: 374
Type: Contributed
Date/Time: Tuesday, August 4, 2009 : 2:00 PM to 3:50 PM
Sponsor: Section on Bayesian Statistical Science
Abstract - #303657
Title: Bayesian Adaptive Ensemble Learning
Author(s): Sounak Chakraborty*+
Companies: University of Missouri-Columbia
Address: 209F Middlebush Hall, Columbia, MO 65211
Keywords: ensemble learning ; Bayesian model ; data mining ; regression ; classification ; neural network
Abstract:

Bayesian ensemble learning, also known as the Bayesian additive regression tree model (Chipman et al., 2005), is pioneering work in adopting the philosophy of "slow learners" in a Bayesian setup. In this paper we describe two Bayesian ensemble methods, one for classification and one for regression. The first is our Bayesian adaptive ensemble tree for multi-class classification. The number of trees is not fixed; instead, it is given a prior distribution, so the number of trees required to fit the model is selected adaptively, which helps avoid overfitting. The second model is based on a neural network architecture: from an ensemble of small neural networks we develop our Bayesian ensemble network for regression. The number of networks required is likewise selected adaptively through a prior. The success of both methods is demonstrated on simulated and real data sets.
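As a rough illustration (not taken from the abstract itself), the adaptive ensemble can be viewed as a BART-style sum-of-learners model in which the ensemble size M is itself assigned a prior; the regression form and the particular truncated Poisson prior shown here are assumptions for illustration only:

\[
y_i = \sum_{j=1}^{M} g(x_i; T_j, \mu_j) + \varepsilon_i,
\qquad \varepsilon_i \sim N(0, \sigma^2),
\qquad M \sim \mathrm{Poisson}(\lambda) \ \text{truncated to} \ \{1, \dots, M_{\max}\},
\]

where \(g(x_i; T_j, \mu_j)\) is the prediction of the j-th weak learner (a small regression tree, or a small neural network in the second method) with structure \(T_j\) and terminal parameters \(\mu_j\). Placing posterior mass over M, rather than fixing it in advance, is what makes the number of trees or networks adaptive; the classification version would replace the Gaussian likelihood with a suitable multi-class link.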


  • The address information is for authors who have a + after their name.
  • Authors who are presenting talks have a * after their name.
