JSM 2004 - Toronto

Abstract #301461

This is the preliminary program for the 2004 Joint Statistical Meetings in Toronto, Canada. Currently included are the "technical" program (the schedule of invited, topic-contributed, regular contributed, and poster sessions); Continuing Education courses (August 7-10, 2004); and committee and business meetings. This online program will be updated frequently to reflect the most current revisions.


The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.





Activity Number: 386
Type: Contributed
Date/Time: Wednesday, August 11, 2004 : 2:00 PM to 3:50 PM
Sponsor: Section on Bayesian Statistical Science
Abstract - #301461
Title: Evaluation of Multilevel Decision Trees
Author(s): Erwann Rogard*+ and Andrew Gelman and Hao Lu
Companies: Columbia University, Columbia University, and Thales Corporation
Address: New York, NY 10027
Keywords: decision analysis ; hierarchical Bayes ; jackknife ; nested computation
Abstract:

The evaluation of decision trees under uncertainty is difficult because of the required nested operations of maximizing and averaging. Pure maximizing (for deterministic decision trees) and pure averaging (for probability trees) are both relatively simple because the maximum of a maximum is a maximum, and the average of an average is an average. But when the two operators are mixed, no such simplification is possible, and one must evaluate the maximizations and averagings in a nested fashion, following the structure of the tree. Nested evaluation requires large sample sizes (for data collection) or long computation times (for simulations). An alternative to full nested evaluation is to perform a random sample of evaluations and use statistical methods to perform inference about the entire tree. We show that the most natural estimate is biased and consider three alternatives: normal-theory bias correction, the jackknife, and hierarchical Bayes inference. We explore the properties of these inferences through a simulation study and discuss general approaches to the problem.
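The bias of the "natural" estimate, and the jackknife correction mentioned above, can be illustrated with a minimal simulation. This sketch is not the authors' method: it assumes a hypothetical one-level tree (a single decision node over chance nodes with invented true means), where the plug-in value — the maximum over actions of each action's sample average — tends to overestimate the true value because maximizing favors actions whose averages happen to come out high.

```python
import random
import statistics

# Hypothetical tree: one decision node over two chance nodes.
# TRUE_MEANS is invented for illustration; the true value is max = 1.0.
TRUE_MEANS = {"a": 1.0, "b": 0.9}

def draw(action, rng):
    # Chance node: noisy outcome centered on the action's true mean.
    return rng.gauss(TRUE_MEANS[action], 1.0)

def plug_in(samples):
    # The "natural" nested estimate: max over actions of the sample average.
    return max(statistics.mean(xs) for xs in samples.values())

def jackknife(samples):
    # Standard leave-one-out jackknife bias correction of the plug-in value.
    n = len(next(iter(samples.values())))
    theta = plug_in(samples)
    loo = [
        plug_in({a: xs[:i] + xs[i + 1:] for a, xs in samples.items()})
        for i in range(n)
    ]
    return n * theta - (n - 1) * statistics.mean(loo)

rng = random.Random(0)
n, reps = 20, 2000
plug_vals, jack_vals = [], []
for _ in range(reps):
    samples = {a: [draw(a, rng) for _ in range(n)] for a in TRUE_MEANS}
    plug_vals.append(plug_in(samples))
    jack_vals.append(jackknife(samples))

# The plug-in average typically exceeds the true value 1.0 (upward bias);
# the jackknife estimate pulls the average back toward it.
print(round(statistics.mean(plug_vals), 3))
print(round(statistics.mean(jack_vals), 3))
```

In this toy setup the upward bias is a selection effect: when the two actions' true means are close, the maximization picks whichever sample average is luckier. The abstract's point is that in a real multilevel tree this effect compounds through the nesting, which is what the bias-corrected and hierarchical Bayes alternatives address.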


  • The address information is for the authors that have a + after their name.
  • Authors who are presenting talks have a * after their name.


JSM 2004 For information, contact jsm@amstat.org or phone (888) 231-3473. If you have questions about the Continuing Education program, please contact the Education Department.
Revised March 2004