Activity Number: 147
Type: Contributed
Date/Time: Monday, August 12, 2002, 2:00 PM to 3:50 PM
Sponsor: Section on Bayesian Stat. Sciences*

Abstract - #301799

Title: Bayes Estimate and Inference for Entropy and Information Index of Fit
Author(s): Ehsan Soofi*+, Thomas Mazzuchi, and Refik Soyer
Affiliation(s): University of Wisconsin, Milwaukee; George Washington University; George Washington University
Address: P.O. Box 742, Milwaukee, Wisconsin, 53201, USA
Keywords: Nonparametric Bayes; Model selection; Kullback-Leibler information

Abstract:
The Akaike information criterion and its descendants serve the purpose of model comparison only; they do not provide diagnostics of model fit. We will present an overview of a new approach to Bayesian inference about model fit and parameters. This approach combines ideas that are well known in information-theoretic statistics (the maximum entropy characterization of the model) and in Bayesian statistics (the Dirichlet process prior), and is referred to as Maximum Entropy Dirichlet (MED). The procedure assumes that the data-generating distribution is unknown, uses moments to derive a tentative model, and incorporates uncertainty about the model. An estimate of the entropy of the unknown distribution is needed; we introduce a class of entropy estimates and develop a Bayes entropy estimate. The consistency of the Bayes entropy estimate and of the information index is shown. MED is a computer-intensive procedure that generates prior and posterior distributions of the entropy and of an information index for assessing model fit. As byproducts, MED also produces priors and posteriors that map uncertainty about the model parameters and the moments. Applications will be presented.
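The flavor of the procedure can be conveyed with a minimal sketch. This is not the authors' MED algorithm; it is an illustrative simplification that puts a finite-dimensional Dirichlet posterior on binned data (in place of a Dirichlet process), draws posterior samples of the entropy of the unknown distribution, and compares them with the entropy of a fitted maximum entropy model (here an exponential, the maximum entropy model under a mean constraint). All bin counts, concentration values, and sample sizes below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from an exponential distribution (the maximum entropy
# model under a mean constraint), standing in for observed data.
data = rng.exponential(scale=2.0, size=500)

# Discretize the sample into bins; the counts define a multinomial likelihood.
edges = np.linspace(0.0, data.max(), 21)
counts, _ = np.histogram(data, bins=edges)
widths = np.diff(edges)

# Symmetric Dirichlet prior on the bin probabilities; the posterior is
# again Dirichlet with parameters counts + a (conjugacy).
a = 1.0
posterior_alpha = counts + a

# Posterior draws of the bin probabilities give posterior draws of the
# differential entropy of the implied piecewise-constant density:
# H = -sum_i p_i * log(p_i / w_i), where w_i is the bin width.
draws = rng.dirichlet(posterior_alpha, size=4000)
p = np.clip(draws, 1e-12, None)
entropy_draws = -np.sum(p * np.log(p / widths), axis=1)

# Entropy of the fitted maximum entropy model: an exponential with
# mean m has differential entropy 1 + log(m).
m = data.mean()
h_model = 1.0 + np.log(m)

# Information index of fit: gap between the model's (maximal) entropy
# and the posterior entropy of the unknown distribution. Values near
# zero indicate that the moment-based model fits well.
index_draws = h_model - entropy_draws

print("posterior mean entropy:", entropy_draws.mean())
print("posterior mean information index:", index_draws.mean())
```

The posterior distribution of `index_draws` plays the role of the abstract's information index: it quantifies, with uncertainty, how far the unknown distribution falls below the entropy of the maximum entropy model characterized by the same moments.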