Activity Number: 337
Type: Contributed
Date/Time: Tuesday, July 31, 2007: 2:00 PM to 3:50 PM
Sponsor: Section on Bayesian Statistical Science
Abstract: #309934
Title: Kullback-Leibler Divergence for Design and Model Selection
Author(s): Chenpin Wang*+
Companies: University of Texas Health Science Center San Antonio
Address: 13507 Charter Bend Dr, San Antonio, TX, 78231
Keywords: Kullback-Leibler Divergence; Fisher information; optimal design; model comparison

Abstract:
We consider two Kullback-Leibler divergences (KLDs) suitable for comparing model fit or study designs in the Bayesian framework: (a) the posterior mean of a weighted KLD between the predictive distribution of a diagnostic statistic $T_n$ under a model $g$ that correctly specifies its first two moments and the predictive distribution under an assumed model $f$, and (b) the KLD between the posterior distributions of certain model parameters associated with $T_n$ as described in (a). We prove that the KLDs in (a) and (b) are asymptotically equivalent under certain regularity conditions, provided that estimation of the first moment of $T_n$ is unbiased under the assumed model $f$. The superiority of these two KLDs over both the Bayesian and frequentist D-optimality criteria is demonstrated with examples in which the assumed $f$ yields biased estimation of the second moment of $T_n$ but not of its first moment.
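For orientation only, a sketch of the generic forms such criteria can take; the weight $w(\theta)$, the predictive densities $g(\cdot \mid \theta)$ and $f(\cdot \mid \theta)$, and the posteriors $\pi_g$, $\pi_f$ below are notational assumptions for illustration, not the author's exact definitions:
\[
\mathrm{KL}\bigl(g \,\|\, f\bigr) \;=\; \int g(t)\,\log\frac{g(t)}{f(t)}\,dt ,
\]
so that a criterion of type (a) would be a posterior expectation of the form
\[
E\!\left[\, w(\theta)\,\mathrm{KL}\bigl(g(\cdot \mid \theta)\,\|\,f(\cdot \mid \theta)\bigr) \,\middle|\, \text{data} \,\right],
\]
while a criterion of type (b) compares posterior distributions directly, $\mathrm{KL}\bigl(\pi_g(\theta \mid T_n)\,\|\,\pi_f(\theta \mid T_n)\bigr)$.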