Activity Number: 176
Type: Invited
Date/Time: Monday, July 30, 2007, 2:00 PM to 3:50 PM
Sponsor: ENAR
Abstract Number: #307837
Title: Assessing the Performance of a Symmetric Divergence Information Criterion for Selecting the Best Linear Mixed Model
Author(s): Lloyd J. Edwards*+ and Anita Abraham
Companies: The University of North Carolina at Chapel Hill
Address: Dept. of Biostatistics, 3105H McGavran-Greenberg, Chapel Hill, NC 27599-7420
Keywords: Information Criterion; Directed Divergence; Symmetric Divergence; Model Selection; Mixed Model

Abstract:
The AIC and BIC are the most popular information criteria for model selection in the linear mixed model. Both criteria are based on directed divergences, i.e., asymmetric divergences for discriminating, from observations on a true model, in favor of the true model against an approximating model. Reversing the roles of the true and approximating models provides an alternate directed divergence, and the sum of the two directed divergences forms Kullback's symmetric divergence. Using simulation studies, we assess the performance of KIC, a form of Kullback's symmetric divergence criterion proposed by Cavanaugh (1999), for selecting the best linear mixed model. We consider KIC for selecting the best mean model and, separately, the best covariance model in the linear mixed model.
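
For readers less familiar with the terminology, a brief sketch in standard notation (not drawn from the paper itself): writing f for the density of the true model and g for that of the approximating model,

  I(f, g) = \int f(y) \log \frac{f(y)}{g(y)} \, dy, \qquad
  I(g, f) = \int g(y) \log \frac{g(y)}{f(y)} \, dy, \qquad
  J(f, g) = I(f, g) + I(g, f).

AIC and BIC target the first directed divergence I(f, g); Kullback's symmetric divergence J(f, g) combines both directions. A commonly cited statement of Cavanaugh's (1999) KIC is -2 \log L(\hat{\theta}) + 3(k + 1), with k the number of estimated parameters, though the exact penalty should be taken from the original paper.

Below is a minimal Python sketch, not the authors' simulation design, assuming statsmodels' MixedLM and the KIC penalty form above: it simulates random-intercept longitudinal data, fits three candidate mean models by maximum likelihood, and compares AIC, BIC, and KIC.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2007)

# Simulate longitudinal data from a random-intercept model with a linear time trend.
n_subj, n_obs = 50, 5
subj = np.repeat(np.arange(n_subj), n_obs)
time = np.tile(np.arange(n_obs, dtype=float), n_subj)
b0 = rng.normal(0.0, 1.0, n_subj)                     # subject-level random intercepts
y = 2.0 + 0.5 * time + b0[subj] + rng.normal(0.0, 1.0, subj.size)
data = pd.DataFrame({"y": y, "time": time, "subj": subj})

def info_criteria(result, n):
    """AIC, BIC, and a KIC-style criterion from a maximum-likelihood mixed-model fit."""
    q = result.cov_re.shape[0]                        # dimension of the random effects
    k = len(result.fe_params) + q * (q + 1) // 2 + 1  # fixed effects + covariance params + residual variance
    ll = result.llf
    return {"AIC": -2 * ll + 2 * k,
            "BIC": -2 * ll + np.log(n) * k,
            "KIC": -2 * ll + 3 * (k + 1)}             # assumed penalty form (see note above)

# Candidate mean models sharing the same random-intercept covariance structure.
for formula in ["y ~ 1", "y ~ time", "y ~ time + I(time**2)"]:
    fit = smf.mixedlm(formula, data, groups=data["subj"]).fit(reml=False)
    print(formula, info_criteria(fit, n=len(data)))

Comparing covariance structures would proceed the same way, holding the mean model fixed and refitting under each candidate random-effects or residual covariance specification.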