Abstract:
|
We propose a new information criterion for model selection that combines the benefits of the two well-known model selection techniques, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). For a well-specified model class, BIC is typically consistent, and so is the new criterion. For a mis-specified model class, AIC is known to be asymptotically efficient in the sense that its predictive performance is asymptotically equivalent to the best offered by the candidate models; in this case, the new criterion behaves similarly. Whereas the optimality of AIC and BIC depends on whether the model class is correctly specified, the proposed criterion is automatically consistent in well-specified settings and asymptotically efficient in mis-specified settings. In practice, where the observed data are given without prior knowledge of the model specification, the proposed criterion is therefore more flexible and reliable than the classical approaches. We also extend the criterion to high-dimensional variable selection, where the sample size is much smaller than the number of variables.
|