Abstract:
|
Since Akaike's pioneering work forty-three years ago (e.g., Akaike, 1973; Bozdogan, 1987), several approaches to model selection have been developed, and they continue to be refined. Current work in this area treats information-theoretic concepts and Bayesian approaches to model selection separately; a unifying theory does not seem to exist in the model selection literature. In this paper we present a new approach to unify Information-theoretic (IT) and Bayesian Model Selection (BMS) criteria. We achieve this by deriving a new general form of the expectation of the quadratic variation term in the Taylor series approximation to the Kullback-Leibler (KL) (1951) information, which generalizes the original derivation of Akaike's (1973) classic information criterion, AIC. Our starting point is an extension of the CAICF criterion of Bozdogan (1987), which we call the Extended Consistent AIC with Fisher Information Matrix (CAICFE). We then follow the same arguments as in CAICFE within the Bayesian framework and introduce a new BMS criterion. We illustrate our new approach on both real and simulated data sets and show the utility and versatility of these new criteria.
|
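For background, the classical quantities cited in the abstract are commonly written in the following standard forms (a sketch of the usual textbook statements attributed to Kullback-Leibler, 1951; Akaike, 1973; and Bozdogan, 1987; the paper's own notation and its new CAICFE and BMS criteria may differ):
\[
\mathrm{KL}(f \,\|\, g_\theta) = \int f(x)\,\log\frac{f(x)}{g(x \mid \theta)}\,dx,
\qquad
\mathrm{AIC} = -2\log L(\hat\theta) + 2k,
\]
\[
\mathrm{CAICF} = -2\log L(\hat\theta) + k\,(\log n + 2) + \log\bigl|\hat{\mathcal{F}}(\hat\theta)\bigr|,
\]
where $k$ is the number of estimated parameters, $n$ the sample size, $\hat\theta$ the maximum likelihood estimate, and $\hat{\mathcal{F}}(\hat\theta)$ the estimated Fisher information matrix.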