Abstract:
|
Model selection plays a central role in statistical data analysis. The Akaike Information Criterion (AIC) and its finite-sample version AICc, as well as the Bayesian Information Criterion (BIC), are model selection rules used extensively in practice. AIC and its variants rely on the Kullback-Leibler distance as a loss function. Another appropriate loss function for model selection is the quadratic distance between distributions (Lindsay, Markatou and Ray, 2014). An appealing feature of quadratic distances between distributions is that they approximate a variety of other distances, including the Kullback-Leibler distance. A method for model selection that we have proposed, the Quadratic Information Criterion (QIC), is derived as an estimator of the relative quadratic risk. Furthermore, the freedom to choose a kernel function and a tuning parameter makes quadratic distances, and hence the criteria based on them, flexible and adjustable. Here, we write QIC in the form model term + penalty. We then propose a rule for choosing the tuning parameter and study its effect on the QIC model selection procedure via simulation and application to real data.
|