Abstract:
|
Heteroscedasticity is common in real-world applications and is often handled by incorporating case weights into a modeling procedure. Intuitively, models fitted with different weighting schemes would have different levels of complexity depending on how well the weights match the inverses of the error variances. However, existing model complexity measures have been studied primarily under the assumption of equal error variances. In this work, we consider linear regression procedures and extend the classical model degrees of freedom and the predictive model degrees of freedom (a recently proposed measure of model complexity for out-of-sample prediction) to a heteroscedastic setting. Our analysis of the weighted least squares method reveals some interesting properties of the extended measures. In particular, we find that they depend on both the weights used for model fitting and those used for model evaluation. Moreover, modeling heteroscedastic data with equal weights generally results in more degrees of freedom than with the optimal weights. This provides additional insight into weighted modeling procedures that is useful in risk estimation and model selection.
|