Abstract:
|
With its capability of reducing model variance while retaining bias, model averaging presents a compelling alternative to model selection for tackling model uncertainty. We propose applying the Mallows model averaging (MMA) technique suggested by Hansen (2007), which is based on minimizing the Mallows criterion, to a set of support vector machines (SVMs), for both classification and regression. We compare the mean squared error (MSE) of the MMA estimator with those of models averaged or selected using other information criteria such as AIC and BIC. Although no single approach dominates across the range of sample sizes and signal-to-noise ratios considered, model averaging proves no less competitive than model selection, and the MMA estimator performs better in smaller-sample and higher signal-to-noise settings. Theoretical underpinnings and an illustrative application are also presented.
|