Abstract: |
Leave-one-out cross-validation (LOOCV) is a popular and powerful tool for selecting tuning parameters and estimating the prediction error on future observations. Because it requires repeated model fits, LOOCV is usually expensive to compute. To lessen this burden, a closed-form LOOCV error has been derived for least squares regression; for classification, however, the existing fast LOOCV methods are either complicated, narrowly applicable, or unable to yield the exact LOOCV error. In this work, we propose a magic LOOCV for margin-based classifiers such as the support vector machine and logistic regression. Our proposal is surprisingly simple, and it readily generalizes to k-fold and delete-v cross-validation. We apply the magic cross-validation to kernel smooth margin classifiers. While giving identical results, our new method is significantly faster than the classical approach on extensive benchmark data sets. Beyond these computational advantages, we also demonstrate that the magic LOOCV can be used to evaluate generalization error bounds in theory.
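For context, here is a minimal sketch of the classical closed-form LOOCV identity for least squares regression that the abstract references. This is the standard hat-matrix shortcut, not the paper's magic LOOCV; the data and variable names are illustrative assumptions.

```python
# Classical closed-form LOOCV for least squares (illustration only).
# With hat matrix H = X (X^T X)^{-1} X^T, the leave-one-out residual is
#   y_i - yhat_{-i} = (y_i - yhat_i) / (1 - H_ii),
# so the LOOCV error needs one fit instead of n refits.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# Single full-data fit.
beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta
# Leverages H_ii = diag(X (X^T X)^{-1} X^T).
h = np.einsum("ij,jk,ik->i", X, np.linalg.inv(X.T @ X), X)

loocv_fast = np.mean((resid / (1 - h)) ** 2)

# Brute-force check: n separate refits, each leaving one point out.
errs = []
for i in range(n):
    mask = np.arange(n) != i
    b = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    errs.append((y[i] - X[i] @ b) ** 2)
loocv_slow = np.mean(errs)

assert np.isclose(loocv_fast, loocv_slow)  # agree up to float error
```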