
Abstract Details

Activity Number: 461 - SPEED: Machine Learning
Type: Contributed
Date/Time: Wednesday, August 2, 2017 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #322665
Title: Magic Cross-Validation with Applications in Kernel Smooth Margin Classifiers
Author(s): Boxiang Wang* and Hui Zou
Companies: University of Minnesota
Keywords: Cross-validation ; Kernel logistic regression ; Fast leave-one-out ; Magic CV ; Margin-based classification
Abstract:

Leave-one-out cross-validation (LOOCV) is a popular and powerful tool for selecting tuning parameters and estimating the prediction error on future observations. Because it requires repeated model fits, LOOCV is usually expensive to compute. To lessen this burden, a closed-form LOOCV error has been derived for least squares regression; for classification, however, the existing fast LOOCV methods are either complicated, narrowly applicable, or unable to yield the exact LOOCV error. In this work, we propose a magic LOOCV for margin-based classifiers, e.g., the support vector machine and logistic regression. Our proposal is surprisingly simple, and it readily generalizes to k-fold and delete-v cross-validation. We apply magic cross-validation to kernel smooth margin classifiers. While giving identical results, our new method is significantly faster than the classic approach on extensive benchmark data sets. Beyond its computational advantages, we also demonstrate that the magic LOOCV can be used to evaluate a generalization error bound in theory.
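As background for the closed-form least-squares LOOCV mentioned above (not the paper's magic CV for classifiers, which is not reproduced here), the following sketch illustrates the classic identity: for least squares, the leave-one-out residual is e_i = r_i / (1 - h_ii), where r_i is the full-fit residual and h_ii the leverage, so the exact LOOCV error needs only one model fit. All data here are simulated for illustration.

```python
import numpy as np

# Simulated regression data (illustration only).
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
y = X @ rng.normal(size=p + 1) + rng.normal(size=n)

# One full least-squares fit gives the exact LOOCV error:
# e_i^{loo} = r_i / (1 - h_ii), with h_ii = diag of the hat matrix.
beta = np.linalg.lstsq(X, y, rcond=None)[0]
r = y - X @ beta
h = np.einsum("ij,ji->i", X, np.linalg.solve(X.T @ X, X.T))  # leverages
loocv_fast = np.mean((r / (1 - h)) ** 2)

# Brute-force check: refit n times, leaving one observation out each time.
errs = []
for i in range(n):
    mask = np.arange(n) != i
    b = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    errs.append((y[i] - X[i] @ b) ** 2)
loocv_slow = np.mean(errs)

print(loocv_fast, loocv_slow)  # the two estimates coincide
```

One fit versus n fits: the closed form is exact, which is precisely the property the abstract's fast LOOCV methods for classification generally lack.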


Authors who are presenting talks have a * after their name.

Back to the full JSM 2017 program

Copyright © American Statistical Association