Abstract Details

Activity Number: 697
Type: Contributed
Date/Time: Thursday, August 4, 2016 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #319467
Title: Consistency of Penalized Cross-Validation for Model Selection
Author(s): Jieyi Jiang* and Steven N. MacEachern and Yoonkyung Lee
Companies: The Ohio State University
Keywords: cross-validation ; model selection ; penalized method ; consistency ; regression

Foldwise cross-validation is widely used to identify the structure of a linear model. However, it is inconsistent for model selection: asymptotically, it tends to select overfitted models. Current regularized regression methods pair a loss function with a penalty for model fitting. A penalty can serve not only in model fitting but also in model evaluation. We propose the Penalized Cross-Validation Criterion, in which a suitable penalty term is added to the cross-validation score to ensure consistent model selection. Under squared error loss, we derive sufficient conditions on the penalty for consistency, balancing the false selection rates of overfitting and underfitting models. We extend the result to a more general class of loss functions. Simulation studies demonstrate the advantage of penalized cross-validation in model selection.
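The abstract's idea of adding a penalty to a cross-validation score can be sketched in a few lines. The paper's actual penalty and its sufficient conditions are not stated in the abstract, so the sketch below uses a hypothetical BIC-like rate, lam_n = log(n)/n per selected predictor, purely to illustrate the mechanics: compute a k-fold CV estimate of squared error for each candidate model, add the penalty, and select the minimizer.

```python
# Illustrative sketch only: the penalty lam_n = log(n)/n below is an
# assumed, BIC-like choice, not the one from the paper.
import numpy as np

def cv_score(X, y, folds=5):
    """k-fold CV estimate of the mean squared prediction error of OLS."""
    n = len(y)
    idx = np.arange(n)
    errors = []
    for f in range(folds):
        test = idx[f::folds]             # simple deterministic fold split
        train = np.setdiff1d(idx, test)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errors.append(np.mean((y[test] - X[test] @ beta) ** 2))
    return np.mean(errors)

def penalized_cv_select(X, y, candidate_models, folds=5):
    """Pick the candidate (tuple of column indices) minimizing
    CV score + lam_n * (model size)."""
    n = len(y)
    lam_n = np.log(n) / n                # hypothetical penalty rate
    scores = {m: cv_score(X[:, list(m)], y, folds) + lam_n * len(m)
              for m in candidate_models}
    return min(scores, key=scores.get)

# Toy example: the true model uses only the first two of four predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * rng.standard_normal(200)
models = [(0,), (0, 1), (0, 1, 2), (0, 1, 2, 3)]
best = penalized_cv_select(X, y, models)
```

Without the penalty, the CV scores of the nested models differ only by small fluctuations, so overfitted candidates are routinely selected; the added term makes each extra predictor cost roughly log(n)/n, which dominates those fluctuations as n grows.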

Authors who are presenting talks have a * after their name.

Back to the full JSM 2016 program

Copyright © American Statistical Association