
All Times EDT

Abstract Details

Activity Number: 221 - Topics on Deep Learning
Type: Invited
Date/Time: Wednesday, August 11, 2021, 10:00 AM to 11:50 AM
Sponsor: IMS
Abstract #316960
Title: Overparametrization in Linear Models, and the Uniform Consistency of Cross-Validation for Ridge Regression
Author(s): Pratik Patil* and Ryan Tibshirani and Yuting Wei and Alessandro Rinaldo
Companies: Carnegie Mellon University (all authors)
Keywords: ridge regression; cross-validation; linearized neural network
Abstract:

Interpolators---estimators that achieve zero training error---have attracted growing attention in machine learning, mainly because state-of-the-art neural networks appear to be models of this type. We study ridge regression in overparametrized linear models, and the "ridgeless" least squares estimator defined by taking the ridge tuning parameter to zero. We show that, under proportional asymptotics, both generalized cross-validation and leave-one-out cross-validation are consistent for estimating the true out-of-sample prediction error of ridge regression, uniformly over a range of tuning parameter values that can include zero (and even negative values). We discuss implications of this result for parameter tuning, and possible extensions, e.g., to nonlinear feature models (inspired by "linearized" two-layer neural networks).
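To make the two criteria in the abstract concrete, the sketch below evaluates leave-one-out cross-validation (LOOCV) and generalized cross-validation (GCV) for ridge regression over a grid of tuning parameters, using the standard hat-matrix shortcuts so that neither requires refitting n times. This is an illustrative sketch with simulated data, not the authors' code; the function name and the ridge scaling convention (penalty lam on the identity, no factor of n) are assumptions.

```python
import numpy as np

def ridge_cv_curves(X, y, lams):
    """Shortcut LOOCV and GCV for ridge regression on a grid of penalties.

    For each lam, forms the hat matrix H(lam) = X (X'X + lam I)^{-1} X'
    and applies the closed-form identities:
      LOOCV(lam) = mean_i [ (y_i - yhat_i) / (1 - H_ii) ]^2
      GCV(lam)   = mean_i (y_i - yhat_i)^2 / (1 - tr(H)/n)^2
    Valid for lam > 0 with H_ii < 1 for all i.
    """
    n, p = X.shape
    results = []
    for lam in lams:
        # Hat matrix of the ridge fit at this penalty level.
        H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
        resid = y - H @ y                 # training residuals
        h = np.diag(H)                    # leverages H_ii
        loocv = np.mean((resid / (1.0 - h)) ** 2)
        gcv = np.mean(resid ** 2) / (1.0 - np.trace(H) / n) ** 2
        results.append((lam, loocv, gcv))
    return results

# Simulated overparametrization-flavored example (p close to n).
rng = np.random.default_rng(0)
n, p = 50, 40
X = rng.standard_normal((n, p)) / np.sqrt(p)
beta = rng.standard_normal(p)
y = X @ beta + rng.standard_normal(n)
for lam, loocv, gcv in ridge_cv_curves(X, y, [0.01, 0.1, 1.0]):
    print(f"lam={lam:5.2f}  LOOCV={loocv:.3f}  GCV={gcv:.3f}")
```

The abstract's consistency result says that, under proportional asymptotics, curves like these track the true out-of-sample error uniformly over the tuning range, which justifies picking lam by minimizing either criterion.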


Authors who are presenting talks have a * after their name.

Back to the full JSM 2021 program