
Abstract Details

Activity Number: 107 - JASA, Theory and Methods
Type: Invited
Date/Time: Monday, July 30, 2018, 8:30 AM to 10:20 AM
Sponsor: JASA, Theory and Methods
Abstract #330977
Title: From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation
Author(s): Saharon Rosset* and Ryan Tibshirani
Companies: Tel Aviv University and Carnegie Mellon University
Abstract:

Much of the literature on model selection and evaluation in statistical prediction makes the "Fixed-X" assumption, where covariate values are treated as nonrandom. However, it is often better to take a "Random-X" view, in which covariates are drawn independently for training and prediction. We propose a decomposition of Random-X prediction error in which the randomness in the covariates contributes to both bias and variance. The decomposition is general, but we focus on ordinary least squares (OLS), where the move from Fixed-X to Random-X increases both bias and variance. When the covariates are normally distributed and there is no bias, the Random-X error of OLS is explicitly computable, which yields an extension of Mallows' Cp that we call RCp; the same formula also holds asymptotically for certain classes of non-normal covariates. For the excess bias, we propose an estimate based on ordinary cross-validation (OCV), resulting in a method we call RCp+. Theory and simulations suggest that RCp+ is typically superior to OCV, though the difference is small. Moving beyond OLS, we obtain the surprising result that, for ridge regression in the heavily regularized regime, Random-X prediction error can be smaller than Fixed-X error.
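
As an illustration of the Fixed-X versus Random-X distinction described above, the following Python sketch (not part of the abstract) simulates unbiased OLS with standard-normal covariates and compares Monte Carlo estimates of the two prediction errors with the classical closed forms sigma^2 * (1 + p/n) for Fixed-X and sigma^2 * (n - 1)/(n - p - 1) for Random-X under a Gaussian design. The sample size n, dimension p, and coefficient vector beta are arbitrary choices made for the demonstration, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p, sigma, reps = 100, 10, 1.0, 2000   # illustrative settings, not from the paper
    beta = rng.normal(size=p)                # true coefficients (model is linear, so no bias)

    fixed_err = random_err = 0.0
    for _ in range(reps):
        X = rng.normal(size=(n, p))                  # training covariates, N(0, I)
        y = X @ beta + sigma * rng.normal(size=n)
        bhat = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS coefficients
        # Fixed-X error: same covariate values, fresh noise
        y_fixed = X @ beta + sigma * rng.normal(size=n)
        fixed_err += np.mean((y_fixed - X @ bhat) ** 2)
        # Random-X error: covariates redrawn independently for prediction
        X0 = rng.normal(size=(n, p))
        y_rand = X0 @ beta + sigma * rng.normal(size=n)
        random_err += np.mean((y_rand - X0 @ bhat) ** 2)

    print("Fixed-X  error: %.4f  (theory %.4f)" % (fixed_err / reps, sigma**2 * (1 + p / n)))
    print("Random-X error: %.4f  (theory %.4f)" % (random_err / reps, sigma**2 * (n - 1) / (n - p - 1)))

With these settings the Random-X error exceeds the Fixed-X error by a small amount (roughly 1.11 versus 1.10), consistent with the abstract's statement that moving from Fixed-X to Random-X increases the variance of OLS.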


Authors who are presenting talks have a * after their name.
