
All Times EDT

Abstract Details

Activity Number: 416 - SLDS CSpeed 7
Type: Contributed
Date/Time: Thursday, August 12, 2021, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #318587
Title: Comparing Six Shrinkage Estimators with Large Sample Theory and Asymptotically Optimal Prediction Intervals
Author(s): Lasanthi Watagoda* and David Olive
Companies: Appalachian State University and Southern Illinois University
Keywords: Forward Selection; Lasso; PLS; Principal components regression; Ridge regression
Abstract:

Consider the multiple linear regression model Y = β1 + β2 x2 + ··· + βp xp + e = x^T β + e with sample size n. This paper compares six shrinkage estimators: forward selection, lasso, partial least squares, principal components regression, lasso variable selection, and ridge regression, with large sample theory and two new prediction intervals that are asymptotically optimal if the estimator β̂ is a consistent estimator of β. Few prediction intervals have been developed for p > n, and they are not asymptotically optimal. For p fixed, the large sample theory for variable selection estimators like forward selection is new, and the theory shows that lasso variable selection is √n consistent under much milder conditions than lasso. This paper also simplifies the proofs of the large sample theory for lasso, ridge regression, and elastic net.
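The abstract pairs a shrinkage fit with a prediction interval whose width is calibrated from residuals. The sketch below is not the authors' method: it is a minimal illustration, assuming synthetic data, a closed-form ridge fit (one of the six estimators named above), and a split-style interval built from a quantile of absolute calibration residuals.

```python
import numpy as np

# Synthetic data: sparse true beta, Gaussian noise (illustrative assumption).
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]
y = 1.0 + X @ beta + rng.normal(size=n)

# Split the sample: fit on one half, calibrate the interval on the other.
Xtr, ytr = X[:100], y[:100]
Xcal, ycal = X[100:], y[100:]

def ridge_fit(X, y, lam):
    """Closed-form ridge regression with an unpenalized intercept."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    b = np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ yc)
    b0 = y.mean() - X.mean(axis=0) @ b
    return b0, b

b0, b = ridge_fit(Xtr, ytr, lam=1.0)

# Calibrate a 95% interval half-width from absolute calibration residuals.
resid = np.abs(ycal - (b0 + Xcal @ b))
q = np.quantile(resid, 0.95)

# Prediction interval for a new point x: [b0 + x @ b - q, b0 + x @ b + q].
x_new = rng.normal(size=p)
lo, hi = b0 + x_new @ b - q, b0 + x_new @ b + q
```

If the fitted estimator is consistent for β, the residual quantile tracks the noise quantile as n grows, which is the sense in which such intervals can approach optimal width; the paper's intervals and theory are more general than this toy split.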


Authors who are presenting talks have a * after their name.
