Abstract:
|
The generalized cross-validation (GCV) method introduced by Craven and Wahba (1979) is popular for selecting tuning parameters in smoothing and other penalized models, such as ridge regression and the Lasso. Its ease of computation and robustness make it highly competitive. However, the trace of the projection matrix in the GCV reflects an implicit assumption of linearity. Consequently, GCV may not perform well for nonlinear shrinkage estimators, such as the Lasso. We introduce a nonlinear GCV that accommodates the nonlinearity by computing the effective number of parameters, using the standard shrinkage rate of Tibshirani (1996); it takes both linear and nonlinear shrinkage effects into consideration. We also introduce a quasi-GCV for regression models lacking a joint likelihood, such as the penalized estimating equations of Fu (2001), by generalizing the model deviance to a weighted deviance. We demonstrate the nonlinear GCV through a linear regression model and the quasi-GCV through a GEE model, using the Lasso estimator in both.
|