JSM 2012 Online Program

Abstract Details

Activity Number: 178
Type: Contributed
Date/Time: Monday, July 30, 2012 : 10:30 AM to 12:20 PM
Sponsor: Biometrics Section
Abstract - #304751
Title: On Optimized Shrinkage Variable Selection in Generalized Linear Models
Author(s): Erin Melcon*+
Companies: University of California at Davis
Address: 372 Danbury Circle, Vacaville, CA, 95687, United States
Keywords: Adaptive Lasso; Bootstrap; Generalized Linear Models; Lasso
Abstract:

The lasso (Tibshirani, 1996) and adaptive lasso (Zou, 2006) have become popular methods of model selection in linear models (LMs). More recently, Hastie and Park (2006) extended the lasso and adaptive lasso to generalized linear models (GLMs). Both methods penalize the regression coefficients and shrink a subset of them exactly to zero, which is what makes them attractive for model selection. A difficulty for any penalization procedure is deciding how much shrinkage to apply to the coefficients, that is, choosing the "best" value of the penalization parameter. Current methods for selecting the penalty parameter in GLMs are cross-validation and information criteria. In this paper, we extend the idea of the fence method (Jiang et al.) to GLMs and show that the proposed bootstrap method outperforms cross-validation and the information criteria.
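The abstract gives no software or notation, so the following is only a minimal illustrative sketch of the baseline it compares against: fitting an L1-penalized GLM (here, logistic regression) and selecting the penalty strength by cross-validation. The penalized fit minimizes the negative log-likelihood plus lambda times the sum of |beta_j| (the lasso; the adaptive lasso replaces the unit weights with data-driven weights). The sketch uses Python with scikit-learn and simulated data, all of which are my assumptions; the paper's proposed fence/bootstrap selection procedure is not reproduced here.

import numpy as np
from sklearn.linear_model import LogisticRegressionCV

# Simulated data with a sparse true coefficient vector (illustration only).
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.array([1.5, -2.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta)))

# The L1 penalty shrinks a subset of coefficients exactly to zero (the lasso).
# Cs is a grid of inverse penalty strengths; 5-fold cross-validation picks one.
model = LogisticRegressionCV(Cs=20, cv=5, penalty="l1", solver="liblinear")
model.fit(X, y)

print("selected inverse penalty C:", model.C_[0])
print("estimated coefficients:", model.coef_.round(2))

The paper's contribution is to replace the cross-validation step above with a bootstrap-based, fence-style criterion for choosing the penalty parameter.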


The address information is shown for authors who have a + after their name.
Authors who are presenting talks have a * after their name.
