Abstract Details

Activity Number: 80
Type: Contributed
Date/Time: Sunday, August 3, 2014, 4:00 PM to 5:50 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract #312176
Title: Penalized Regression and Penalty Parameter Selection on High-Dimensional Data
Author(s): Peng Yang*+, Soumendra Lahiri, and Shuva Gupta
Companies: North Carolina State University (all authors)
Keywords: high-dimensional; penalty

Abstract:
We investigate penalized regression and the selection of penalty parameters in high-dimensional settings (p >> n). We first propose a general class of penalty functions that includes SCAD and MCP, and prove that the resulting estimators enjoy the oracle property under general conditions. A popular assumption in the literature is that the smallest nonzero coefficient grows faster than a certain rate; we also explore the case in which this assumption is not satisfied. Furthermore, we compare the bias and variance of several popular methods, and demonstrate the effects that small coefficients have on the bias and the covariance matrix. Finally, we show that a modified BIC-type information criterion is preferable to BIC for penalty parameter selection in high-dimensional settings.
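As a point of reference for the methods named in the abstract, the sketch below implements the standard SCAD (Fan and Li, 2001) and MCP (Zhang, 2010) penalty functions, together with a BIC-type criterion carrying an extra log(log p) factor in the spirit of modified/extended BICs for large p. These are textbook forms for illustration only; they are not claimed to be the authors' exact penalty class or their specific information criterion.

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty p_lambda(|t|); a = 3.7 is the conventional choice (a > 2)."""
    t = np.abs(t)
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    return np.where(
        small, lam * t,                                   # linear near zero (lasso-like)
        np.where(mid,
                 (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),  # quadratic transition
                 lam**2 * (a + 1) / 2))                   # constant: no bias for large |t|

def mcp_penalty(t, lam, gamma=3.0):
    """MCP penalty; gamma > 1 controls how fast the penalization tapers off."""
    t = np.abs(t)
    return np.where(t <= gamma * lam,
                    lam * t - t**2 / (2 * gamma),         # concave ramp
                    gamma * lam**2 / 2)                   # flat beyond gamma * lambda

def modified_bic(rss, n, df, p):
    """A BIC-type criterion with an extra log(log p) inflation of the model-size
    penalty -- one common high-dimensional modification; an assumption here,
    not the authors' exact criterion."""
    return n * np.log(rss / n) + df * np.log(n) * np.log(np.log(p))
```

Both penalties are continuous (e.g., SCAD equals lambda^2 at |t| = lambda from either branch) and flatten out for large coefficients, which is what removes the asymptotic bias that the plain lasso penalty incurs; the log(log p) factor makes the model-size penalty grow with the dimension, which is why such criteria tend to select the penalty parameter more conservatively than ordinary BIC when p >> n.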
Authors who are presenting talks have a * after their name.