Abstract Details
Activity Number: 509
Type: Contributed
Date/Time: Wednesday, August 1, 2012, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract: #306242
Title: Variable Selection via L1 Penalized LAD Method
Author(s): Lie Wang*+
Companies: Massachusetts Institute of Technology
Address: 77 Massachusetts Avenue, Cambridge, MA, 02139, United States
Keywords: variable selection; L1 penalized LAD; stable; Lasso; high-dimensional sparse model

Abstract:
We consider the high-dimensional sparse linear regression model, in which the total number of variables exceeds the number of observations. We investigate the L1 penalized least absolute deviation (LAD) method. Unlike most other methods, the L1 penalized LAD method requires no knowledge of the standard deviation of the noise and no moment assumptions on the noise. Our analysis shows that the method achieves near oracle performance, i.e., with large probability, the L2 norm of the estimation error is of order $\sqrt{k \log p/n}$. The result holds for a wide range of noise distributions, including the Cauchy distribution.
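As a rough illustration of the type of estimator discussed above (a sketch, not the author's implementation), the L1 penalized LAD problem, minimize $\|y - Xb\|_1 + \lambda\|b\|_1$ over $b$, can be cast as a linear program by splitting $b$ into positive and negative parts and bounding the absolute residuals. The choice of `lam` below is ad hoc, loosely following the $\sqrt{n \log p}$ scaling suggested by the rate in the abstract:

```python
import numpy as np
from scipy.optimize import linprog

def lad_lasso(X, y, lam):
    """Solve min_b ||y - X b||_1 + lam * ||b||_1 as a linear program.

    Decision vector z = [b_plus (p), b_minus (p), u (n)], all nonnegative,
    where b = b_plus - b_minus and u bounds |y - X b| elementwise.
    """
    n, p = X.shape
    I = np.eye(n)
    # Objective: sum of absolute residuals plus lam times the L1 norm of b.
    c = np.concatenate([lam * np.ones(2 * p), np.ones(n)])
    # Constraints:  y - X b <= u   and   X b - y <= u
    A_ub = np.vstack([
        np.hstack([-X,  X, -I]),
        np.hstack([ X, -X, -I]),
    ])
    b_ub = np.concatenate([-y, y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    z = res.x
    return z[:p] - z[p:2 * p]

# Toy sparse model: n = 50 observations, p = 20 variables, k = 3 nonzeros.
rng = np.random.default_rng(0)
n, p, k = 50, 20, 3
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 2.0
y = X @ beta + rng.standard_normal(n)

lam = np.sqrt(n * np.log(p))  # ad hoc tuning choice for this sketch
bhat = lad_lasso(X, y, lam)
```

Because LAD replaces squared error with absolute error, no estimate of the noise standard deviation enters the tuning of `lam`, which is the practical appeal noted in the abstract; the same code runs unchanged under heavy-tailed noise.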
* indicates a presenting author; + indicates the author whose address is shown.