Abstract Details
Activity Number: 80
Type: Contributed
Date/Time: Sunday, August 3, 2014, 4:00 PM to 5:50 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract #311187
Title: Lasso with Long Memory Regression Errors
Author(s): Abhishek Kaul*+
Companies: Michigan State University
Keywords: Lasso; Long memory dependence; Sign consistency; Oracle inequality; Asymptotic normality
Abstract: The Lasso is a computationally efficient approach to model selection and estimation, and its properties are well studied in the linear regression setup with i.i.d. errors. We study the case where the regression errors form a long memory moving average process. When the design is non-random, we establish a finite sample oracle inequality for the Lasso solution, and we then show its asymptotic sign consistency. These results are established in the high dimensional setup (p > n), where p (the number of parameters) may grow exponentially with n (the sample size). Finally, we show the consistency and asymptotic normality of the Lasso when p is fixed and less than n. The performance of the Lasso in this setup is also examined in a simulation study.
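The setup the abstract describes can be sketched in a few lines: a sparse linear model y = X·beta + e whose errors e are a moving average with hyperbolically decaying weights (the standard construction of a long memory process), fit by the Lasso. This is only an illustrative sketch of that setup, not the paper's simulation study; the choices of n, p, d, the sparse beta, the truncation length, and the Lasso penalty alpha are all assumptions.

```python
# Minimal sketch (assumed parameters, not from the paper): sparse linear
# regression with long memory moving average errors, estimated by Lasso.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

def long_memory_errors(n, d=0.3, n_lags=500):
    """Truncated moving average e_t = sum_j a_j z_{t-j} with weights
    a_j ~ (j + 1)^(d - 1) for 0 < d < 1/2, which yields long memory
    (slowly, hyperbolically decaying autocovariances)."""
    a = (np.arange(n_lags) + 1.0) ** (d - 1.0)
    z = rng.standard_normal(n + n_lags)
    # e_t averages the n_lags innovations up to and including time t
    e = np.array([a @ z[t:t + n_lags][::-1] for t in range(n)])
    return e / e.std()  # scale errors to unit standard deviation

n, p = 200, 50
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]            # sparse truth: 3 active coefficients
X = rng.standard_normal((n, p))        # non-random design, fixed by the seed
y = X @ beta + long_memory_errors(n)

fit = Lasso(alpha=0.1).fit(X, y)
support = np.flatnonzero(fit.coef_)    # indices the Lasso keeps
```

Sign consistency in this setting corresponds to `support` matching the true active set {0, 1, 2} with the correct coefficient signs as n grows; rerunning over many seeds and values of d would mimic a simulation study.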
Authors who are presenting talks have a * after their name.
The views expressed here are those of the individual authors and not necessarily those of the JSM sponsors, their officers, or their staff.
Copyright © American Statistical Association.