Activity Number: 446
Type: Invited
Date/Time: Wednesday, August 1, 2007, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing

Abstract - #307964

Title: Infinite Dimensional Lasso
Author(s): Nathan Srebro*+, Saharon Rosset, Ji Zhu, and Grzegorz Swirszcz
Companies: Toyota Technological Institute at Chicago; IBM T.J. Watson Research Center; University of Michigan; IBM T.J. Watson Research Center
Address: 1427 East 60th Street, Chicago, IL 60637
Keywords: Lasso; Kernel methods; Sparsity

Abstract:
We describe a practical methodology for fitting $\ell_1$-regularized prediction models in very high, or even infinite, dimensional feature spaces. It is based on extensions of path-following methods for $\ell_1$-regularized models such as the lasso, the 1-norm SVM, and other modeling problems. We show that the sparsity property of the lasso holds in infinite dimensions as well, and we discuss learning performance. We illustrate our approach on the problem of fitting additive regression splines subject to a total variation penalty (equivalent to an $\ell_1$ penalty). The resulting method offers great flexibility in fitting additive models to data, and we demonstrate its performance on one simulation example and two real-life datasets.
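As a rough finite-dimensional illustration of the ideas in the abstract (not the authors' algorithm): over a dictionary of truncated-linear "hinge" basis functions, the $\ell_1$ norm of the coefficients equals the total variation of the fitted piecewise-linear function's derivative, and standard path-following (LARS) traces the entire regularization path. The fixed knot grid, the synthetic data, and the use of scikit-learn's lars_path are all illustrative assumptions; the paper's setting replaces the finite grid with a continuum of candidate basis functions.

import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2.0 * np.pi * x) + 0.3 * rng.standard_normal(n)

# Dictionary of hinge functions max(0, x - t) at candidate knots t.
# For a piecewise-linear fit, the l1 norm of these coefficients equals
# the total variation of the fitted function's first derivative.
knots = np.linspace(0.0, 1.0, 50)
X = np.maximum(0.0, x[:, None] - knots[None, :])

# lars_path fits a no-intercept model, so center the data first.
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# Path-following (LARS) traces the whole l1 regularization path;
# each column of `coefs` is the coefficient vector at one breakpoint.
alphas, active, coefs = lars_path(Xc, yc, method="lasso")
print("path breakpoints:", alphas.size)
print("knots active at the least-regularized end:",
      int(np.sum(coefs[:, -1] != 0)), "of", knots.size)

Most knot coefficients stay exactly zero along the path, so the fitted additive spline uses only a sparse, adaptively selected set of knots; this is the finite-grid analogue of the sparsity property that the abstract extends to infinite dimensions.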