Abstract:
|
LARS (Least Angle Regression) is an algorithm for solving the l1-penalized risk for linear models. When the number of covariates is large and interactions are of interest, the number of regression coefficients to be considered becomes prohibitively large. A typical remedy is to impose a hierarchical structure between the main effects and the interactions, such as weak or strong heredity. In this talk, we propose a modified LARS algorithm that works under such a hierarchical structure. Compared with other penalized methods, such as hierNet by Bien, Taylor and Tibshirani and SHIM by Choi, Li and Zhu, our algorithm is much faster and easy to parallelize, so it can be applied to huge data sets. We illustrate these advantages by applying our algorithm to a SNP data set.
|