JSM Preliminary Online Program
This is the preliminary program for the 2007 Joint Statistical Meetings in Salt Lake City, Utah.

The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.



Activity Number: 4
Type: Invited
Date/Time: Sunday, July 29, 2007 : 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Abstract - #307808
Title: The Adaptive Lasso and Its Oracle Properties
Author(s): Hui Zou*+
Companies: The University of Minnesota
Address: 362 Ford Hall, Minneapolis, MN 55455
Keywords: Adaptive Lasso ; Oracle Properties ; Model Selection
Abstract:

The lasso is famous for its ability to automatically produce a sparse subset model, and the LARS algorithm further facilitates its application in practice. The lasso now seems to have become the default choice for building a sparse model. In this talk, we first give a necessary condition for lasso variable selection to be consistent and present examples in which the lasso is inconsistent for variable selection. We then propose a new version of the lasso, called the adaptive lasso, in which adaptive weights are used to penalize different coefficients in the lasso penalty. We show that the adaptive lasso is consistent in variable selection and, in addition, possesses the oracle properties.
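As a rough illustration of the weighting idea described in the abstract (not code from the talk), the sketch below fits an adaptive lasso by computing a pilot estimate, forming weights w_j = 1/|beta_init_j|^gamma, and then solving an ordinary lasso on a column-rescaled design so that each coefficient is penalized by its own weight. The scikit-learn calls, the simulated data, and the tuning values alpha and gamma are illustrative assumptions.

    import numpy as np
    from sklearn.linear_model import LinearRegression, Lasso

    rng = np.random.default_rng(0)
    n, p = 200, 10
    X = rng.normal(size=(n, p))
    beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0])
    y = X @ beta_true + rng.normal(size=n)

    # Pilot estimate (OLS here; a ridge fit is another common choice).
    beta_init = LinearRegression(fit_intercept=False).fit(X, y).coef_

    # Adaptive weights: coefficients with small pilot estimates get heavier penalties.
    gamma = 1.0
    w = 1.0 / (np.abs(beta_init) ** gamma + 1e-8)

    # Weighted lasso via column rescaling: fit an ordinary lasso on X / w,
    # then map the coefficients back so each |beta_j| is penalized by w_j.
    lasso = Lasso(alpha=0.1, fit_intercept=False).fit(X / w, y)
    beta_adaptive = lasso.coef_ / w

    print("selected variables:", np.flatnonzero(beta_adaptive != 0))

The rescaling trick works because fitting coefficients c on X / w is equivalent to fitting beta = c / w on X, while the lasso's penalty on |c_j| becomes w_j |beta_j|, which is the adaptive lasso penalty.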


  • The address information is for the authors who have a + after their name.
  • Authors who are presenting talks have a * after their name.
