Abstract Details

Activity Number: 314
Type: Contributed
Date/Time: Tuesday, August 2, 2016, 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #319141
Title: Hierarchical Sparse Modeling: A Choice of Two Regularizers
Author(s): Xiaohan Yan* and Jacob Bien
Companies: Cornell University and Cornell University
Keywords: hierarchical sparsity; convex regularization; group lasso; latent overlapping group lasso

Demanding sparsity in estimated models has become a routine practice in statistics. Hierarchical sparse modeling (HSM) refers to situations in which the attained sparsity patterns obey the constraint that one set of parameters is set to zero whenever another is set to zero. In recent years, numerous papers have developed convex regularizers for this form of sparsity structure, arising in areas including interaction modeling, time series, and covariance estimation. In this paper, we observe that these methods fall into two frameworks, the group lasso and the latent overlapping group lasso, which have not been systematically compared in the context of HSM. The purpose of this paper is to provide a side-by-side comparison of these two frameworks for HSM in terms of their statistical properties and computational efficiency. We call attention to a problem with the group lasso framework and provide new insights into the latent overlapping group lasso, which can greatly improve its computational performance. Finally, we compare the two methods in the context of covariance estimation, where we introduce a new sparsely-banded estimator, which we show achieves the statistical advantages of an existing method but is simpler to compute.
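The group lasso route to this kind of hierarchy can be illustrated with the simplest case of two nested groups: penalizing both the full parameter block and a child sub-block means the child can be zeroed alone, but the parent can only be zeroed together with the child, so the child is zero whenever the parent is. The sketch below (not code from the paper; the function names, the two-coordinate layout, and λ = 0.5 are purely illustrative) applies blockwise soft-thresholding to the child block and then to the full group, which for nested groups produces exactly these hierarchical sparsity patterns:

```python
import numpy as np

def block_soft_threshold(v, lam):
    """Shrink the vector v toward zero as a block: set it entirely to
    zero if its Euclidean norm is at most lam, otherwise scale it down."""
    nrm = np.linalg.norm(v)
    if nrm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / nrm) * v

def nested_group_prox(theta, lam, child):
    """Illustrative proximal step for the nested-group penalty
    lam * (||theta|| + ||theta[child]||), where `child` indexes a
    sub-block: threshold the child block first, then the full group."""
    out = np.asarray(theta, dtype=float).copy()
    out[child] = block_soft_threshold(out[child], lam)
    return block_soft_threshold(out, lam)

# theta = (parent, child); the child block is coordinate 1.
# Large parent, small child: the child is zeroed while the parent survives.
print(nested_group_prox(np.array([2.0, 0.3]), 0.5, [1]))
# Both entries small: the whole group is zeroed at once.
print(nested_group_prox(np.array([0.2, 0.3]), 0.5, [1]))
```

Because the parent is only ever zeroed through the full-group threshold, the pattern "parent zero, child nonzero" can never arise, which is precisely the hierarchical constraint described above.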

Authors who are presenting talks have a * after their name.


Copyright © American Statistical Association