Conference Program

All Times ET

Thursday, June 9
Machine Learning
Advancements in Machine Learning
Thu, Jun 9, 1:15 PM - 2:45 PM
Cambria
A Convex-Nonconvex Strategy for Grouped Variable Selection (310088)

Presentation

Eric C. Chi, Rice University 
*Xiaoqian Liu, North Carolina State University 
Aaron J. Molstad, University of Florida 

Keywords: Sparse linear regression, Convex optimization, Convexity-preserving nonconvex penalization, High-dimensional data analysis.

This paper addresses the grouped variable selection problem. A widely used strategy is to augment the negative log-likelihood with a sparsity-promoting penalty; existing methods include the group Lasso, group SCAD, and group MCP. The group Lasso solves a convex optimization problem but suffers from underestimation bias. The group SCAD and group MCP avoid this bias but require solving a nonconvex optimization problem that may admit suboptimal local optima. In this work, we propose an alternative based on the generalized minimax concave (GMC) penalty, a folded concave penalty that preserves the convexity of the objective function. We develop a new method for grouped variable selection in linear regression, the group GMC, which generalizes the strategy of the original GMC estimator. We present an efficient algorithm for computing the group GMC estimator and prove properties of its solution path that guide numerical computation and tuning parameter selection in practice. We also establish error bounds for both the group GMC and the original GMC estimators. Simulation studies and a real data application indicate that the proposed group GMC approach outperforms existing methods in several respects across a wide range of scenarios.
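For orientation only (this display is not taken from the paper), the grouped penalized regression strategy described above can be written schematically as follows, assuming a Gaussian linear model so that the negative log-likelihood reduces to a least-squares term; the exact definition of the group GMC penalty is given in the paper.

    \hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p} \; \tfrac{1}{2}\,\|y - X\beta\|_2^2 \;+\; \lambda \sum_{g=1}^{G} p\big(\|\beta_g\|_2\big), \qquad \lambda > 0,

where \beta_1, \ldots, \beta_G are the coefficient groups and p(\cdot) is the group-level penalty: p(t) = t gives the group Lasso, while the group SCAD and group MCP use folded concave choices of p. In the original (ungrouped) GMC construction, the penalty is the \ell_1 norm minus a generalized Huber envelope,

    \psi_B(\beta) \;=\; \|\beta\|_1 \;-\; \min_{v}\Big\{ \|v\|_1 + \tfrac{1}{2}\,\|B(\beta - v)\|_2^2 \Big\},

with the matrix B chosen (e.g., so that B^\top B \preceq \tfrac{1}{\lambda} X^\top X) to keep the overall least-squares objective convex; the group GMC extends this idea to grouped coefficients.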