Abstract Details

Activity Number: 416 - SLDS CSpeed 7
Type: Contributed
Date/Time: Thursday, August 12, 2021, 2:00 PM to 3:50 PM (EDT)
Sponsor: Section on Statistical Learning and Data Science
Abstract #317975
Title: Efficient Designs of SLOPE Penalty Sequences in Finite Dimension
Author(s): Yiliang Zhang* and Zhiqi Bu
Companies: University of Pennsylvania and University of Pennsylvania
Keywords: high-dimensional linear model; SLOPE; Approximate Message Passing
Abstract:

In linear regression, SLOPE is a relatively new convex optimization method that generalizes the Lasso via the sorted $\ell_1$ penalty: larger fitted coefficients are penalized more heavily. This magnitude-dependent regularization requires a penalty sequence $\boldsymbol{\lambda}$ as input, rather than the single scalar penalty of the Lasso, which makes designing the penalty computationally expensive. In this paper, we propose two efficient algorithms to design the possibly high-dimensional SLOPE penalty so as to minimize the mean squared error. For Gaussian data matrices, we propose a first-order Projected Gradient Descent (PGD) under the Approximate Message Passing regime. For general data matrices, we present a zeroth-order Coordinate Descent (CD) to design a sub-class of SLOPE, referred to as $k$-level SLOPE. Our CD allows a useful trade-off between accuracy and computation speed. We demonstrate the performance of SLOPE with our designs via extensive experiments on synthetic data and real-world datasets.
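To make the sorted $\ell_1$ penalty and the $k$-level restriction concrete, the sketch below evaluates the penalty $\sum_i \lambda_i |\beta|_{(i)}$ (magnitudes sorted in decreasing order against a nonincreasing $\boldsymbol{\lambda}$) and builds a penalty sequence with only $k$ distinct values. The function names and the equal-block split are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sorted_l1_penalty(beta, lam):
    """Sorted l1 penalty: sum_i lam[i] * |beta|_(i), where the coefficient
    magnitudes |beta|_(1) >= ... >= |beta|_(p) are matched against a
    nonincreasing lam, so larger coefficients receive larger penalties."""
    mags = np.sort(np.abs(beta))[::-1]       # magnitudes, decreasing
    return float(np.dot(np.sort(lam)[::-1], mags))

def k_level_sequence(p, levels):
    """Illustrative k-level SLOPE sequence: p penalty values drawn from
    only k distinct levels (here assigned to k nearly equal blocks)."""
    levels = sorted(levels, reverse=True)    # enforce nonincreasing order
    lam = np.empty(p)
    for lvl, idx in zip(levels, np.array_split(np.arange(p), len(levels))):
        lam[idx] = lvl
    return lam

# Example: a 2-level SLOPE penalty on a 6-dimensional coefficient vector.
beta = np.array([0.1, -2.0, 0.5, 1.5, 0.0, -0.3])
lam = k_level_sequence(len(beta), levels=[1.0, 0.5])  # k = 2 distinct values
print(sorted_l1_penalty(beta, lam))  # -> 4.2
```

With a single distinct level, the sequence reduces to the Lasso penalty; increasing $k$ toward $p$ recovers full SLOPE, which is the accuracy-versus-speed trade-off the $k$-level design exploits.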


Authors who are presenting talks have a * after their name.
