
Abstract Details

Activity Number: 488 - Hypothesis Testing When Signals Are Rare and Weak
Type: Invited
Date/Time: Wednesday, August 2, 2017, 10:30 AM to 12:20 PM
Sponsor: Section on Risk Analysis
Abstract #325417
Title: Optimal variable selection and noisy adaptive compressed sensing
Author(s): Mohamed Ndaoud* and A.B. Tsybakov
Companies: CREST-ENSAE, École Polytechnique and CREST-ENSAE
Keywords: Compressed sensing ; SLOPE estimator ; exact recovery ; Hamming loss ; variable selection under sparsity ; non-asymptotic minimax risk
Abstract:

We consider variable selection (VS) based on $n$ observations from a linear regression model. The unknown parameter of the model is assumed to belong to the class $V$ of all $s$-sparse vectors in $R^p$ whose non-zero components are greater than $a > 0$. Variable selection in this context is an extensively studied problem, yet little is known theoretically beyond consistency of selection. For Gaussian design, which is important in the context of compressed sensing, necessary and sufficient conditions of consistency are available for some configurations of $n, p, s, a$. They are achieved by the exhaustive search decoder, which is not realizable in polynomial time and requires knowledge of $s$. This talk studies optimality in VS under the Hamming risk criterion, with the benchmark behavior characterized by the minimax risk on the class $V$. For Gaussian design, we propose an adaptive algorithm, independent of $s$, $a$, and the noise level, that nearly attains the value of the minimax risk. This algorithm is the first method that is both realizable in polynomial time and consistent under the same (minimal) sufficient conditions as the exhaustive search decoder.
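As a rough illustration of the setting only (not the adaptive procedure proposed in the talk), the sketch below simulates a Gaussian design with an $s$-sparse parameter whose non-zero entries have magnitude at least $a$, applies a naive marginal thresholding selector, and reports its Hamming loss. All parameter values, the threshold, and the selector itself are hypothetical choices for illustration.

# Minimal sketch (not the authors' algorithm): simulate the setting described
# above -- Gaussian design, s-sparse signal with non-zero entries of magnitude
# at least a -- and evaluate a naive thresholding selector by its Hamming loss.
import numpy as np

rng = np.random.default_rng(0)
n, p, s, a, sigma = 200, 500, 10, 1.0, 1.0   # hypothetical problem sizes

# s-sparse parameter: non-zero components have magnitude >= a
theta = np.zeros(p)
support = rng.choice(p, size=s, replace=False)
theta[support] = a * rng.choice([-1.0, 1.0], size=s)

# Gaussian design and noisy observations
X = rng.standard_normal((n, p))
y = X @ theta + sigma * rng.standard_normal(n)

# Naive selector: threshold the marginal statistics |X^T y| / n
# (a simple stand-in, not the procedure discussed in the talk)
stats = np.abs(X.T @ y) / n
threshold = sigma * np.sqrt(2 * np.log(p) / n)   # universal-type threshold
selected = stats > threshold

# Hamming loss: number of false positives plus false negatives
true_support = np.zeros(p, dtype=bool)
true_support[support] = True
hamming_loss = int(np.sum(selected != true_support))
print("Hamming loss:", hamming_loss)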


Authors who are presenting talks have a * after their name.

