
Abstract Details

Activity Number: 278 - Emerging Ideas in Predictive Inference
Type: Invited
Date/Time: Tuesday, July 30, 2019 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #300292
Title: Relaxing the Assumptions of Model-X Knockoffs
Author(s): Lucas Janson* and Dongming Huang
Companies: Harvard University and Harvard University
Keywords: Model-X; Knockoffs; High-Dimensional; Variable Selection; False Discovery Rate; Topological Measure
Abstract:

The recent paper by Candès et al. (2018) introduced model-X knockoffs, a method for variable selection that provably and non-asymptotically controls the false discovery rate with no restrictions or assumptions on the dimensionality of the data or the conditional distribution of the response given the covariates. The one requirement of the procedure is that the covariate samples be drawn independently and identically from a precisely known (but arbitrary) distribution. We show that the exact same guarantees can be made without knowing the covariate distribution fully, but instead knowing it only up to a parametric model with on the order of np parameters, where p is the dimension and n is the total number of covariate samples (which may exceed the usual sample size of labeled samples when unlabeled samples are also available). The key is to treat the covariates as if they are drawn conditionally on the observed value of a sufficient statistic of the model. Although this idea is simple, even in Gaussian models conditioning on a sufficient statistic leads to a distribution supported on a set of zero Lebesgue measure, requiring techniques from topological measure theory.
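For context on what is being relaxed, the sketch below illustrates the original model-X setting that this work generalizes: when the covariate distribution N(mu, Sigma) is fully and exactly known, Gaussian knockoffs can be sampled via the conditional-Gaussian construction of Candès et al. (2018). This is a minimal illustrative sketch, not the paper's conditional (sufficient-statistic) construction; the dimensions, the AR(1) covariance, and the equicorrelated choice of s in the usage snippet are hypothetical.

import numpy as np

def gaussian_knockoffs(X, mu, Sigma, s, rng=None):
    """Sample Gaussian model-X knockoffs given X ~ N(mu, Sigma) with Sigma known exactly.

    Conditional-Gaussian construction of Candes et al. (2018):
        X_tilde | X ~ N(X - (X - mu) @ inv(Sigma) @ D,  2D - D @ inv(Sigma) @ D),
    where D = diag(s), s > 0, and 2*Sigma - D must be positive definite.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    D = np.diag(s)
    Sigma_inv = np.linalg.inv(Sigma)
    cond_mean = X - (X - mu) @ Sigma_inv @ D  # n x p matrix of conditional means
    cond_cov = 2 * D - D @ Sigma_inv @ D      # conditional covariance, shared by all rows
    L = np.linalg.cholesky(cond_cov)          # fails if s is chosen too large
    return cond_mean + rng.standard_normal((n, p)) @ L.T

# Hypothetical usage: equicorrelated s for a known AR(1) covariance.
p, n = 50, 200
rho = 0.3
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
mu = np.zeros(p)
lam_min = np.linalg.eigvalsh(Sigma).min()
s = np.full(p, 0.999 * min(1.0, 2 * lam_min))  # keeps 2D - D Sigma^{-1} D positive definite
rng = np.random.default_rng(0)
X = rng.multivariate_normal(mu, Sigma, size=n)
X_knock = gaussian_knockoffs(X, mu, Sigma, s, rng=rng)

The abstract's contribution, by contrast, drops the requirement that the distribution (here, Sigma) be known exactly: knowing it only up to a parametric model suffices, with knockoffs sampled conditionally on the observed value of a sufficient statistic.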


Authors who are presenting talks have a * after their name.
