
Abstract Details

Activity Number: 247 - Sufficient Dimension Reduction and High-Dimensional Data
Type: Contributed
Date/Time: Monday, July 29, 2019 : 2:00 PM to 3:50 PM
Sponsor: Section on Nonparametric Statistics
Abstract #305076 Presentation
Title: Metropolized Knockoff Sampling
Author(s): Wenshuo Wang* and Stephen Bates and Emmanuel Candes and Lucas Janson
Companies: Harvard University and Stanford University and Stanford University and Harvard University
Keywords: False discovery rate; Metropolis-Hastings; Markov chains; junction trees; treewidth; Ising model

Model-X knockoffs is a wrapper that transforms essentially any variable importance measure into a variable selection algorithm which discovers true effects while rigorously controlling the expected fraction of false positives. A frequently discussed challenge in applying this method is the construction of knockoff variables: synthetic variables that obey a crucial exchangeability property with the explanatory variables under study. This paper introduces techniques for knockoff generation in great generality: we provide a sequential characterization of all possible knockoff distributions, which leads to an MCMC formulation of an exact knockoff sampler. We further show how to exploit conditional independence structure to speed up computations. Combining these two threads, we introduce an explicit set of powerful MCMC algorithms, including multiple-try Metropolis, and empirically demonstrate their effectiveness. The techniques we develop are sufficiently rich to enable knockoff sampling in challenging settings, including cases where the covariates are continuous and heavy-tailed and where they follow an Ising model.
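The abstract summarizes the MCMC formulation only at a high level. As background, here is a minimal sketch of the generic random-walk Metropolis-Hastings accept/reject step that such samplers build on; the function name, the step-size parameter, and the toy standard-normal target are illustrative assumptions for this sketch, not the authors' knockoff algorithm.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis-Hastings sampler (illustrative sketch).

    log_target: log-density of the target, up to an additive constant.
    Returns the chain of sampled states and the acceptance rate.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    n_accept = 0
    for _ in range(n_steps):
        # Symmetric Gaussian proposal, so the Hastings ratio reduces
        # to the ratio of target densities.
        proposal = x + rng.gauss(0.0, step_size)
        log_ratio = log_target(proposal) - log_target(x)
        # Accept with probability min(1, exp(log_ratio)).
        if rng.random() < math.exp(min(0.0, log_ratio)):
            x = proposal
            n_accept += 1
        samples.append(x)
    return samples, n_accept / n_steps

# Toy usage: sample from a standard normal target.
samples, accept_rate = metropolis_hastings(
    log_target=lambda x: -0.5 * x * x, x0=0.0, n_steps=20000
)
```

The accept/reject step is what makes the chain leave the target distribution exactly invariant; the paper's contribution lies in formulating knockoff generation itself as such a chain and in using conditional independence structure (junction trees) to make each step tractable.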

Authors who are presenting talks have a * after their name.
