Abstract Details

Activity Number: 545 - Towards Perfect and Scalable Distributional Computation
Type: Invited
Date/Time: Wednesday, July 31, 2019, 2:00 PM to 3:50 PM
Sponsor: IMS
Abstract #300117
Title: Fiducial Selector: Scalable Statistical Inference for High-Dimensional Regression Problems
Author(s): Thomas C. M. Lee* and Jan Hannig and Randy Lai and Chunzhe Zhang
Companies: UC Davis and UNC Chapel Hill and U of Maine and UC Davis
Keywords: confidence intervals; de-biasing; fast computations; uncertainty quantification
Abstract:

We consider statistical inference for high-dimensional regression. Our approach is based on Fisher's idea of fiducial inference and is composed of two main steps. Using Fisher's idea, in the first step we generate a so-called fiducial sample for the parameters of interest; this step can be made very fast by using an appropriate high-dimensional regression method. In the second step we apply a de-biasing operation to the sample generated in the first step. Our de-biasing formula has a closed-form expression, so this step is also very fast. The resulting fiducial sample is (approximately) unbiased and plays a role similar to that of a posterior sample in the Bayesian context: it can be used, for example, to form point estimates and construct confidence intervals. Since both steps are fast, our methodology scales straightforwardly to large data sets.
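The abstract does not give the specific regression method or the closed-form de-biasing formula, so the following is only a minimal schematic sketch of the two-step "sample, then de-bias" idea. It assumes the Lasso as the high-dimensional regression step, noise-perturbation of the fitted model to generate fiducial-style draws, and a debiased-Lasso-type one-step correction standing in for the authors' closed-form de-biasing formula; all of these choices are illustrative assumptions, not the authors' actual procedure.

```python
# Schematic sketch only: Lasso, noise-perturbation draws, and a
# debiased-Lasso-style correction are assumptions, not the authors' method.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated sparse regression problem: n observations, p >> n predictors.
n, p, s = 100, 500, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 2.0
sigma = 1.0
y = X @ beta_true + sigma * rng.standard_normal(n)

# Initial high-dimensional fit and a rough noise-level estimate.
lasso = Lasso(alpha=0.1).fit(X, y)
beta_hat = lasso.coef_
sigma_hat = np.std(y - X @ beta_hat, ddof=1)

# Ridge-regularized approximate inverse of the Gram matrix, playing the
# role of the precision-matrix estimate in a debiased-Lasso correction.
gram = X.T @ X / n
theta = np.linalg.inv(gram + 0.05 * np.eye(p))

def debias(beta, y_star):
    """Closed-form one-step correction (debiased-Lasso style)."""
    return beta + theta @ X.T @ (y_star - X @ beta) / n

# Step 1: generate fiducial-style draws by re-solving the model equation
# with simulated noise; Step 2: apply the closed-form de-biasing.
B = 200
draws = np.empty((B, p))
for b in range(B):
    y_star = X @ beta_hat + sigma_hat * rng.standard_normal(n)
    beta_b = Lasso(alpha=0.1).fit(X, y_star).coef_
    draws[b] = debias(beta_b, y_star)

# The de-biased draws act like a posterior sample: point estimates and
# approximate 95% confidence intervals for each coefficient.
point = draws.mean(axis=0)
lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
print("beta_1 estimate:", point[0], "95% interval:", (lo[0], hi[0]))
```

Because the per-draw refit and the closed-form correction are both cheap, the loop above parallelizes trivially across draws, which is the sense in which the two-step construction scales to large data sets.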

This is joint work with Jan Hannig, Randy Lai, and Chunzhe Zhang.


Authors who are presenting talks have a * after their name.
