Activity Number: 546 - Foundational Issues in Machine Learning
Type: Topic Contributed
Date/Time: Thursday, August 6, 2020: 1:00 PM to 2:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #309646
Title: Joint Robust Multiple Inference on Large-Scale Multivariate Regression
Author(s): Wen Zhou* and Wenxin Zhou and Youngseok Song
Companies: Colorado State University and University of California, San Diego and Colorado State University
Keywords: General linear hypotheses; Heavy-tailed distribution; Huber loss; Large-scale multiple inference; Multivariate regression; Robustness
Abstract:
Large-scale multivariate regression with many heavy-tailed and skewed responses arises in a wide range of areas. Simultaneously testing a large number of general linear hypotheses, such as multiple contrasts, reveals a variety of associations between responses and regression or experimental factors. Traditional multiple testing methods often ignore the effects of heavy-tailedness and skewness in the data and impose a joint normality assumption that is arguably stringent in applications. This leads to unreliable conclusions due to the loss of control over the false discovery rate and compromised power. Using data-adaptive Huber regression, we propose a framework for joint robust inference on general linear hypotheses in large-scale multivariate regression. Under mild conditions, our method produces a consistent estimate of the false discovery proportion and controls the false discovery rate at a prespecified level. We employ a bias-corrected robust covariance estimator and study its exponential-type deviation inequality to provide theoretical guarantees for our method. Extensive numerical experiments demonstrate the gain in power of our method compared to widely used competitors.
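As a rough illustration of two building blocks the abstract names, the sketch below fits a single response by Huber-loss regression (via iteratively reweighted least squares) and applies the Benjamini-Hochberg step-up procedure as a generic stand-in for false discovery rate control. This is not the authors' method: the paper's data-adaptive choice of the robustification parameter, the bias-corrected covariance estimator, and the joint testing of general linear hypotheses are not reproduced here, and the fixed tau = 1.5 and all function names are illustrative assumptions.

```python
import numpy as np

def huber_regression(X, y, tau, n_iter=100):
    """Huber-loss linear regression via iteratively reweighted least squares.

    tau is the robustification parameter: residuals with |r| > tau are
    down-weighted by tau/|r|, so heavy-tailed noise has bounded influence.
    (The paper chooses tau adaptively from the data; here it is fixed.)
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS warm start
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.minimum(1.0, tau / np.maximum(np.abs(r), 1e-12))
        WX = X * w[:, None]                       # rows of X scaled by weights
        beta_new = np.linalg.solve(X.T @ WX, WX.T @ y)  # weighted least squares
        if np.max(np.abs(beta_new - beta)) < 1e-10:
            beta = beta_new
            break
        beta = beta_new
    return beta

def benjamini_hochberg(pvals, alpha=0.05):
    """BH step-up procedure: boolean rejection mask controlling FDR at alpha."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m      # i * alpha / m
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True                      # reject the k smallest p-values
    return reject

# Demo: heavy-tailed (t with 2 df) noise, where OLS is unreliable.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, 2.0, -1.5])
y = X @ beta_true + rng.standard_t(df=2, size=n)
beta_hat = huber_regression(X, y, tau=1.5)
```

The weight `min(1, tau/|r|)` is the standard IRLS reweighting for the Huber loss; the step-up index `k` is the largest rank whose ordered p-value sits below its BH threshold.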
Authors who are presenting talks have a * after their name.