Activity Number: 298
Type: Topic Contributed
Date/Time: Tuesday, August 2, 2016, 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #319781
Title: Principal Quantile Regression for Sufficient Dimension Reduction with Heteroscedasticity
Author(s): Chong Wang* and Yichao Wu and Seung Jun Shin
Companies: North Carolina State University and North Carolina State University and Korea University
Keywords: heteroscedasticity; kernel quantile regression; principal quantile regression; sufficient dimension reduction
Abstract:
Sufficient dimension reduction (SDR) aims to reduce data dimensionality without imposing stringent model assumptions while preserving the useful information contained in the data. It has recently received much attention in a variety of applications. In practice, data often display heteroscedasticity, which is of scientific importance in general but frequently overlooked, since the primary goal of most existing statistical methods is to identify conditional mean relationships among variables. In this article, we propose a new SDR method called principal quantile regression (PQR). PQR efficiently tackles heteroscedasticity and hence shows improved performance when the data indeed possess heteroscedasticity. PQR extends naturally to a nonlinear version via the kernel trick. Asymptotic properties are established, and an efficient solution-path-based algorithm is provided. Numerical examples on both simulated and real data demonstrate PQR's advantage over existing SDR methods, and PQR remains very competitive even when the data exhibit no heteroscedasticity.
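For intuition only, the linear backbone of this idea (fit quantile regressions at several levels, then extract principal directions from the resulting slope vectors) can be sketched in a few lines. The simulated model, the quantile grid, and the LP formulation of quantile regression below are illustrative assumptions, not the authors' algorithm or settings:

```python
import numpy as np
from scipy.optimize import linprog

def quantile_slope(X, y, tau):
    """Linear quantile regression at level tau, posed as the standard LP:
    minimize sum(tau*u + (1-tau)*v) s.t. y = X@beta + b0 + u - v, u, v >= 0."""
    n, p = X.shape
    # variables: [beta (p), intercept b0, u (n), v (n)]
    c = np.concatenate([np.zeros(p + 1), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.ones((n, 1)), np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * (p + 1) + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]  # slope vector; intercept dropped

rng = np.random.default_rng(0)
n, p = 400, 5
X = rng.standard_normal((n, p))
b = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)  # assumed true SDR direction
u = X @ b
y = u + 0.5 * np.abs(u) * rng.standard_normal(n)  # heteroscedastic noise scale

# collect slopes across several quantile levels and take the leading
# eigenvector of the sum of their outer products as the estimated direction
taus = [0.1, 0.25, 0.5, 0.75, 0.9]
M = sum(np.outer(s, s) for s in (quantile_slope(X, y, t) for t in taus))
direction = np.linalg.eigh(M)[1][:, -1]
print(np.round(np.abs(direction @ b), 2))  # alignment with the true direction
```

In this toy model the median slope tracks the conditional mean, while slopes at outer quantile levels also respond to the direction driving the noise scale, which is why combining multiple quantile levels helps under heteroscedasticity.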
Authors who are presenting talks have a * after their name.