
Abstract Details

Activity Number: 496 - Dimension Reduction
Type: Contributed
Date/Time: Thursday, August 6, 2020, 10:00 AM to 2:00 PM EDT
Sponsor: Section on Statistical Learning and Data Science
Abstract #312798
Title: On Sufficient Dimension Reduction via Principal Asymmetric Least Squares
Author(s): Abdul-Nasah Soale* and Yuexiao Dong
Companies: Temple University and Temple University
Keywords: expectile regression; heteroscedasticity; distance correlation; nonlinear dimension reduction
Abstract:

In this paper, we introduce principal asymmetric least squares (PALS) as a unified framework for linear and nonlinear sufficient dimension reduction. Classical methods such as sliced inverse regression (Li, 1991) and principal support vector machines (Li, Artemiou and Li, 2011) may not perform well in the presence of heteroscedasticity; our proposal addresses this limitation by synthesizing information across different expectile levels. Through extensive numerical studies, we demonstrate the superior performance of PALS in terms of both computation time and estimation accuracy. For the asymptotic analysis of PALS for linear sufficient dimension reduction, we develop new tools to compute the derivative of an expectation of a non-Lipschitz function.
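For background, PALS builds on expectile regression, also known as asymmetric least squares. The display below is the standard expectile loss at level tau in (0, 1) (Newey and Powell, 1987), shown only as a sketch of the underlying loss; the specific PALS objective and the way the expectile levels are combined are developed in the paper itself.

% Standard expectile (asymmetric least squares) loss at level \tau;
% expectile regression at level \tau minimizes E[\rho_\tau\{Y - f(X)\}].
\[
\rho_\tau(r) =
\begin{cases}
\tau\, r^2, & r \ge 0,\\
(1-\tau)\, r^2, & r < 0,
\end{cases}
\qquad \tau \in (0,1).
\]

Choosing tau away from 1/2 weights positive and negative residuals asymmetrically, which is what allows expectile-based approaches to respond to heteroscedastic structure that ordinary least squares (the tau = 1/2 case) does not capture.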


Authors who are presenting talks have a * after their name.
