
Abstract Details

Activity Number: 403 - Sufficient Dimension Reduction and Variable Selection for High-Dimensional Inference
Type: Topic Contributed
Date/Time: Wednesday, August 5, 2020 : 1:00 PM to 2:50 PM
Sponsor: Section on Nonparametric Statistics
Abstract #312236
Title: On Sufficient Dimension Reduction for Functional Data via Weak Conditional Moments
Author(s): Jun Song* and Bing Li
Companies: UNC Charlotte and Pennsylvania State University
Keywords: sufficient dimension reduction; functional data; weak conditional moment; Carleman operator; dimension reduction; reproducing kernel Hilbert space

A general theory and estimation methods for functional linear sufficient dimension reduction are developed, where both the predictor and the response can be random functions, or even vectors of functions. Unlike existing dimension reduction methods, our approach does not rely on estimating the conditional mean and conditional variance. Instead, it is based on a new statistical construction, the weak conditional expectation, which is built on Carleman operators and their inducing functions. The weak conditional expectation generalizes the conditional expectation. Its key advantage is that it replaces the projection onto an L2-space, which defines the ordinary conditional expectation, with projection onto an arbitrary Hilbert space, while still maintaining the unbiasedness of the related dimension reduction methods. This flexibility is particularly important for functional data, because attempting to estimate a full-fledged conditional mean or conditional variance by slicing or smoothing over the space of vector-valued functions may be inefficient due to the curse of dimensionality. We evaluate the performance of our new methods by simulation and in several applied settings.
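As background for the projection viewpoint mentioned in the abstract: the ordinary conditional expectation E[Y|X] is the L2-projection of Y onto the space of square-integrable functions of X, i.e., it minimizes E[(Y - g(X))^2] over all such g. The following toy simulation is only a minimal numerical illustration of that fact, not the authors' method; the generating model and all names in it are hypothetical.

```python
import numpy as np

# Toy illustration: E[Y|X] is the L2-projection of Y onto functions of X,
# so it attains the smallest mean squared error among all g(X).
rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = x**2 + rng.normal(scale=0.5, size=n)  # by construction, E[Y|X=x] = x^2

# Projection onto *all* functions of X: use the true conditional mean.
mse_cond_mean = np.mean((y - x**2) ** 2)

# Projection onto the smaller subspace of *linear* functions of X.
beta = np.polyfit(x, y, 1)
mse_linear = np.mean((y - np.polyval(beta, x)) ** 2)

# The full projection can only do at least as well as any restricted one.
print(mse_cond_mean < mse_linear)
```

Restricting the projection to a subspace (here, linear functions) can only increase the squared error; the weak conditional expectation described in the abstract exploits the analogous freedom to project onto a Hilbert space other than the full L2-space.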

Authors who are presenting talks have a * after their name.