
All Times EDT

Abstract Details

Activity Number: 369 - Analysis of Random Objects
Type: Invited
Date/Time: Wednesday, August 10, 2022 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Computing
Abstract #320690
Title: Nonlinear Sufficient Dimension Reduction for Distributional Data
Author(s): Qi Zhang and Lingzhou Xue and Bing Li*
Companies: Penn State University; Penn State University and National Institute of Statistical Sciences; Penn State University
Keywords: Wasserstein distance; reproducing kernel Hilbert space; continuous embedding; universality; negative metric space; Generalized Sliced Inverse Regression
Abstract:

We introduce a novel framework for nonlinear sufficient dimension reduction where both the predictor and the response can be probability distributions. Such data are increasingly common in modern regression applications. We first construct a metric space based on the Wasserstein distance between distributions. In the special case where the distributions involved are one-dimensional, the Wasserstein space can be continuously embedded into a separable Hilbert space, a property that allows us to build a universal reproducing kernel on the metric space. Using this kernel, we then construct reproducing kernel Hilbert spaces (RKHS) for both the predictor and the response, whose members are nonlinear functions of the distributional data. We then employ recently developed nonlinear sufficient dimension reduction methods for RKHS to perform dimension reduction. We also extend the method to multivariate distributional data. The new method is applied to a data set involving fertility and mortality distributions.
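The core ingredient above, a kernel defined on a metric space of one-dimensional distributions via the Wasserstein distance, can be illustrated with a minimal sketch. This is not the authors' universal kernel or their dimension reduction procedure; it only shows, under assumed choices (empirical samples as the distributional data points, a Gaussian-type kernel with an untuned bandwidth `gamma`), how pairwise 1-Wasserstein distances between samples yield a kernel matrix that RKHS-based methods could operate on.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# Three toy "distributional observations", each an i.i.d. sample from a
# one-dimensional distribution (hypothetical data for illustration only).
samples = [rng.normal(0.0, 1.0, 200),
           rng.normal(0.5, 1.0, 200),
           rng.normal(5.0, 2.0, 200)]

n = len(samples)
D = np.zeros((n, n))  # pairwise 1-Wasserstein distances between samples
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = wasserstein_distance(samples[i], samples[j])

gamma = 0.5                    # bandwidth: an assumed, untuned choice
K = np.exp(-gamma * D ** 2)    # Gaussian-type kernel on the metric space
```

The kernel matrix `K` is symmetric with unit diagonal, and pairs of nearby distributions (the first two samples) receive larger kernel values than distant pairs, which is the behavior a kernel on the Wasserstein space should exhibit.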


Authors who are presenting talks have a * after their name.

Back to the full JSM 2022 program