Abstract Details

Activity Number: 506 - Advances in Multivariate Analysis for High-Dimensional, Complex Data Problems
Type: Topic Contributed
Date/Time: Wednesday, August 1, 2018, 10:30 AM to 12:20 PM
Sponsor: Korean International Statistical Society
Abstract #329533
Title: Supervised Dimensionality Reduction for Exponential Family Data
Author(s): Yoonkyung Lee* and Andrew Landgraf
Companies: Ohio State University and Battelle Memorial Institute
Keywords: dimension reduction; exponential family; generalized linear model; latent factors; PCA
Abstract:

Supervised dimensionality reduction techniques, such as partial least squares and supervised principal components, are powerful tools for making predictions with a large number of variables. The implicit squared-error terms in their objectives, however, make them less suitable for non-Gaussian data, whether in the covariates or the responses. Drawing on a connection between partial least squares and the Gaussian distribution, we show how partial least squares can be extended to other members of the exponential family, in a manner analogous to generalized linear models, for both the covariates and the responses. Unlike previous attempts, our extension yields latent variables that are easily interpretable as linear functions of the data and is computationally efficient. In particular, it requires no additional optimization for the scores of new observations, so predictions can be made in real time.
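
To make the stated Gaussian connection concrete, here is a brief sketch (the notation theta, b, c and the unit-variance Gaussian case are illustrative assumptions, not the authors' notation). In natural-parameter form, an exponential-family log-likelihood for an observation x is

\log f(x \mid \theta) = \theta x - b(\theta) + c(x),
\qquad
\text{Gaussian (unit variance): } b(\theta) = \tfrac{\theta^2}{2}
\;\Rightarrow\;
\log f(x \mid \theta) = -\tfrac{1}{2}(x - \theta)^2 + \tfrac{x^2}{2} + c(x).

In the Gaussian case, maximizing the likelihood over low-dimensional natural parameters therefore reduces to minimizing squared error, the implicit criterion behind partial least squares; for other family members, b changes and the squared-error objective no longer matches the likelihood, which is what motivates the generalized-linear-model-style extension described above.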


Authors who are presenting talks have a * after their name.
