
All Times EDT

Abstract Details

Activity Number: 496 - Dimension Reduction
Type: Contributed
Date/Time: Thursday, August 6, 2020 : 10:00 AM to 2:00 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #313245
Title: A Supervised Framework for Linear Dimension Reduction Induced by Hypothesis Testing
Author(s): Kisung You* and Lizhen Lin
Companies: University of Notre Dame and University of Notre Dame
Keywords: dimension reduction; hypothesis testing; big data; multivariate analysis of variance; supervised learning

Dimension reduction is one of the most prominent problems in high-dimensional data analysis and visualization. Algorithms are usually designed to optimize cost functions that formalize certain criteria, such as maximizing the projected variance in the case of principal component analysis. We start by examining the cost function of Fisher’s linear discriminant analysis (LDA) and show that the objective is closely related to well-known statistics from multivariate analysis of variance (MANOVA). Motivated by this association, we propose a general framework for supervised linear dimension reduction that finds an optimal embedding or projection matrix carrying information from a class of hypothesis testing procedures. As a black-box optimizer for potentially non-differentiable and complex cost functions, simulated annealing (SA) is adapted to the Stiefel manifold of projection matrices. We also present extensions to multi-class scenarios, the merging of heterogeneous information from distinct procedures, and scalable computation via a median of subset projections on the Grassmann manifold. Simulation results show that the proposed method is competitive with existing methods.
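The abstract does not include code; the following is a hypothetical minimal sketch (not the authors' implementation) of the general idea: simulated annealing over the Stiefel manifold of orthonormal projection matrices, here maximizing an LDA-style trace-ratio objective built from between-class and within-class scatter matrices. The function names, the QR-based retraction, and the cooling schedule are all illustrative assumptions.

```python
# Hypothetical sketch: simulated annealing on the Stiefel manifold St(p, d)
# for a supervised, LDA-style trace-ratio objective. Not the authors' code.
import numpy as np

rng = np.random.default_rng(0)

def trace_ratio(P, Sb, Sw):
    """LDA-type objective: tr((P' Sw P)^{-1} (P' Sb P))."""
    W = P.T @ Sw @ P
    B = P.T @ Sb @ P
    return np.trace(np.linalg.solve(W, B))

def sa_stiefel(Sb, Sw, d, n_iter=2000, t0=1.0, sigma=0.1):
    """Black-box SA over orthonormal p-by-d projection matrices."""
    p = Sb.shape[0]
    # Random starting point on the Stiefel manifold via QR of a Gaussian matrix.
    P, _ = np.linalg.qr(rng.standard_normal((p, d)))
    f = trace_ratio(P, Sb, Sw)
    best_P, best_f = P, f
    for k in range(n_iter):
        T = t0 / (1.0 + k)  # simple cooling schedule (an assumption)
        # Perturb and retract back onto the manifold by re-orthonormalizing.
        Q, _ = np.linalg.qr(P + sigma * rng.standard_normal((p, d)))
        g = trace_ratio(Q, Sb, Sw)
        # Metropolis acceptance rule for a maximization problem.
        if g > f or rng.random() < np.exp((g - f) / T):
            P, f = Q, g
            if f > best_f:
                best_P, best_f = P, f
    return best_P, best_f
```

In the proposed framework, `trace_ratio` would be replaced by a cost derived from the chosen hypothesis testing procedure; the SA loop itself is agnostic to differentiability of the objective.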

Authors who are presenting talks have a * after their name.
