
All Times EDT

Abstract Details

Activity Number: 215 - Contributed Poster Presentations: Section on Statistical Learning and Data Science
Type: Contributed
Date/Time: Tuesday, August 4, 2020, 10:00 AM to 2:00 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #313546
Title: Deriving and Generalizing Kernel Linear Discriminant Analysis for Multiple Cases
Author(s): Jackson Maris*
Keywords: kernel function; objective function; linear discriminant; matrix; optimization; generalization

The mathematical background of Kernel Linear Discriminant Analysis (KLDA) is thoroughly derived and examined. We extend the mathematics of this dimension reduction technique to a general case, showing how it reduces the data to c-1 dimensions, where c is the number of classes, and how it separates the classes using a kernel function. To reach this goal of class separation, we prove several assumptions along the way, particularly those involving the properties of the matrices in the objective function used to obtain the resulting projections.
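As a rough illustration of the technique the abstract describes, the following is a minimal sketch of KLDA in NumPy (not the authors' derivation): it builds an RBF kernel matrix, forms between-class and within-class scatter matrices in the kernel-induced feature space, and solves the resulting generalized eigenproblem for the leading c-1 projection directions. The function names, the RBF kernel choice, and the ridge regularization of the within-class scatter are all assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel k(a, b) = exp(-gamma * ||a - b||^2); an assumed kernel choice.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def klda_fit(X, y, gamma=1.0, reg=1e-3):
    """Return dual coefficients A (n x (c-1)) for the KLDA projection."""
    classes = np.unique(y)
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    m_star = K.mean(axis=1)              # overall kernel mean vector
    M = np.zeros((n, n))                 # between-class scatter (dual form)
    N = reg * np.eye(n)                  # within-class scatter, ridge-regularized
    for c in classes:
        idx = np.where(y == c)[0]
        n_c = len(idx)
        K_c = K[:, idx]                  # n x n_c kernel block for class c
        m_c = K_c.mean(axis=1)           # class-c kernel mean vector
        diff = (m_c - m_star)[:, None]
        M += n_c * (diff @ diff.T)
        H = np.eye(n_c) - np.ones((n_c, n_c)) / n_c   # centering matrix
        N += K_c @ H @ K_c.T
    # Leading c-1 eigenvectors of N^{-1} M maximize the discriminant objective.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(N, M))
    order = np.argsort(-eigvals.real)
    return eigvecs[:, order[:len(classes) - 1]].real

def klda_transform(X_train, X_new, A, gamma=1.0):
    # Project new points onto the c-1 discriminant directions via the kernel trick.
    return rbf_kernel(X_new, X_train, gamma) @ A
```

With c = 3 classes the projection lands in c - 1 = 2 dimensions, matching the dimension-reduction claim in the abstract; the eigenvectors hold the dual coefficients, so projecting a new point only requires kernel evaluations against the training set.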

Authors who are presenting talks have a * after their name.

Back to the full JSM 2020 program