
Abstract Details

Activity Number: 461 - SPEED: Machine Learning
Type: Contributed
Date/Time: Wednesday, August 2, 2017 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #324797
Title: The Geometry of Nonlinear Embeddings in Discriminant Analysis with Gaussian Kernels
Author(s): Jiae Kim* and Yoonkyung Lee
Companies: The Ohio State University and The Ohio State University
Keywords: Discriminant analysis; Gaussian kernel; Hermite polynomial
Abstract:

Fisher's linear discriminant analysis (LDA) is a classical method for classification, yet it captures only linear features. Kernel discriminant analysis (KDA), an extension of LDA, is known to alleviate this limitation through nonlinear feature mapping. We study the geometry of nonlinear embeddings for KDA with Gaussian kernels by identifying the theoretical discriminant function given the data distribution. To obtain this theoretical discriminant function, we solve a generalized eigenvalue problem involving between-class and within-class variation operators. For an explicit description of the discriminant function, we use a particular representation of Gaussian kernels based on the exponential generating function for Hermite polynomials. Our results illuminate how the data distribution and the bandwidth parameter interact in determining the nonlinear embedding, and provide a guideline for choosing the bandwidth parameter.
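The abstract's generalized eigenvalue problem has a standard empirical counterpart: with a finite sample, the between-class and within-class variation operators become kernel-space matrices, and the discriminant direction is the leading generalized eigenvector. The following is a minimal sketch of that sample version (not the paper's population-level analysis); the function names `gaussian_kernel` and `kda_fit` and the ridge term `reg` are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.linalg import eigh

def gaussian_kernel(X, Y, bandwidth):
    # K[i, j] = exp(-||x_i - y_j||^2 / (2 * bandwidth^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def kda_fit(X, y, bandwidth, reg=1e-6):
    """Empirical KDA: solve M a = lambda (N + reg*I) a in kernel coordinates."""
    n = len(y)
    K = gaussian_kernel(X, X, bandwidth)
    m_all = K.mean(axis=1)              # overall mean in kernel coordinates
    M = np.zeros((n, n))                # between-class variation matrix
    N = np.zeros((n, n))                # within-class variation matrix
    for c in np.unique(y):
        idx = (y == c)
        n_c = idx.sum()
        m_c = K[:, idx].mean(axis=1)    # class-c mean in kernel coordinates
        d = (m_c - m_all)[:, None]
        M += n_c * (d @ d.T)
        H = np.eye(n_c) - np.ones((n_c, n_c)) / n_c   # centering matrix
        N += K[:, idx] @ H @ K[:, idx].T
    # small ridge keeps the within-class matrix positive definite
    evals, evecs = eigh(M, N + reg * np.eye(n))
    return evecs[:, -1], K              # leading discriminant direction

# toy usage: two Gaussian blobs in the plane
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(3.0, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha, K = kda_fit(X, y, bandwidth=1.0)
scores = K @ alpha                      # nonlinear embedding of the sample
```

The `bandwidth` argument is the parameter whose interplay with the data distribution the abstract analyzes; varying it here changes how sharply the embedding separates the two blobs.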


Authors who are presenting talks have a * after their name.

Back to the full JSM 2017 program

Copyright © American Statistical Association