Online Program

Friday, October 19
Fri, Oct 19, 11:45 AM - 1:15 PM
Caprice 3-4
Speed Session 3

The Geometry of Nonlinear Embeddings in Kernel Discriminant Analysis (305013)

*Jiae Kim, The Ohio State University 

Fisher's linear discriminant analysis is a classical method for classification, yet it is limited to capturing only linear features. Kernel discriminant analysis (KDA) extends it and is known to alleviate this limitation through nonlinear feature mapping. We study the geometry of nonlinear embeddings for KDA with Gaussian kernels by identifying the theoretical discriminant function given the data distribution. To obtain the theoretical discriminant function, we solve a generalized eigenvalue problem with between-class and within-class variation operators. For an explicit description of the discriminant function, we use a particular representation of Gaussian kernels based on the exponential generating function for Hermite polynomials. Our results illuminate how the data distribution and the bandwidth parameter interact in determining the nonlinear embedding, and provide a guideline for choosing the bandwidth parameter.
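
The abstract points to the exponential generating function for Hermite polynomials as the route to an explicit kernel representation. As a reminder of how such a representation can arise (a sketch only, not the paper's exact derivation, and stated for a unit-bandwidth kernel), the standard generating-function identity gives

$$
e^{2xt - t^{2}} \;=\; \sum_{n=0}^{\infty} H_n(x)\,\frac{t^{n}}{n!},
\qquad\text{hence}\qquad
e^{-(x-y)^{2}} \;=\; e^{-x^{2}}\sum_{n=0}^{\infty} H_n(x)\,\frac{y^{n}}{n!},
$$

where $H_n$ denotes the (physicists') Hermite polynomials; a general bandwidth $\sigma$ enters by rescaling $x$ and $y$ before applying the identity.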
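For readers unfamiliar with the sample-level version of the generalized eigenvalue problem mentioned above, the sketch below shows a minimal Gaussian-kernel KDA in Python. It is illustrative only and not the authors' implementation; the function names, the regularization constant, and the choice of returning dual coefficients are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def gaussian_kernel(X, Y, bandwidth):
    # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))
    return np.exp(-cdist(X, Y, "sqeuclidean") / (2.0 * bandwidth ** 2))


def kda_fit(X, y, bandwidth, n_components=1, reg=1e-6):
    """Gaussian-kernel discriminant analysis (illustrative sketch).

    Solves the generalized eigenvalue problem  M a = lambda (N + reg*I) a,
    where M and N are the empirical between-class and within-class
    variation matrices expressed in the kernel (dual) representation.
    """
    y = np.asarray(y)
    n = X.shape[0]
    K = gaussian_kernel(X, X, bandwidth)              # n x n Gram matrix

    m_all = K.mean(axis=1, keepdims=True)             # overall kernel mean
    M = np.zeros((n, n))                              # between-class variation
    N = np.zeros((n, n))                              # within-class variation
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        n_c = len(idx)
        K_c = K[:, idx]                               # columns of class c
        m_c = K_c.mean(axis=1, keepdims=True)         # class-c kernel mean
        d = m_c - m_all
        M += n_c * (d @ d.T)
        H = np.eye(n_c) - np.ones((n_c, n_c)) / n_c   # within-class centering
        N += K_c @ H @ K_c.T

    # Symmetric generalized eigenproblem; regularize N for stability.
    evals, evecs = eigh(M, N + reg * np.eye(n))
    order = np.argsort(evals)[::-1][:n_components]
    return evecs[:, order]                            # dual coefficients alpha


def kda_transform(X_new, alpha, X_train, bandwidth):
    # Discriminant scores f(x) = sum_i alpha_i k(x_i, x)
    return gaussian_kernel(X_new, X_train, bandwidth) @ alpha
```

The bandwidth argument in this sketch is the quantity whose role in the nonlinear embedding the talk analyzes: varying it changes the Gram matrix and hence the eigenvectors that define the discriminant directions.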