Abstract:
|
Fisher's linear discriminant analysis is a classical method for classification, but it can capture only linear features. Kernel discriminant analysis (KDA) extends it and is known to alleviate this limitation through nonlinear feature mapping. We study the geometry of the nonlinear embeddings for KDA with Gaussian kernels by identifying the theoretical discriminant function determined by the data distribution. To obtain the theoretical discriminant function, we solve a generalized eigenvalue problem involving between-class and within-class variation operators. For an explicit description of the discriminant function, we use a particular representation of Gaussian kernels based on the exponential generating function of Hermite polynomials. Our results illuminate how the data distribution and the bandwidth parameter interact in determining the nonlinear embedding, and provide a guideline for the choice of the bandwidth parameter.
|