Abstract:
|
We present a thorough derivation and examination of the mathematical foundations of Kernel Linear Discriminant Analysis (KLDA). We extend the mathematics of this dimensionality reduction technique to the general case, showing how it projects the data down to c - 1 dimensions, where c is the number of classes, and how it separates the classes through a kernel function. Along the way, we prove several intermediate results needed to reach this goal of class separation, particularly concerning the properties of the matrices that appear in the objective function determining the resulting projections.
|