Activity Number: 434 - SPEED: Classification and Data Science
Type: Contributed
Date/Time: Tuesday, July 31, 2018: 2:00 PM to 2:45 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #332702

Title: Classification via Product Conditional Density Estimates: Blending LDA and QDA
Author(s): Jiae Kim* and Steve MacEachern
Companies: The Ohio State University
Keywords: Density estimation; LDA; QDA; regression

Abstract:
Fisher's linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA) are traditional methods for classification. These methods implicitly estimate the class-specific densities as multivariate normals. One way to estimate a multivariate normal density is as a product of normal-theory linear regressions. LDA corresponds to regressions that are parallel across classes with equal residual variances; QDA requires neither parallel regressions nor equal residual variances. We introduce a novel classifier that blends the two. We first order the features, then fit LDA-style regressions for the initial features, followed by QDA-style regressions for the remaining features. The resulting class-specific densities share a common portion of their covariance matrices, corresponding to the initial features, but have different full covariance matrices. Technical details include how to order the features, when to switch from LDA to QDA, whether to use original or transformed features, and whether to model-average over different reconstructions of the class-specific densities. The performance of the new classifier is investigated with simulated and real data.
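The product-of-regressions decomposition and the LDA-to-QDA switch described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the feature order is fixed in advance and the switch index k is given, and the function names are ours. Each class-conditional density is built as a product of 1-D conditional normals; features before index k use a pooled (LDA-style) slope and residual variance with class-specific intercepts, and features from k on use fully class-specific (QDA-style) regressions.

```python
import numpy as np

def fit_hybrid(X, y, k):
    """Fit class densities as products of 1-D regressions: feature j is
    regressed on features 0..j-1. Features j < k are LDA-style (pooled
    slope and residual variance); features j >= k are QDA-style."""
    classes = np.unique(y)
    d = X.shape[1]
    mu = {c: X[y == c].mean(axis=0) for c in classes}  # class means
    models = []  # per feature: {class: (slope, intercept, variance)}
    for j in range(d):
        per_class = {}
        if j < k:
            # LDA-style: center within each class, then fit one pooled slope
            Zs, rs = [], []
            for c in classes:
                Xc = X[y == c]
                Zs.append(Xc[:, :j] - mu[c][:j])
                rs.append(Xc[:, j] - mu[c][j])
            Z, r = np.vstack(Zs), np.concatenate(rs)
            if j == 0:
                beta, resid = np.zeros(0), r
            else:
                beta, *_ = np.linalg.lstsq(Z, r, rcond=None)
                resid = r - Z @ beta
            var = resid.var()  # pooled residual variance
            for c in classes:
                per_class[c] = (beta, mu[c][j] - beta @ mu[c][:j], var)
        else:
            # QDA-style: separate slope, intercept, and variance per class
            for c in classes:
                Xc = X[y == c]
                Z = np.column_stack([np.ones(len(Xc)), Xc[:, :j]])
                coef, *_ = np.linalg.lstsq(Z, Xc[:, j], rcond=None)
                resid = Xc[:, j] - Z @ coef
                per_class[c] = (coef[1:], coef[0], resid.var())
        models.append(per_class)
    priors = {c: np.mean(y == c) for c in classes}
    return models, priors, classes

def predict_hybrid(X, models, priors, classes):
    """Classify by the largest class log-density plus log prior."""
    scores = np.zeros((len(X), len(classes)))
    for i, c in enumerate(classes):
        ll = np.full(len(X), np.log(priors[c]))
        for j, per_class in enumerate(models):
            beta, b0, var = per_class[c]
            m = b0 + X[:, :j] @ beta  # conditional mean of feature j
            ll += -0.5 * np.log(2 * np.pi * var) - (X[:, j] - m) ** 2 / (2 * var)
        scores[:, i] = ll
    return classes[np.argmax(scores, axis=1)]
```

Setting k = d recovers an LDA-like rule and k = 0 a QDA-like rule; intermediate k gives the blend, with the pooled regressions for the initial features stabilizing the estimates when training data are limited.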
Authors who are presenting talks have a * after their name.