
Abstract Details

Activity Number: 341 - SPEED: Classification and Data Science
Type: Contributed
Date/Time: Tuesday, July 31, 2018, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #329145 Presentation
Title: Classification via Product Conditional Density Estimates: Blending LDA and QDA
Author(s): Jiae Kim* and Steve MacEachern
Companies: The Ohio State University
Keywords: Density estimation; LDA; QDA; regression

Fisher's linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA) are traditional methods for classification. Both implicitly estimate the class-specific densities as multivariate normals. One way to estimate a multivariate normal density is as a product of normal-theory linear regressions. LDA corresponds to regressions that are parallel across classes with equal residual variances; QDA requires neither parallel regressions nor equal residual variances. We introduce a novel classifier that first orders the features, then fits LDA-style regressions for the initial features, followed by QDA-style regressions for the remaining features. The resulting class-specific densities share a common portion of their covariance matrices, corresponding to the initial features, but have different full covariance matrices. Technical details include how to order the features, when to switch from LDA to QDA, whether to use original or transformed features, and whether to model-average over different reconstructions of the class-specific densities. The performance of the new classifiers is investigated with simulated and real data.
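The construction described above can be sketched in code. The idea is the factorization p(x | class) = prod_j p(x_j | x_1, ..., x_{j-1}, class), where each conditional is a normal linear regression: for the first k features the regressions are parallel across classes with a common residual variance (the LDA portion), and for the remaining features each class gets its own coefficients and residual variance (the QDA portion). This is an illustrative reimplementation under those assumptions, not the authors' code; the feature order and switch point k are taken as given, and all names (`fit_hybrid`, `log_density`, `predict`) are my own.

```python
# Hybrid LDA/QDA classifier via product conditional density estimates.
# Illustrative sketch only: feature ordering and the LDA-to-QDA switch
# point k are assumed fixed; the abstract discusses choosing them.
import numpy as np

def _design(X):
    """Prepend an intercept column to a regressor matrix."""
    return np.hstack([np.ones((X.shape[0], 1)), X])

def fit_hybrid(X, y, k):
    """Fit class-conditional densities as products of regressions.

    Features 0..k-1 are LDA-style: class-specific intercepts, shared
    slopes, one common residual variance.  Features k.. are QDA-style:
    class-specific slopes and residual variances.
    """
    classes = np.unique(y)
    model = {"k": k, "classes": classes,
             "priors": {c: np.mean(y == c) for c in classes}, "regs": []}
    for j in range(X.shape[1]):
        prev, target = X[:, :j], X[:, j]
        if j < k:
            # Parallel regressions: one-hot class intercepts + shared slopes.
            onehot = np.eye(len(classes))[np.searchsorted(classes, y)]
            Z = np.hstack([onehot, prev])
            beta, *_ = np.linalg.lstsq(Z, target, rcond=None)
            resid = target - Z @ beta
            sigma2 = resid @ resid / len(target)  # common residual variance
            model["regs"].append(("lda", beta, sigma2))
        else:
            per_class = {}
            for c in classes:
                m = y == c
                Zc = _design(prev[m])
                beta, *_ = np.linalg.lstsq(Zc, target[m], rcond=None)
                resid = target[m] - Zc @ beta
                per_class[c] = (beta, resid @ resid / m.sum())
            model["regs"].append(("qda", per_class))
    return model

def log_density(model, x, c):
    """Class-conditional log density: sum of conditional normal log pdfs."""
    n_cls = len(model["classes"])
    ci = np.searchsorted(model["classes"], c)
    total = 0.0
    for j, reg in enumerate(model["regs"]):
        prev = x[:j]
        if reg[0] == "lda":
            _, beta, sigma2 = reg
            mu = beta[ci] + prev @ beta[n_cls:]
        else:
            beta, sigma2 = reg[1][c]
            mu = beta[0] + prev @ beta[1:]
        total += -0.5 * np.log(2 * np.pi * sigma2) - (x[j] - mu) ** 2 / (2 * sigma2)
    return total

def predict(model, X):
    """Assign each row to the class maximizing log density + log prior."""
    scores = np.array([[log_density(model, x, c) + np.log(model["priors"][c])
                        for c in model["classes"]] for x in X])
    return model["classes"][np.argmax(scores, axis=1)]
```

Setting k = 0 recovers a QDA-style fit and k = p an LDA-style fit, so the switch point interpolates between the two classical methods, which is the blending the title refers to.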

Authors who are presenting talks have a * after their name.
