
JSM 2012 Online Program


Abstract Details

Activity Number: 354
Type: Contributed
Date/Time: Tuesday, July 31, 2012 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Computing
Abstract - #306270
Title: Using Kullback-Leibler Divergence to Estimate the Regularized Discriminant Function
Author(s): John Beeson*+
Companies:
Address: 1226 Baylor Ave, Waco, TX, 76706, United States
Keywords: RDA; Kullback-Leibler; discriminant; covariance; learning; regularized
Abstract:

Regularized discriminant analysis (RDA), originally proposed by Friedman in 1989, is a well-researched topic. It allows for more accurate classification in the presence of high-dimensional, low-sample-size data. However, it has a shortcoming in the way it pools the covariance structures across classes when some classes have drastically different structures. We overcome this limitation by introducing a pre-processing step that allows only the most structurally similar covariance matrices to be pooled. The advantage of this approach, in contrast to other methods in the literature, is that it relies on optimization rather than cross-validation. Simulation results show that this method performs well when compared with previous methods.
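
The abstract does not give implementation details, but the following minimal sketch illustrates one plausible reading of the idea: estimate each class covariance, measure structural similarity with a symmetrized Kullback-Leibler divergence between zero-mean Gaussians, pool each class only with classes whose divergence falls below a threshold, and then apply Friedman-style RDA shrinkage toward that restricted pooled matrix. The threshold kl_threshold, the pairwise pooling rule, and the shrinkage parameters lam and gamma are illustrative assumptions, not the authors' stated procedure.

```python
# Sketch only: KL-screened pooling before Friedman-style RDA shrinkage.
# Assumes numpy arrays and nonsingular class covariance estimates for the KL step.
import numpy as np

def gaussian_kl(S0, S1):
    """Symmetrized KL divergence between zero-mean Gaussians N(0, S0) and N(0, S1)."""
    d = S0.shape[0]
    S0_inv, S1_inv = np.linalg.inv(S0), np.linalg.inv(S1)
    kl01 = 0.5 * (np.trace(S1_inv @ S0) - d + np.log(np.linalg.det(S1) / np.linalg.det(S0)))
    kl10 = 0.5 * (np.trace(S0_inv @ S1) - d + np.log(np.linalg.det(S0) / np.linalg.det(S1)))
    return kl01 + kl10

def rda_covariances(X, y, lam=0.5, gamma=0.1, kl_threshold=5.0):
    """Return one regularized covariance matrix per class, pooling each class only
    with classes whose covariance structure is within kl_threshold (symmetrized KL)."""
    classes = np.unique(y)
    covs = {k: np.cov(X[y == k].T) for k in classes}
    d = X.shape[1]
    reg = {}
    for k in classes:
        # Pre-processing step: keep only structurally similar classes in the pool.
        similar = [j for j in classes if gaussian_kl(covs[k], covs[j]) <= kl_threshold]
        counts = np.array([np.sum(y == j) for j in similar])
        pooled = sum(n * covs[j] for n, j in zip(counts, similar)) / counts.sum()
        # Friedman-style two-stage shrinkage: toward the restricted pooled matrix,
        # then toward a scaled identity.
        S = (1 - lam) * covs[k] + lam * pooled
        reg[k] = (1 - gamma) * S + gamma * (np.trace(S) / d) * np.eye(d)
    return reg
```

In Friedman's original formulation the shrinkage parameters are typically chosen by cross-validation; the abstract indicates the proposed method instead selects the pooling via optimization, which this sketch does not attempt to reproduce.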


The address information is for the authors who have a + after their name.
Authors who are presenting talks have a * after their name.
