
Abstract Details

Activity Number: 304 - Statistical Learning: Dimension Reduction
Type: Contributed
Date/Time: Tuesday, August 1, 2017 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #322887
Title: Regularized Discriminant Analysis in Presence of Cellwise Contamination
Author(s): Stephanie Aerts* and Ines Wilms
Companies: University of Liège and KU Leuven
Keywords: Cellwise robust precision matrix; Classification; Discriminant analysis; Penalized estimation
Abstract:

Quadratic and Linear Discriminant Analysis (QDA/LDA) are the most widely applied classification rules under normality. In QDA, a separate covariance matrix is estimated for each group. If a group contains more variables than observations, the usual estimates are singular and can no longer be used. Assuming homoscedasticity, as in LDA, reduces the number of parameters to estimate, but this rather strong assumption is rarely verified in practice. Regularized discriminant techniques that are computable in high dimensions and cover the path between the two extremes, QDA and LDA, have been proposed in the literature. However, these procedures rely on sample covariance matrices and thus become inappropriate in the presence of cellwise outliers, a type of outlier that is very likely to occur in high-dimensional datasets. We propose cellwise robust counterparts of these regularized discriminant techniques by plugging in cellwise robust covariance matrices. Our methodology yields a family of discriminant methods that are robust against outlying cells, cover the gap between LDA and QDA, and are computable in high dimensions.
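A minimal sketch of the regularization path between QDA and LDA that the abstract refers to (a Friedman-style convex combination of per-group and pooled covariance matrices). This is an illustration only: it uses classical sample covariances, whereas the proposed method would plug in cellwise robust covariance estimates in their place; all function names and the parameter `lam` are illustrative, not the authors' notation.

```python
import numpy as np

def rda_covariances(X, y, lam):
    """Blend each group's covariance (lam=0: QDA) toward the pooled
    covariance (lam=1: LDA). Illustrative sketch: uses classical sample
    covariances; the cellwise robust method would substitute robust ones."""
    groups = np.unique(y)
    covs = {g: np.cov(X[y == g], rowvar=False) for g in groups}
    n = {g: (y == g).sum() for g in groups}
    pooled = sum((n[g] - 1) * covs[g] for g in groups) / (len(y) - len(groups))
    return {g: (1 - lam) * covs[g] + lam * pooled for g in groups}

def discriminant_scores(x, means, covs, priors):
    """Gaussian quadratic discriminant score for each group; classify x
    to the group with the largest score."""
    scores = {}
    for g, S in covs.items():
        d = x - means[g]
        _, logdet = np.linalg.slogdet(S)
        scores[g] = (-0.5 * logdet
                     - 0.5 * d @ np.linalg.solve(S, d)
                     + np.log(priors[g]))
    return scores
```

Setting `lam = 1` forces a common covariance matrix across groups (the LDA extreme), while `lam = 0` keeps separate matrices (the QDA extreme); intermediate values trace the regularization path.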


Authors who are presenting talks have a * after their name.

Back to the full JSM 2017 program

Copyright © American Statistical Association