All Times EDT

Abstract Details

Activity Number: 306 - SPEED: SPAAC SESSION II
Type: Topic-Contributed
Date/Time: Wednesday, August 11, 2021 : 3:30 PM to 5:20 PM
Sponsor: Section on Statistical Computing
Abstract #318653
Title: Robust Online Linear Discriminant Analysis for Data with Outliers
Author(s): Soshi Kawarai* and Ippei Takasawa and Hiroshi Yadohisa
Companies: Doshisha University
Keywords: Online learning; Linear discriminant analysis; Averaged stochastic gradient descent; Median covariance matrix; Incremental learning
Abstract:

Online learning, which updates a model sequentially using only part of the data or newly added data, is useful for analyses that involve large datasets or require timely results. It has also been applied to discrimination tasks such as filtering spam emails and detecting faults in industrial products. Online linear discriminant analysis (OLDA) combines online learning with linear discriminant analysis (LDA). However, since both online learning and LDA are sensitive to outliers, there is a concern that their combination is especially vulnerable to them. To overcome this problem, we use a median instead of a mean in OLDA; the median, however, is difficult to obtain from only newly added data. We therefore estimate the median using averaged stochastic gradient descent (ASGD) and propose robust OLDA (ROLDA). ROLDA overcomes the outlier sensitivity of both online learning and LDA. Numerical simulation results show that the proposed method discriminates more accurately than LDA and OLDA on data with outliers.


Authors who are presenting talks have a * after their name.
