All Times EDT

Abstract Details

Activity Number: 215 - Contributed Poster Presentations: Section on Statistical Learning and Data Science
Type: Contributed
Date/Time: Tuesday, August 4, 2020 : 10:00 AM to 2:00 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #312819
Title: Robust Matrix Estimations Meet Frank-Wolfe Algorithms
Author(s): Naimin Jing* and Cheng Yong Tang and Ethan Fang
Companies: Temple University and Temple University and Penn State University
Keywords: Frank-Wolfe algorithms; Huber loss; matrix-valued parameter; non-asymptotic properties; non-smooth criterion function; robust statistical methods

We consider matrix-valued parameter estimation with a dedicated focus on robustness. Our setting involves large-scale structured data, so regularization is indispensable. Although robust loss functions are expected to be effective for achieving robust estimation, their practical implementation is known to be difficult because of their non-smoothness. To meet these challenges, we develop an efficient scheme for solving such problems with Frank-Wolfe algorithms. Frank-Wolfe algorithms require only the first-order derivative of the criterion function and no projection step, so they are advantageous for handling problems with non-smooth robust loss functions. Our framework is broad: it accommodates a wide range of robust loss functions in conjunction with penalty functions in the context of matrix estimation problems. We establish non-asymptotic error bounds for matrix estimation with the Huber loss and a nuclear norm penalty. Our theory demonstrates the merits of using a robust loss by showing that good properties are achieved even under heavy-tailed distributions. We also illustrate the promising performance of our methods with extensive numerical examples and data analysis.
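As a rough illustration of the approach sketched in the abstract (not the authors' implementation), the snippet below runs a plain Frank-Wolfe iteration over a nuclear-norm ball, minimizing the Huber loss on observed entries of a partially observed matrix with a gross outlier. The projection-free step uses only the top singular pair of the gradient; the radius `tau`, threshold `delta`, iteration count, and step-size rule are all illustrative assumptions.

```python
import numpy as np

def huber_grad(r, delta):
    # Gradient of the Huber loss: identity near zero, clipped beyond delta,
    # which caps the influence of heavy-tailed residuals.
    return np.clip(r, -delta, delta)

def frank_wolfe_huber(Y, mask, tau, delta=1.0, n_iter=200):
    """Frank-Wolfe over the nuclear-norm ball {X : ||X||_* <= tau},
    minimizing the Huber loss on the observed entries of Y."""
    X = np.zeros_like(Y)
    for k in range(n_iter):
        G = mask * huber_grad(X - Y, delta)        # first-order information only
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        S = -tau * np.outer(U[:, 0], Vt[0, :])     # linear minimization oracle (no projection)
        gamma = 2.0 / (k + 2)                      # standard Frank-Wolfe step size
        X = (1 - gamma) * X + gamma * S
    return X

# Toy example: rank-1 signal, 80% observed, one heavy outlier.
rng = np.random.default_rng(0)
u, v = rng.normal(size=(20, 1)), rng.normal(size=(15, 1))
Y = u @ v.T
Y[0, 0] += 50.0                                    # gross corruption
mask = (rng.random(Y.shape) < 0.8).astype(float)
tau = np.linalg.norm(u) * np.linalg.norm(v)        # nuclear norm of the true rank-1 matrix
X_hat = frank_wolfe_huber(Y, mask, tau)
```

Because each iterate is a convex combination of points inside the nuclear-norm ball, the constraint holds at every step without any projection, which is the feature that makes Frank-Wolfe attractive for this non-smooth robust setting.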

Authors who are presenting talks have a * after their name.