Abstract Details

Activity Number: 319 - SLDS CSpeed 6
Type: Contributed
Date/Time: Wednesday, August 11, 2021, 3:30 PM to 5:20 PM EDT
Sponsor: Section on Statistical Learning and Data Science
Abstract #318197
Title: Low-Rank Matrix/Tensor Estimation via Riemannian Gauss-Newton: Statistical Optimality and Second-order Convergence
Author(s): Wen Huang and Xudong Li and Anru Zhang and Yuetian Luo*
Affiliations: Xiamen University and Fudan University and University of Wisconsin-Madison and University of Wisconsin-Madison
Keywords: Low-rank matrix/tensor estimation; Riemannian optimization; quadratic convergence; statistical optimality
Abstract:

In this work, we consider the problem of estimating a low-rank matrix or a low Tucker rank tensor from a small number of noisy linear measurements. We propose a Riemannian Gauss-Newton (RGN) method for low-rank matrix/tensor estimation on the low-rank manifold. We derive the geometric objects needed to run RGN and show that it can be implemented efficiently. In contrast to the generic (super)linear convergence of RGN, we prove the first quadratic convergence guarantees for RGN in both low-rank matrix and low-rank tensor estimation under mild conditions. We also provide a deterministic estimation error lower bound that matches the upper bound, demonstrating the statistical optimality of RGN. The merits of RGN are illustrated through applications in machine learning and statistics, including matrix/tensor regression and tensor PCA/SVD. Finally, we present simulation results that corroborate our theoretical findings.
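To make the idea concrete, below is a minimal NumPy sketch of an RGN-style loop for the matrix case only, recovering a rank-r matrix X from measurements y_i = <A_i, X> + noise. It assumes Gaussian measurement matrices, a spectral initialization, a redundant tangent-space parametrization Z = U R^T + L V^T, and retraction by truncated SVD; the function name rgn_low_rank_matrix and all details are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def rgn_low_rank_matrix(y, A, r, iters=15):
        """Sketch of an RGN loop for rank-r recovery from y_i = <A_i, X> + noise.

        y : (m,) measurements; A : (m, n1, n2) stack of measurement matrices.
        """
        m, n1, n2 = A.shape
        # Spectral initialization: truncated SVD of the backprojection (1/m) sum_i y_i A_i.
        X0 = np.tensordot(y, A, axes=(0, 0)) / m
        U, _, Vt = np.linalg.svd(X0, full_matrices=False)
        U, V = U[:, :r], Vt[:r].T
        X = None
        for _ in range(iters):
            # Gauss-Newton step: since the measurement operator is linear, the step
            # reduces to a least-squares fit over the tangent space of the rank-r
            # manifold, parametrized (redundantly) as Z = U R^T + L V^T with
            # R (n2 x r) and L (n1 x r). Row i of the design is
            # [vec(A_i^T U), vec(A_i V)], because <A_i, U R^T> = <A_i^T U, R>
            # and <A_i, L V^T> = <A_i V, L>.
            D1 = np.einsum('mij,ik->mjk', A, U).reshape(m, -1)
            D2 = np.einsum('mij,jk->mik', A, V).reshape(m, -1)
            theta, *_ = np.linalg.lstsq(np.hstack([D1, D2]), y, rcond=None)
            R = theta[:n2 * r].reshape(n2, r)
            L = theta[n2 * r:].reshape(n1, r)
            Z = U @ R.T + L @ V.T
            # Retraction: project Z back onto the rank-r manifold by truncated SVD.
            Uz, sz, Vzt = np.linalg.svd(Z, full_matrices=False)
            U, V = Uz[:, :r], Vzt[:r].T
            X = U @ (sz[:r, None] * Vzt[:r])
        return X

    # Toy usage: recover a 30 x 30 rank-2 matrix from 600 noisy Gaussian measurements.
    rng = np.random.default_rng(0)
    n1, n2, r, m = 30, 30, 2, 600
    X_star = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
    A = rng.standard_normal((m, n1, n2))
    y = np.einsum('mij,ij->m', A, X_star) + 0.01 * rng.standard_normal(m)
    X_hat = rgn_low_rank_matrix(y, A, r)
    print(np.linalg.norm(X_hat - X_star) / np.linalg.norm(X_star))

Each iteration solves one small least-squares problem (here with n1*r + n2*r unknowns) and one truncated SVD, which is what makes the per-step cost of such a scheme modest relative to full second-order methods.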


Authors who are presenting talks have a * after their name.
