
Abstract Details

Activity Number: 105 - Deep Learning and Statistical Modeling with Applications
Type: Invited
Date/Time: Monday, August 3, 2020, 1:00 PM to 2:50 PM (EDT)
Sponsor: International Chinese Statistical Association
Abstract #308091
Title: Beyond Shallow Learning: New Results for Matrix Completion
Author(s): Jianqing Fan* and Yuxin Chen and Cong Ma and Yulin Yan
Companies: Princeton University
Keywords: Matrix completion; nuclear norm penalization; matrix factorization; asymptotic normality; debiasing; optimality and efficiency

This talk concerns statistical rates of convergence and asymptotic normality for noisy low-rank matrix completion. One of the most popular paradigms for tackling this problem is convex relaxation, which achieves remarkable efficacy in practice. However, the theoretical support for this approach remains far from optimal in the noisy setting, falling short of explaining its empirical success. When the rank of the unknown matrix is a constant, we demonstrate that the convex programming approach achieves near-optimal estimation errors, in terms of the Euclidean loss, the entrywise loss, and the spectral norm loss, for a wide range of noise levels. We further establish asymptotic normality for the entries of the matrix and its associated low-rank factors. Remarkably, we unveil their strong oracle and adaptive properties. All of this is enabled by bridging convex relaxation with the nonconvex Burer--Monteiro approach, a seemingly distinct algorithmic paradigm that is provably robust against noise.
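To make the Burer--Monteiro paradigm mentioned above concrete, here is a minimal NumPy sketch: gradient descent over low-rank factors X and Y on the squared loss restricted to observed entries, started from a spectral initialization. All problem sizes, the sampling rate, the noise level, the step size, and the iteration count are illustrative assumptions for this sketch, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic noisy low-rank matrix completion instance
# (dimensions, rank, sampling rate, and noise level are illustrative).
n, r = 50, 2
U_star = rng.normal(size=(n, r))
V_star = rng.normal(size=(n, r))
M_star = U_star @ V_star.T                        # rank-r ground truth
mask = rng.random((n, n)) < 0.5                   # observed-entry pattern
M_obs = M_star + 0.01 * rng.normal(size=(n, n))   # noisy observations

# Spectral initialization: rescale the zero-filled observed matrix by the
# sampling rate and take its top-r SVD as the starting factors.
p = mask.mean()
U, s, Vt = np.linalg.svd(np.where(mask, M_obs, 0.0) / p)
X = U[:, :r] * np.sqrt(s[:r])
Y = Vt[:r].T * np.sqrt(s[:r])

# Burer--Monteiro gradient descent: minimize
# 0.5 * || P_Omega(X Y^T - M_obs) ||_F^2 over the factors X and Y.
eta = 0.5 / s[0]                                  # conservative step size
for _ in range(500):
    residual = np.where(mask, X @ Y.T - M_obs, 0.0)
    X, Y = X - eta * residual @ Y, Y - eta * residual.T @ X

rel_err = np.linalg.norm(X @ Y.T - M_star) / np.linalg.norm(M_star)
print(f"relative Frobenius error: {rel_err:.4f}")
```

On a benign instance like this one, the factored iterates recover the ground truth to within the noise level; the talk's point is that the solution of the convex relaxation is provably close to this nonconvex solution, which is what transfers the sharp statistical guarantees.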

Authors who are presenting talks have a * after their name.
