Abstract: |
This talk is about statistical rates of convergence and asymptotic normality for noisy low-rank matrix completion. One of the most popular paradigms for tackling this problem is convex relaxation, which achieves remarkable efficacy in practice. However, the theoretical support for this approach remains far from optimal in the noisy setting, falling short of explaining its empirical success. When the rank of the unknown matrix is a constant, we demonstrate that the convex programming approach achieves near-optimal estimation errors, measured in the Euclidean loss, the entrywise loss, and the spectral norm loss, for a wide range of noise levels. We further establish asymptotic normality for the entries of the matrix and its associated low-rank factors. Remarkably, we unveil their strong oracle and adaptive properties. All of these results are enabled by bridging convex relaxation with the nonconvex Burer--Monteiro approach, a seemingly distinct algorithmic paradigm that is provably robust against noise.
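For concreteness, the two formulations in play can be sketched in standard notation (illustrative, not taken from the talk itself): write $M$ for the noisy observations of an unknown rank-$r$ matrix on an index set $\Omega$, $\mathcal{P}_\Omega$ for the projection onto the observed entries, and $\lambda > 0$ for a regularization parameter. Then

  % Convex relaxation: nuclear-norm-penalized least squares
  \min_{Z \in \mathbb{R}^{n_1 \times n_2}} \; \tfrac{1}{2} \bigl\| \mathcal{P}_\Omega(Z - M) \bigr\|_F^2 + \lambda \|Z\|_{*}

  % Nonconvex Burer--Monteiro approach: optimize directly over low-rank factors
  \min_{X \in \mathbb{R}^{n_1 \times r},\, Y \in \mathbb{R}^{n_2 \times r}} \; \tfrac{1}{2} \bigl\| \mathcal{P}_\Omega(X Y^\top - M) \bigr\|_F^2

The first program is convex but a priori returns a full matrix; the second parameterizes the low-rank structure explicitly at the cost of nonconvexity, and bridging the two is the technical device the abstract refers to.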