Abstract:
|
In this work, we consider the problem of estimating a low-rank matrix or low Tucker rank tensor from a small number of noisy linear measurements. A Riemannian Gauss-Newton (RGN) method is proposed for low-rank matrix/tensor estimation on the low-rank manifold. We derive the geometric objects required to run RGN and show that it can be implemented efficiently. In contrast to the generic (super)linear convergence of RGN, we prove the first quadratic convergence guarantee of RGN for both low-rank matrix and tensor estimation under some mild conditions. A deterministic estimation error lower bound, which matches the upper bound, is provided to demonstrate the optimality of RGN. The merit of RGN is illustrated through applications in machine learning and statistics, including matrix/tensor regression and tensor PCA/SVD. Finally, we provide simulation results to corroborate our theoretical findings.
|