Abstract:

We consider estimating the singular vectors of a low-rank $n \times d$ matrix corrupted by noise, with $n, d \to \infty$. When the distribution of the singular vector entries is known, the SVD can be suboptimal. In this setting, previous work characterized the optimal estimation error in the proportional asymptotics $n/d \to \delta \in (0,\infty)$. Here we consider $n/d \to \{0,\infty\}$, and reveal regimes in which recovery of one-sided singular vectors is possible while recovery of the other side is impossible. From an algorithmic perspective, we introduce a class of algorithms termed OGFOM, which includes classical first-order algorithms such as gradient descent and its variants. We prove a sharp characterization of the optimal error achieved by any OGFOM. Next, we provide a tight lower bound on the estimation error of one-sided singular vectors achieved by any estimator, obtained by computing the exact asymptotics of a suitable free energy functional. We show that, in terms of recovering one-sided singular vectors, our model is equivalent to a symmetric signal-plus-noise model. As an application, we consider the high-dimensional Gaussian mixture model and locate the exact minimum separation threshold for partial recovery with a target accuracy.
