Abstract:
|
We consider estimating the singular vectors of a low-rank $n\times d$ matrix corrupted by noise, with $n,d\to\infty$. When the distribution of the singular vector entries is known, SVD can be suboptimal. In such cases, previous work characterized the optimal estimation error in the proportional asymptotics $n/d\to\delta\in(0,\infty)$. Here we consider $n/d\to\{0,\infty\}$ and reveal regimes in which recovery is possible for one-sided singular vectors while impossible for the other. From an algorithmic perspective, we introduce a class of algorithms termed OGFOM, which includes classical first-order algorithms such as gradient descent and its variants. We prove a sharp characterization of the optimal error achieved by any OGFOM. Next, we provide a tight lower bound on the estimation error of one-sided singular vectors achieved by any estimator, by computing the exact asymptotics of a suitable free energy functional. We show that, in terms of recovering one-sided singular vectors, our model is equivalent to a symmetric signal-plus-noise model. As an application, we consider the high-dimensional Gaussian mixture model and locate the exact minimum separation threshold for partial recovery with a target accuracy.
|