Abstract Details

Activity Number: 244 - Advances in Statistical Machine Learning
Type: Contributed
Date/Time: Tuesday, August 9, 2022, 8:30 AM to 10:20 AM EDT
Sponsor: IMS
Abstract #323229
Title: Perturbation Analysis of Randomized SVD and Its Applications to High-Dimensional Statistics
Author(s): Yichi Zhang* and Minh Tang
Companies: North Carolina State University and North Carolina State University
Keywords: Randomized SVD; Power iteration; Perturbation analysis; Spectral method; Random Graph Inference
Abstract:

Randomized singular value decomposition (RSVD) is a class of computationally efficient randomized algorithms for computing the truncated SVD of large data matrices. In this paper, we study the statistical properties of RSVD under a general "signal-plus-noise" framework. We first derive upper bounds for the two-norm and two-to-infinity-norm distances between the approximate singular vectors of the observed matrix computed by RSVD and the true singular vectors of the signal matrix. A phase transition phenomenon is observed in which the perturbation error decreases as the number of power iterations used by RSVD increases. We then show that the thresholds at which the phase transition occurs are sharp whenever the traces of the noise matrices satisfy a certain growth condition. We illustrate our theoretical results by deriving nearly optimal performance guarantees for RSVD when applied to three statistical inference problems, namely community detection, matrix completion, and principal component analysis with missing data.
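For readers unfamiliar with the algorithm the abstract analyzes, the following is a minimal sketch of a generic randomized SVD with power iterations in the "signal-plus-noise" setting, in the spirit of the standard Halko-Martinsson-Tropp scheme. It is not the authors' implementation; the function name rsvd and the parameters rank, n_power_iter, and oversampling are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rsvd(A, rank, n_power_iter=2, oversampling=10, seed=None):
    """Approximate rank-`rank` SVD of A via a Gaussian sketch with power iterations.

    Illustrative sketch only; not the implementation studied in the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + oversampling, n)

    # Sketch the column space of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k))
    Y = A @ Omega

    # Power iterations sharpen the sketch when the singular values decay slowly;
    # re-orthonormalize each pass for numerical stability.
    for _ in range(n_power_iter):
        Q, _ = np.linalg.qr(Y)
        Q, _ = np.linalg.qr(A.T @ Q)
        Y = A @ Q
    Q, _ = np.linalg.qr(Y)

    # SVD of the small projected matrix yields the approximate factors.
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    U = Q @ U_small
    return U[:, :rank], s[:rank], Vt[:rank, :]


# Toy "signal-plus-noise" example: low-rank signal plus Gaussian noise.
rng = np.random.default_rng(0)
signal = rng.standard_normal((500, 3)) @ rng.standard_normal((3, 400))
M = signal + 0.1 * rng.standard_normal((500, 400))
U, s, Vt = rsvd(M, rank=3, n_power_iter=3)
print(s)  # leading singular values of the approximation
```

In this setting, increasing n_power_iter improves how closely the columns of U align with the singular vectors of the signal matrix, which is the perturbation behavior the abstract quantifies.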


Authors who are presenting talks have a * after their name.
