The application of random matrix theory to observed data structures (e.g. graphs) has generated broad interest in eigenvalue concentration inequalities. Proofs of such inequalities often assume distinct eigenvalues, which guarantees the existence of eigengaps and permits the application of canonical results such as the Davis-Kahan theorem.
We address the concentration of eigenvalues in the finite-dimensional random dot product graph model. More specifically, we prove under mild assumptions that the eigenvalues of the adjacency spectral embedding concentrate around the true eigenvalues even when the true eigenvalues have multiplicity. Our result extends naturally to a more general finite-dimensional signal-plus-noise model, with explicit upper tail probability bounds.
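As an illustrative sketch (not taken from the paper), the following Python snippet simulates a random dot product graph whose edge-probability matrix has a repeated nonzero eigenvalue, so there is no eigengap between the top two eigenvalues, and checks empirically that the top adjacency eigenvalues still land close to the true repeated eigenvalue. All parameter choices (`n`, `c`, the latent-position layout) are arbitrary assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the paper).
n, c = 1000, 0.7

# Latent positions: half the nodes at c*e1, half at c*e2. The edge-probability
# matrix P = X X^T then has nonzero eigenvalue roughly c^2 * (n/2) with
# multiplicity 2, i.e. the top two true eigenvalues coincide.
X = np.zeros((n, 2))
X[: n // 2, 0] = c
X[n // 2 :, 1] = c

P = X @ X.T
np.fill_diagonal(P, 0)  # hollow matrix: no self-loops

# Sample a symmetric Bernoulli adjacency matrix A with E[A] = P off-diagonal.
U = rng.random((n, n))
A = (np.triu(U, 1) < np.triu(P, 1)).astype(float)
A = A + A.T

# Compare the top-2 eigenvalues of A with the repeated true eigenvalue of P.
eig_A = np.sort(np.linalg.eigvalsh(A))[-2:]
eig_P = np.sort(np.linalg.eigvalsh(P))[-2:]
rel_err = np.abs(eig_A - eig_P) / eig_P
print(rel_err)  # both relative errors are small and shrink as n grows
```

Despite the multiplicity in the spectrum of P, the top sample eigenvalues track the true eigenvalue at the usual O(n^{-1/2}) relative scale, which is the phenomenon the result above makes precise.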