An important question in statistical network analysis is how to construct and estimate models of dependent network data without sacrificing computational scalability or statistical guarantees. We demonstrate that scalable estimation of random graph models with dependent edges is possible by establishing the first consistency results and convergence rates for maximum pseudo-likelihood estimators of parameter vectors of increasing dimension, based on a single observation of dependent random variables. The main results cover models of dependent random variables with countable sample spaces and may be of independent interest. To showcase these results, we introduce a novel class of generalized beta-models with dependent edges and parameter vectors of increasing dimension, and we establish consistency results and convergence rates for maximum pseudo-likelihood estimators of these models in both dense- and sparse-graph settings.
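To fix ideas, the following is a minimal sketch of the maximum pseudo-likelihood estimator for a generic random graph model; the notation here (edge variables $X_{i,j}$ with observed values $x_{i,j}$, remaining edges $X_{-(i,j)}$, and parameter vector $\theta \in \mathbb{R}^n$) is generic and illustrative rather than the paper's own. The pseudo-loglikelihood replaces the joint likelihood by a sum of full conditional log-probabilities, one per edge variable:
\[
\ell_{\mathrm{PL}}(\theta)
\;=\; \sum_{1 \le i < j \le n} \log \mathbb{P}_{\theta}\!\big(X_{i,j} = x_{i,j} \,\big|\, X_{-(i,j)} = x_{-(i,j)}\big),
\qquad
\widehat{\theta}_{\mathrm{PL}} \;\in\; \operatorname*{arg\,max}_{\theta \in \mathbb{R}^n}\, \ell_{\mathrm{PL}}(\theta).
\]
Each summand conditions on the rest of the graph, so $\ell_{\mathrm{PL}}$ is computable even when edges are dependent and the joint likelihood is intractable. In the special case of the classical beta-model with independent edges, where $\operatorname{logit} \mathbb{P}_{\theta}(X_{i,j} = 1) = \theta_i + \theta_j$, the full conditionals coincide with the marginals and the pseudo-likelihood reduces to the likelihood.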