Abstract:
|
The emergence of big data has led to so-called convergence complexity analysis, which is the study of how Markov chain Monte Carlo (MCMC) algorithms behave as the sample size and/or the number of parameters in the underlying data set increases. Traditionally, drift and minorization conditions have been used to establish geometric ergodicity for individual MCMC algorithms and to bound their geometric convergence rates, but recent work has shown that the resulting bounds can be overly conservative and perform poorly in high-dimensional settings. Using alternative methods based on Wasserstein distance and random mappings, we successfully analyze the geometric convergence rates of a family of random effects Gibbs samplers as the dimension grows.
|