Abstract:
|
One of the bottlenecks in scaling Bayesian models to big data and high dimensions is how to sample from complex posterior distributions. Standard MCMC algorithms often suffer from slow or poor mixing due to the computational burden of sampling the posterior with big data. This lecture will provide an overview of various methods and algorithms for scalable Bayesian inference, with a particular focus on MCMC-based approaches. I will start by covering some basics of Bayesian inference, MCMC algorithms, and the complexity theory of mixing times, and then discuss divide-and-conquer (parallel) approaches, MCMC with approximate kernels, geometric MCMC, and related methods.
|