Abstract:
|
A key bottleneck in any Bayesian analysis is computing the various quantities of interest with respect to the posterior. In this area, Markov chain Monte Carlo (MCMC) methods predominate, and among these the Gibbs sampler (GS) and Hamiltonian Monte Carlo (HMC) algorithms are the two most widely used. To employ them, Bayesian statisticians typically rely on off-the-shelf, general-purpose platforms (e.g., JAGS), often accessed through their preferred computing environment (e.g., R). A common question thus arises: which platform should be used? In this presentation we compare the strengths and weaknesses of various Bayesian computing platforms in the R environment. In particular, after briefly outlining the GS and HMC algorithms, we discuss five popular general-purpose implementations that use these methods: OpenBUGS, JAGS, Stan, NIMBLE, and greta. After presenting the basic steps required to use each, we describe the results of large-scale Monte Carlo simulations over many different modeling scenarios, benchmarking computation time and accuracy of posterior estimation.
|