Abstract:
|
Markov chain Monte Carlo methods for intractable likelihoods, such as the exchange algorithm, require simulations at every iteration of the Markov chain, incurring a computational cost that can be prohibitive for many practical applications. Surrogate models for the likelihood have been developed to accelerate inference algorithms and to make better use of parallel processing. In this talk, we propose a warped, gradient-enhanced Gaussian process surrogate model for the likelihood function, which jointly models the sample means and variances of the sufficient statistics and uses warping functions to capture covariance nonstationarity in the input parameter space. We show that accounting for nonstationarity and incorporating gradient information both yield a surrogate model that outperforms a stationary Gaussian process, particularly in regions where the likelihood function exhibits a phase transition. We further show that embedding the surrogate model in exact inference algorithms, such as importance sampling and delayed-acceptance MCMC, improves the effective sample size per unit time.
|
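To illustrate how a cheap surrogate can be embedded in an exact sampler, here is a minimal sketch of delayed-acceptance Metropolis-Hastings: the surrogate screens proposals in a first stage, and only proposals that survive trigger the expensive likelihood evaluation in a correcting second stage, so the chain still targets the exact posterior. This is not the authors' implementation; in the exchange-algorithm setting the "exact" stage would itself involve auxiliary-variable simulation, and `exact_loglik` and `surrogate_loglik` below are placeholder stand-ins (e.g., the surrogate would be a Gaussian process posterior mean in practice).

```python
# Minimal sketch of delayed-acceptance MCMC with a surrogate log-likelihood.
# All model functions are illustrative placeholders, not the talk's method.
import numpy as np

rng = np.random.default_rng(0)

def exact_loglik(theta):
    # Stand-in for an expensive likelihood (e.g. one requiring simulation).
    return -0.5 * np.sum(theta ** 2)

def surrogate_loglik(theta):
    # Stand-in for a cheap surrogate (e.g. a Gaussian process prediction).
    return -0.5 * np.sum(theta ** 2) + 0.01 * np.sin(theta).sum()

def log_prior(theta):
    # Broad Gaussian prior.
    return -0.5 * np.sum(theta ** 2) / 100.0

def delayed_acceptance_mcmc(theta0, n_iter=5000, step=0.5):
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    ll_exact = exact_loglik(theta) + log_prior(theta)
    ll_surr = surrogate_loglik(theta) + log_prior(theta)
    samples = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.shape)  # symmetric proposal
        prop_surr = surrogate_loglik(prop) + log_prior(prop)
        # Stage 1: screen the proposal using only the cheap surrogate.
        if np.log(rng.uniform()) < prop_surr - ll_surr:
            prop_exact = exact_loglik(prop) + log_prior(prop)
            # Stage 2: correct with the exact target so the chain remains exact.
            log_alpha2 = (prop_exact - ll_exact) - (prop_surr - ll_surr)
            if np.log(rng.uniform()) < log_alpha2:
                theta, ll_exact, ll_surr = prop, prop_exact, prop_surr
        samples.append(theta.copy())
    return np.array(samples)

if __name__ == "__main__":
    draws = delayed_acceptance_mcmc(theta0=np.zeros(2))
    print(draws.mean(axis=0))
```

The gain in effective sample size per unit time comes from the first stage rejecting most poor proposals without ever touching the expensive likelihood, while the second stage preserves the exact stationary distribution.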