Abstract:
|
We propose a new sampling algorithm combining two powerful ideas from the Markov chain Monte Carlo literature: the adaptive Metropolis sampler and the two-stage Metropolis-Hastings sampler. The proposed method is particularly useful for high-dimensional posterior sampling in Bayesian model calibration involving a computationally expensive forward model. In the first stage of the algorithm, a proposal is drawn adaptively based on the previously sampled states, and the corresponding acceptance probability is computed from an approximate posterior built on an inexpensive surrogate model. The expensive target posterior, which uses the true forward model, is evaluated in the second stage only if the proposal is accepted in the inexpensive first stage. While the adaptive nature of the algorithm yields faster convergence of the chain and good mixing, the two-stage approach rejects poor proposals in the inexpensive first stage, making the algorithm computationally efficient. Because the proposals depend on the previous states, the chain loses its Markov property; nevertheless, we prove that it retains the desired ergodicity property.
|
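The two-stage (delayed-acceptance) structure with an adaptive Gaussian proposal can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`log_post`, `log_surrogate`, `two_stage_adaptive_mh`) and the Haario-style covariance adaptation with scaling `2.38^2/d` are assumptions standing in for the details of the paper.

```python
import numpy as np

def two_stage_adaptive_mh(log_post, log_surrogate, x0, n_iter=5000,
                          adapt_start=500, eps=1e-6, seed=None):
    """Sketch of a two-stage (delayed-acceptance) Metropolis-Hastings sampler
    with a Haario-style adaptive Gaussian proposal.

    log_post      : expensive log target (true forward model)  [assumed name]
    log_surrogate : cheap approximate log target               [assumed name]
    """
    rng = np.random.default_rng(seed)
    d = len(x0)
    sd = 2.38**2 / d                       # classic adaptive-Metropolis scaling
    chain = np.empty((n_iter, d))
    x = np.asarray(x0, dtype=float)
    lp_x, ls_x = log_post(x), log_surrogate(x)
    cov = np.eye(d)
    n_expensive = 0                        # counts true-model evaluations
    for t in range(n_iter):
        if t >= adapt_start:               # adapt proposal from past states
            cov = sd * np.cov(chain[:t].T) + sd * eps * np.eye(d)
        y = rng.multivariate_normal(x, cov)
        ls_y = log_surrogate(y)
        # Stage 1: screen the proposal with the cheap surrogate posterior
        # (proposal is symmetric, so its densities cancel).
        if np.log(rng.random()) < ls_y - ls_x:
            lp_y = log_post(y)             # expensive model reached only here
            n_expensive += 1
            # Stage 2: correct with the true posterior so the chain
            # targets the exact posterior despite the surrogate screen.
            if np.log(rng.random()) < (lp_y - lp_x) + (ls_x - ls_y):
                x, lp_x, ls_x = y, lp_y, ls_y
        chain[t] = x
    return chain, n_expensive

# Toy usage: true target is a standard 2-D Gaussian; the surrogate is a
# deliberately miscalibrated, cheaper approximation of it.
def log_post(x):       return -0.5 * np.sum(x**2)
def log_surrogate(x):  return -0.55 * np.sum(x**2)

chain, n_exp = two_stage_adaptive_mh(log_post, log_surrogate,
                                     np.zeros(2), n_iter=4000, seed=0)
```

Note how the expensive `log_post` is called only for proposals that survive the surrogate screen, so `n_exp` is typically well below `n_iter`; the second-stage ratio cancels the surrogate error, preserving the exact target.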