Abstract:
We present the basics of a new sampling algorithm, Hamiltonian Sequential Monte Carlo (HSMC), which combines ideas from Hamiltonian Monte Carlo and Sequential Monte Carlo, allowing us to move from an initial, easy-to-sample-from distribution to the distribution of interest via a sequence of intermediate distributions. The algorithm produces a sample from the desired distribution, as well as an estimate of the ratio of the normalizing constants of the final and initial distributions. We show that, for a particular choice of transition kernels, the HSMC algorithm achieves a lower mean squared error for the estimate of this ratio than other standard algorithms; the improvement is obtained through a bias-variance trade-off. We discuss some properties of the new algorithm and present simulation results for a couple of toy examples, as well as for a 20-dimensional linear regression, where we estimate the Bayes factor for two competing models.
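The general scheme the abstract describes (tempering from an easy-to-sample initial distribution toward the target through intermediate distributions, moving particles with Hamiltonian dynamics, and accumulating the normalizing-constant ratio from the incremental weights) can be sketched as below. This is a minimal illustrative sketch, not the paper's algorithm or examples: the one-dimensional Gaussian densities, the linear temperature schedule, and all tuning parameters (step size, number of leapfrog steps, particle count) are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem (not from the paper): move from a standard normal,
# with Z0 = sqrt(2*pi), to N(3, 0.5^2), with Z1 = 0.5*sqrt(2*pi), using
# unnormalized log-densities, so the true ratio Z1/Z0 equals 0.5.
def log_p0(x): return -0.5 * x**2
def log_p1(x): return -0.5 * ((x - 3.0) / 0.5) ** 2

def log_pt(x, beta):   # geometric bridge between initial and target
    return (1 - beta) * log_p0(x) + beta * log_p1(x)

def grad_log_pt(x, beta):
    return (1 - beta) * (-x) + beta * (-(x - 3.0) / 0.25)

def hmc_move(x, beta, eps=0.1, n_leap=10):
    """One leapfrog HMC update targeting the tempered density at beta."""
    p = rng.standard_normal(x.shape)
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_pt(x_new, beta)
    for _ in range(n_leap - 1):
        x_new += eps * p_new
        p_new += eps * grad_log_pt(x_new, beta)
    x_new += eps * p_new
    p_new += 0.5 * eps * grad_log_pt(x_new, beta)
    # Metropolis accept/reject on the change in total energy
    log_acc = (log_pt(x_new, beta) - log_pt(x, beta)
               + 0.5 * (p**2 - p_new**2))
    accept = np.log(rng.uniform(size=x.shape)) < log_acc
    return np.where(accept, x_new, x)

def hsmc(n_particles=2000, n_temps=20):
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = rng.standard_normal(n_particles)   # exact draws from the initial p0
    log_ratio = 0.0                        # accumulates log(Z1/Z0)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw = log_pt(x, b) - log_pt(x, b_prev)   # incremental weights
        log_ratio += np.log(np.mean(np.exp(logw)))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        x = x[rng.choice(n_particles, n_particles, p=w)]  # resample
        x = hmc_move(x, b)                 # rejuvenate at the new temperature
    return x, np.exp(log_ratio)
```

With this setup, `hsmc()` returns both an approximate sample from the target and an estimate of the normalizing-constant ratio, mirroring the two outputs the abstract attributes to HSMC; the estimate should land near the known value 0.5 for this toy pair of Gaussians.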
Copyright © American Statistical Association.