Abstract:
|
Marginalizing over nuisance parameters can improve the convergence of MCMC, as can blocking correlated parameters. This is straightforward to visualize and reason about in settings like low-dimensional multivariate normal distributions, but it is difficult for mixture models because of the non-smooth way in which label assignments and cluster-specific parameters relate to one another. In general, however, the cluster assignments of observations close to one another will tend to be correlated, since their combined weight pulls a cluster's mean toward a shared center. One can exploit this correlation by sampling cluster assignments for a random block of observations, holding the cluster-specific parameters fixed for several iterations, and then performing an accept-reject step for each proposed block move. Such a procedure might be characterized as quasi-Gibbs, since each block proposal closely approximates the corresponding conditional of the target distribution. One can tailor the blocking to the context to push acceptance probabilities close to 1, and ensure ergodicity using only a weighted coin (e.g., occasionally mixing in a standard update). The approach also yields computational advantages.
|
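The abstract leaves the details of the move open; below is a minimal, hypothetical sketch of one way the blocked quasi-Gibbs update could be instantiated, for a toy univariate two-component Gaussian mixture with known variance and weights. All names (`label_logprobs`, `blocked_label_move`), the staleness schedule, the block size, and the weighted-coin fallback are illustrative assumptions, not the paper's implementation. Labels for a random block are proposed from their conditionals under cluster means held fixed ("stale") for several sweeps, and a Metropolis-Hastings step corrects against the conditionals under the current means, so acceptance is exactly 1 whenever the means have not moved and stays near 1 when they have moved little.

```python
import numpy as np

def label_logprobs(x, means, sigma, weights):
    """Normalized log conditional label probabilities per observation,
    holding the component means fixed (sigma and weights known here)."""
    logp = (-0.5 * ((x[:, None] - means[None, :]) / sigma) ** 2
            + np.log(weights)[None, :])
    m = logp.max(axis=1, keepdims=True)
    return logp - (m + np.log(np.exp(logp - m).sum(axis=1, keepdims=True)))

def sample_rows(logp, rng):
    """Draw one label per row from row-wise categorical log-probabilities."""
    p = np.exp(logp)
    p /= p.sum(axis=1, keepdims=True)
    return np.array([rng.choice(p.shape[1], p=row) for row in p])

def blocked_label_move(x, z, means_now, means_stale, sigma, weights, idx, rng):
    """Propose labels for the block `idx` from conditionals under the stale
    means, then accept or reject the whole block by Metropolis-Hastings
    against conditionals under the current means.  When the means have not
    moved, this is exact Gibbs and the acceptance probability is 1."""
    lq = label_logprobs(x[idx], means_stale, sigma, weights)  # proposal
    lp = label_logprobs(x[idx], means_now, sigma, weights)    # target
    z_prop = sample_rows(lq, rng)
    rows = np.arange(len(idx))
    log_alpha = (lp[rows, z_prop] - lp[rows, z[idx]]
                 + lq[rows, z[idx]] - lq[rows, z_prop]).sum()
    if np.log(rng.random()) < log_alpha:
        z = z.copy()
        z[idx] = z_prop
    return z

# Toy demo: two components, synthetic data, illustrative settings throughout.
rng = np.random.default_rng(1)
sigma, weights = 1.0, np.array([0.5, 0.5])
x = np.concatenate([rng.normal(m, sigma, 100) for m in (-2.0, 2.0)])
z = rng.integers(2, size=x.size)
means = rng.normal(0.0, 1.0, size=2)

for sweep in range(500):
    if sweep % 5 == 0:              # proposals reuse means from up to
        means_stale = means.copy()  # 4 sweeps ago ("stale" parameters)
    idx = rng.choice(x.size, size=20, replace=False)  # random block
    z = blocked_label_move(x, z, means, means_stale, sigma, weights, idx, rng)
    if rng.random() < 0.1:          # weighted coin: standard full Gibbs sweep
        z = sample_rows(label_logprobs(x, means, sigma, weights), rng)
    for k in range(2):              # conjugate mean update under a N(0, 1) prior
        xk = x[z == k]
        v = 1.0 / (1.0 + xk.size / sigma**2)
        means[k] = rng.normal(v * xk.sum() / sigma**2, np.sqrt(v))

print("posterior draw of component means:", np.sort(means))
```

Because the proposal probabilities depend only on the stale means, they could be computed once per refresh and reused across many block moves, which is one plausible source of the computational advantages the abstract mentions.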