
All Times EDT

Abstract Details

Activity Number: 333 - Advances in Bayesian Modeling
Type: Contributed
Date/Time: Tuesday, August 9, 2022, 2:00 PM to 3:50 PM
Sponsor: International Society for Bayesian Analysis (ISBA)
Abstract #322473
Title: Blocked Gibbs Sampling for Improved Convergence in Mixture Models
Author(s): David Swanson*
Companies: University of Oslo
Keywords: mixture model; latent variable model; MCMC; Gibbs sampling
Abstract:

Marginalizing over nuisance parameters can improve the convergence of MCMC, as can blocking correlated parameters. This is straightforward to visualize and reason about in contexts such as low-dimensional multivariate normal distributions, but it is difficult for mixture models because of the non-smooth way in which label assignments and cluster-specific parameters relate to one another. In general, however, the cluster assignments of observations close to one another will tend to correlate, since their combined weight pulls a cluster's mean toward a shared center. One can exploit this correlation by sampling cluster assignments for a random block of observations, holding the cluster-specific parameters fixed for several iterations, and then performing an accept-reject step for each proposed block move. Such a procedure might be characterized as a quasi-Gibbs approach, since each block proposal closely approximates the target distribution. One can tailor the blocking to the context to push acceptance probabilities close to 1, and ensure ergodicity using nothing more than a weighted coin. The approach also yields computational advantages.
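The abstract does not give implementation details, but the general shape of the move it describes can be sketched as follows: a collapsed target over labels (cluster means marginalized out under a conjugate normal prior), block proposals drawn from the label conditionals at "stale" cluster means that are refreshed only occasionally, and a Metropolis-Hastings accept-reject step for each block. The model (1-D Gaussian mixture with known variance), prior values, block size, and refresh schedule are illustrative assumptions, not the author's exact algorithm; the weighted-coin mixing with standard Gibbs sweeps mentioned for ergodicity is omitted for brevity.

```python
# Illustrative sketch only: collapsed blocked-label move with an
# accept-reject correction, under assumed model and tuning choices.
import numpy as np

rng = np.random.default_rng(0)

K, sigma2, tau2, mu0 = 2, 1.0, 4.0, 0.0   # known variance; N(mu0, tau2) prior on means
x = np.concatenate([rng.normal(-2, 1, 60), rng.normal(2, 1, 60)])
n = len(x)

def cluster_marginal_loglik(xs):
    """log of integral of prod_i N(x_i | mu, sigma2) * N(mu | mu0, tau2) dmu."""
    m = len(xs)
    if m == 0:
        return 0.0
    a = m / sigma2 + 1.0 / tau2            # posterior precision of mu
    b = xs.sum() / sigma2 + mu0 / tau2     # precision-weighted posterior mean
    return (-0.5 * m * np.log(2 * np.pi * sigma2)
            - 0.5 * np.log(2 * np.pi * tau2)
            - (xs ** 2).sum() / (2 * sigma2) - mu0 ** 2 / (2 * tau2)
            + 0.5 * np.log(2 * np.pi / a) + b ** 2 / (2 * a))

def collapsed_logpost(z):
    """Collapsed log posterior of labels (means marginalized; uniform weights)."""
    return sum(cluster_marginal_loglik(x[z == k]) for k in range(K)) + n * np.log(1.0 / K)

z = rng.integers(0, K, n)                  # initial labels
mu = np.array([-1.0, 1.0])                 # stale means used only for proposals

for it in range(200):
    # Refresh the proposal means only every few iterations, as described.
    if it % 5 == 0:
        for k in range(K):
            xk = x[z == k]
            a = len(xk) / sigma2 + 1.0 / tau2
            b = xk.sum() / sigma2 + mu0 / tau2
            mu[k] = rng.normal(b / a, np.sqrt(1.0 / a))

    # Propose new labels for a random block from conditionals at fixed means.
    block = rng.choice(n, size=10, replace=False)
    logp = -0.5 * (x[block, None] - mu[None, :]) ** 2 / sigma2   # (block, K)
    logp -= logp.max(axis=1, keepdims=True)
    p = np.exp(logp)
    p /= p.sum(axis=1, keepdims=True)
    z_new = z.copy()
    for j, i in enumerate(block):
        z_new[i] = rng.choice(K, p=p[j])

    # Accept-reject against the collapsed target; because the proposal
    # closely approximates it, acceptance stays near 1 ("quasi-Gibbs").
    log_q_fwd = sum(np.log(p[j, z_new[i]]) for j, i in enumerate(block))
    log_q_rev = sum(np.log(p[j, z[i]]) for j, i in enumerate(block))
    log_alpha = (collapsed_logpost(z_new) - collapsed_logpost(z)
                 + log_q_rev - log_q_fwd)
    if np.log(rng.uniform()) < log_alpha:
        z = z_new

print("cluster sizes:", np.bincount(z, minlength=K))
```

Because the means are held fixed within each iteration, the forward and reverse block proposals share the same per-observation conditionals, which keeps the acceptance ratio cheap to evaluate; the only per-move cost beyond the proposal itself is the collapsed posterior, which depends on cluster sufficient statistics and is where the computational advantage comes from.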


Authors who are presenting talks have a * after their name.

Back to the full JSM 2022 program