
Abstract Details

Activity Number: 255 - Contributed Poster Presentations: Section on Statistical Computing
Type: Contributed
Date/Time: Monday, July 29, 2019 : 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Computing
Abstract #303058
Title: Stochastic Gradient MCMC for State Space Models
Author(s): Christopher Aicher*
Companies: University of Washington
Keywords: stochastic gradient; Markov chain Monte Carlo; Bayesian inference; state space model; hidden Markov models; time series

State space models (SSMs) are a flexible approach to modeling complex time series. However, inference in SSMs is often computationally prohibitive for long time series. Stochastic gradient MCMC (SGMCMC) is a popular method for scalable Bayesian inference with large independent datasets. Unfortunately, when applied to dependent data, such as in SSMs, SGMCMC's stochastic gradient estimates are biased because they break crucial temporal dependencies. To alleviate this, we propose stochastic gradient estimators that control this bias by performing additional computation in a 'buffer', reducing the dependencies that are broken. Furthermore, we derive error bounds for this bias and show that it decays geometrically under mild conditions. Using these estimators, we develop novel SGMCMC samplers for discrete, continuous, and mixed-type SSMs. Our experiments on real and synthetic data demonstrate the effectiveness of our SGMCMC algorithms compared to batch MCMC, allowing us to scale inference to long time series with millions of time points.
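The buffering idea in the abstract can be illustrated with a minimal sketch: run a filter over a randomly chosen subsequence extended by a buffer on each side, but accumulate log-likelihood terms only over the core subsequence, so the filter has "burned in" before any counted term. The code below is an assumption-laden toy, not the authors' implementation: it uses a scalar linear-Gaussian SSM, a Kalman filter, and a finite-difference gradient in place of the closed-form or automatic gradients one would use in practice; all function names and parameters (`buffered_subseq_loglik`, `sg_estimate`, `S`, `B`) are hypothetical.

```python
import numpy as np

def buffered_subseq_loglik(y, a, S, B, start, q=1.0, r=1.0):
    """Toy scalar SSM: x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r).
    Filter over the buffered window [start-B, start+S+B), but sum
    predictive log-likelihoods only over the core [start, start+S)."""
    T = len(y)
    lo = max(0, start - B)        # left buffer lets the filter forget its diffuse init
    hi = min(T, start + S + B)    # right buffer matters for smoothing-based gradients
    m, p = 0.0, 10.0              # diffuse initial state belief
    ll = 0.0
    for t in range(lo, hi):
        m_pred, p_pred = a * m, a * a * p + q            # predict step
        s = p_pred + r                                   # innovation variance
        if start <= t < start + S:                       # count only core terms
            ll += -0.5 * (np.log(2 * np.pi * s) + (y[t] - m_pred) ** 2 / s)
        k = p_pred / s                                   # Kalman gain; update step
        m, p = m_pred + k * (y[t] - m_pred), (1 - k) * p_pred
    return ll

def sg_estimate(y, a, S=50, B=10, eps=1e-4, rng=None):
    """Stochastic gradient of the full-data log-likelihood w.r.t. a,
    from one random buffered subsequence, via central finite differences."""
    if rng is None:
        rng = np.random.default_rng()
    T = len(y)
    start = int(rng.integers(0, T - S + 1))
    scale = T / S                 # rescale subsequence sum toward the full-data sum
    lp = buffered_subseq_loglik(y, a + eps, S, B, start)
    lm = buffered_subseq_loglik(y, a - eps, S, B, start)
    return scale * (lp - lm) / (2 * eps)
```

With `B = 0` the filter enters the subsequence with a diffuse belief and the gradient estimate is biased; growing `B` performs extra computation that shrinks this bias, which is the trade-off the error bounds in the abstract quantify.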

Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program