Abstract Details

Activity Number: 341 - SPEED: Classification and Data Science
Type: Contributed
Date/Time: Tuesday, July 31, 2018: 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #330205
Title: Differentiable Approximations of Hidden Markov Models for Variational Bayesian Inference
Author(s): Lun Yin* and John Pearson
Companies: Duke Institute for Brain Sciences and Duke University
Keywords: Hidden Markov Model; Variational Bayesian Inference; Gumbel-Softmax Distribution; Categorical Reparametrization
Abstract:

Hidden Markov models (HMMs) are widely used to model dynamic processes characterized by sequences of discrete latent states. However, such models typically assume a fixed transition structure, limiting their modeling capacity when the dynamics may depend on additional variables. Recent advances in approximate inference using gradient-based optimization have popularized the use of neural networks to implicitly define more flexible distributions, but these models have been unable to accommodate discrete latent variables because gradients cannot propagate through the resulting computational graph. A variety of newly proposed continuous relaxations based on the Gumbel-Softmax distribution have attempted to address this shortcoming by reparametrizing individual discrete variables, but little work has examined models for sequence data. Here, we consider these relaxations as an approximate inference strategy for HMMs, with applications to time series data from a competitive multiplayer game.
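The Gumbel-Softmax reparametrization named in the keywords can be illustrated concretely. The sketch below is a generic NumPy illustration of the relaxation, not the authors' implementation; the helper name sample_gumbel_softmax and the temperature value are assumptions made for the example.

    import numpy as np

    def sample_gumbel_softmax(logits, tau, rng):
        # Draw one relaxed one-hot sample from a categorical
        # distribution with unnormalized log-probabilities `logits`.
        # As tau -> 0 the sample approaches a discrete one-hot vector;
        # larger tau gives smoother samples that gradients can flow through.
        u = rng.uniform(low=1e-12, high=1.0, size=logits.shape)
        gumbel = -np.log(-np.log(u))     # Gumbel(0, 1) noise via inverse CDF
        z = (logits + gumbel) / tau      # perturb logits, scale by temperature
        z = z - z.max()                  # shift for numerical stability
        expz = np.exp(z)
        return expz / expz.sum()         # softmax: a point on the simplex

    # Example: relax a 3-state categorical, e.g. one HMM latent state.
    rng = np.random.default_rng(0)
    logits = np.log(np.array([0.7, 0.2, 0.1]))
    print(sample_gumbel_softmax(logits, tau=0.5, rng=rng))

In a variational setting the logits would typically be produced by an encoder network, and the relaxed sample, unlike a hard categorical draw, lets gradients pass back to those network parameters.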


Authors who are presenting talks have a * after their name.
