Abstract Details

Activity Number: 141 - Statistical Understanding of Deep Learning
Type: Invited
Date/Time: Monday, July 29, 2019 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #300313 Presentation
Title: ALMOND: Adaptive Latent Modeling and Optimization via Neural Networks and Langevin Diffusion
Author(s): Xiao Wang* and Yixuan Qiu
Companies: Purdue University and Carnegie Mellon University
Keywords: Deep learning; Langevin dynamics; Latent variable models; Variational autoencoder

Latent variable models cover a broad range of statistical and machine learning models, such as Bayesian models, linear mixed models, and Gaussian mixture models. Existing methods often suffer from two major challenges in practice: (a) a proper latent variable distribution is difficult to specify; (b) exact likelihood inference is computationally intractable. We propose a novel framework for the inference of latent variable models that overcomes these two limitations. The new framework allows for a fully data-driven latent variable distribution via deep neural networks, and the proposed stochastic gradient method, combined with the Langevin algorithm, is efficient and well suited to complex models and big data. We provide theoretical results for the Langevin algorithm and establish a convergence analysis of the optimization method. The framework has demonstrated superior practical performance in simulation studies and a real data analysis.
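To give a concrete sense of the Langevin component mentioned in the abstract (this is an illustrative sketch, not the authors' ALMOND implementation): the unadjusted Langevin algorithm draws approximate samples from a target density p(x) using only its score function, iterating x_{t+1} = x_t + (eps/2) * grad log p(x_t) + sqrt(eps) * N(0, I). The function names and step size below are assumptions chosen for illustration.

```python
import numpy as np

def langevin_sample(grad_log_p, x0, eps=0.05, n_steps=20000, rng=None):
    """Run the unadjusted Langevin algorithm from x0; return all iterates."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        noise = rng.standard_normal(x.shape)
        # Gradient step toward high-density regions plus injected Gaussian noise
        x = x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * noise
        chain[t] = x
    return chain

# Toy target: standard normal, whose score is grad log p(x) = -x.
chain = langevin_sample(lambda x: -x, x0=np.array([5.0]))
samples = chain[5000:]  # discard burn-in
print(samples.mean(), samples.std())  # should be near 0 and 1
```

In the latent variable setting the abstract describes, such Langevin updates would target the posterior over latent variables, with the score supplied by automatic differentiation through the neural network model rather than a closed-form gradient.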

Authors who are presenting talks have a * after their name.
