

Abstract Details

Activity Number: 64 - Computational Advances in Bayesian Inference
Type: Contributed
Date/Time: Sunday, August 7, 2022, 4:00 PM to 5:50 PM (EDT)
Sponsor: Section on Bayesian Statistical Science
Abstract #322257
Title: Data Augmentation for Bayesian Deep Learning
Author(s): Yuexi Wang* and Nicholas Polson and Vadim Sokolov
Companies: University of Chicago and University of Chicago and George Mason University
Keywords: Deep Learning; Data Augmentation; MCMC; Back-propagation; SGD
Abstract:

Deep Learning (DL) methods have emerged as one of the most powerful tools for functional approximation and prediction. While the representation properties of DL have been well studied, uncertainty quantification remains challenging and largely unexplored. Data augmentation techniques are a natural approach to provide uncertainty quantification and to integrate stochastic MCMC search with stochastic gradient descent (SGD) methods. The purpose of our paper is to show that training DL architectures with data augmentation leads to efficiency gains. To demonstrate our methodology, we develop data augmentation algorithms for a variety of commonly used activation functions: logit, ReLU, and SVM. We compare our methodology against traditional stochastic gradient descent with back-propagation. Our optimization procedure leads to a version of iteratively re-weighted least squares and can be implemented at scale with accelerated linear algebra methods, providing substantial performance improvements. We illustrate our methodology on several standard datasets. Finally, we conclude with directions for future research.
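
For readers unfamiliar with the iteratively re-weighted least squares (IRLS) updates mentioned above, the short NumPy sketch below fits a single logit (logistic-regression) layer by textbook IRLS, i.e., Newton-Raphson written as repeated weighted least-squares solves. This is a generic reference illustration, not the authors' data-augmentation algorithm; the function name irls_logit, the ridge term, and the simulated data are illustrative assumptions.

# Minimal IRLS sketch for a single logit layer (generic textbook version,
# not the authors' method; names and settings are illustrative).
import numpy as np

def irls_logit(X, y, n_iter=25, ridge=1e-6):
    """Fit logistic-regression weights by iteratively re-weighted least squares."""
    p = X.shape[1]
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta                              # linear predictor
        mu = 1.0 / (1.0 + np.exp(-eta))             # logistic mean
        w = mu * (1.0 - mu)                         # IRLS weights
        z = eta + (y - mu) / np.maximum(w, 1e-12)   # working response
        XtW = X.T * w                               # X^T W via broadcasting
        # Weighted least-squares solve: (X^T W X + ridge * I) beta = X^T W z
        beta = np.linalg.solve(XtW @ X + ridge * np.eye(p), XtW @ z)
    return beta

# Toy usage on simulated data
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
true_beta = np.array([1.0, -2.0, 0.5])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))
print(irls_logit(X, y))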


Authors who are presenting talks have a * after their name.
