

Abstract Details

Activity Number: 585 - Bayesian Neural Networks
Type: Topic Contributed
Date/Time: Thursday, August 6, 2020, 3:00 PM to 4:50 PM (EDT)
Sponsor: International Society for Bayesian Analysis (ISBA)
Abstract #313534
Title: Practical Bayesian Inference for Shallow CNNs in NLP
Author(s): Jacob Hinkle* and Devanshu Agrawal and Theodore Papamarkou
Companies: Oak Ridge National Laboratory
Keywords: Bayesian inference; neural network; natural language processing; variational inference; Gaussian processes; convolutional neural networks
Abstract:

Inspired by the architectures of deep Bayesian neural networks (BNNs), the deep Gaussian process (DGP) framework has arisen as a way to leverage composition to provide a flexible model that admits tractable Bayesian inference. Recently, a connection has been established between infinitely wide BNNs with finite bottleneck layers and DGPs. This correspondence hints at a novel way to approach BNNs: using the inducing-point framework commonly used to train Gaussian process models, as opposed to conventional parameter-centric approaches such as Bayes-by-Backprop. In this work, we adopt this perspective to design a fully Bayesian inference method for a shallow convolutional neural network designed for natural language processing (NLP) tasks, and we demonstrate its practicality on common datasets. Our method employs both inducing points and parametric variational methods, owing to the unique structure of the embedding layer found in NLP networks.
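For readers unfamiliar with the inducing-point framework the abstract contrasts with parameter-centric methods such as Bayes-by-Backprop, the sketch below illustrates it in isolation with a stochastic variational GP in GPyTorch. The model, kernel choice, toy data, and training loop are illustrative assumptions, not the authors' implementation.

```python
import torch
import gpytorch

# Minimal sparse variational GP (SVGP): the approximate posterior is
# summarized by M learnable inducing points rather than by a distribution
# over network weights, as in parameter-centric BNN methods.
class SVGPLayer(gpytorch.models.ApproximateGP):
    def __init__(self, inducing_points):
        m = inducing_points.size(0)
        var_dist = gpytorch.variational.CholeskyVariationalDistribution(m)
        var_strat = gpytorch.variational.VariationalStrategy(
            self, inducing_points, var_dist, learn_inducing_locations=True
        )
        super().__init__(var_strat)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

# Toy regression data standing in for features produced by an upstream
# layer (e.g., an embedding/convolution stack); sizes are arbitrary.
X = torch.randn(256, 8)
y = torch.sin(X.sum(dim=1))

model = SVGPLayer(inducing_points=torch.randn(32, 8))  # M = 32 inducing points
likelihood = gpytorch.likelihoods.GaussianLikelihood()
mll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=X.size(0))

optimizer = torch.optim.Adam(
    list(model.parameters()) + list(likelihood.parameters()), lr=0.01
)
for _ in range(200):
    optimizer.zero_grad()
    loss = -mll(model(X), y)  # maximize the ELBO
    loss.backward()
    optimizer.step()
```

In the setting the abstract describes, a layer like the NLP embedding layer would instead receive a parametric variational posterior (e.g., a reparameterized Gaussian over its weights), with inducing points handling the remaining layers.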


Authors who are presenting talks have an asterisk (*) after their name.
