
All Times EDT

Abstract Details

Activity Number: 224 - Contributed Poster Presentations: Section for Statistical Programmers and Analysts
Type: Contributed
Date/Time: Tuesday, August 4, 2020 : 10:00 AM to 2:00 PM
Sponsor: Section for Statistical Programmers and Analysts
Abstract #312196
Title: Add Variability to Neural Networks
Author(s): Sheng Yuan*
Companies:
Keywords: Bayes by Backprop; Neural Network; K-L divergence
Abstract:

In conventional neural networks, the weight on each neuron is a single point estimate with no variability. Here we propose a probabilistic model that assumes the weights of a neural network follow a distribution conditioned on the input data. First, we obtain a point estimate of the weights through back-propagation by minimizing the network's loss function; then we approximate the posterior distribution of the weights with a Gaussian by minimizing the Kullback–Leibler divergence, updating the Gaussian parameters via the Bayes by Backprop method; finally, we give confidence-interval estimates for the outputs. We applied the network to a synthetic data set and one conventional data set and compared our method to other methods. The slides conclude with a discussion of future directions for this research.
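The Bayes by Backprop procedure described above can be sketched on a toy problem. The snippet below is a minimal illustration, not the poster's actual implementation: it fits a variational Gaussian q(w) = N(mu, sigma²) over the single weight of a linear model, using the reparameterization trick for the likelihood term and the closed-form KL divergence to a standard-normal prior. All names (`kl_gaussians`, the data, the learning rate) are illustrative assumptions.

```python
import numpy as np

# Toy data: y = 2*x + noise. We learn a posterior over the single
# weight w, with prior N(0, 1) and likelihood assuming unit noise variance.
# (Illustrative sketch only -- not the poster's code.)
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + 0.1 * rng.normal(size=100)   # true weight is 2.0


def kl_gaussians(mu, sigma, mu0=0.0, sigma0=1.0):
    """Closed-form KL( N(mu, sigma^2) || N(mu0, sigma0^2) )."""
    return (np.log(sigma0 / sigma)
            + (sigma ** 2 + (mu - mu0) ** 2) / (2 * sigma0 ** 2) - 0.5)


mu, rho = 0.0, -3.0              # variational parameters; sigma = softplus(rho)
lr = 0.002
for _ in range(2000):
    sigma = np.log1p(np.exp(rho))        # softplus keeps sigma > 0
    eps = rng.normal()
    w = mu + sigma * eps                 # reparameterization trick

    # Gradient of the negative log-likelihood (squared error) wrt w
    d_nll_dw = -np.sum((y - w * x) * x)

    # Pathwise gradients through w = mu + sigma * eps, plus the exact
    # gradients of the closed-form KL term (dKL/dmu = mu for a N(0,1) prior;
    # dKL/dsigma = sigma - 1/sigma)
    d_sigma_d_rho = 1.0 / (1.0 + np.exp(-rho))     # derivative of softplus
    d_mu = d_nll_dw + mu
    d_rho = (d_nll_dw * eps + (sigma - 1.0 / sigma)) * d_sigma_d_rho

    mu -= lr * d_mu
    rho -= lr * d_rho

sigma = np.log1p(np.exp(rho))
print(f"posterior over w: mean={mu:.3f}, std={sigma:.3f}")
```

With the posterior in hand, interval estimates for the outputs (as in the abstract) follow by sampling weights from q(w), computing predictions for each sample, and taking empirical quantiles of the resulting predictive distribution.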


Authors who are presenting talks have a * after their name.

Back to the full JSM 2020 program