Abstract:
|
Recent work on Bayesian neural networks (BNNs) has shown that Bayesian methods model uncertainty through the posterior distribution, are robust to over-fitting, and can learn effectively from small datasets. We focus on comparing two approaches to inference in Bayesian neural networks: variational inference (VI) and Markov chain Monte Carlo (MCMC). Our work on variational BNNs builds mainly on Hinton and Van Camp (1993), Graves (2011), Blundell (2015), and Blei (2017); our work on MCMC builds on Ghosh (2004), Lee (2004), and Welling and Teh (2011). In particular, we compare VI BNNs and MCMC BNNs on small simulated datasets in terms of running time and test accuracy, and we also examine the posterior distributions of the weights obtained by both methods. The results provide intuitive insight into the benefits of VI BNNs and indicate substantial computational advantages over MCMC BNNs.
|