Abstract Details

Activity Number: 103 - Uncertainty Quantification for Machine Learning
Type: Topic Contributed
Date/Time: Monday, August 8, 2022: 8:30 AM to 10:20 AM EDT
Sponsor: Section on Physical and Engineering Sciences
Abstract #322728
Title: Myths and Reality in Bayesian Deep Learning
Author(s): Andrew Gordon Wilson*
Companies: New York University
Keywords: Bayesian inference; neural networks; deep learning; approximate inference; Bayesian deep learning; generalization
Abstract:

Bayesian inference makes more sense for modern neural networks than for virtually any other model class, because these models can represent many compelling and complementary explanations for the data, corresponding to different settings of their parameters. However, a number of myths have emerged about Bayesian deep learning: (1) it doesn't work well in practice; (2) it's computationally inefficient; (3) it only helps with uncertainty estimates, not accuracy; (4) the priors are arbitrary and bad; (5) it is outperformed by "deep ensembles"; (6) the common practice of posterior tempering, which leads to "cold posteriors", means that the Bayesian posterior is poor.
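
A minimal sketch of the two formulations alluded to above, in standard notation (the symbols are generic, not specific to this talk): the Bayesian predictive distribution marginalizes over parameter settings weighted by the posterior, rather than committing to a single setting,

\[
p(y \mid x, \mathcal{D}) = \int p(y \mid x, \theta)\, p(\theta \mid \mathcal{D})\, d\theta ,
\]

and posterior tempering, as in myth (6), rescales the posterior with a temperature T; one common form is

\[
p_T(\theta \mid \mathcal{D}) \propto \big( p(\mathcal{D} \mid \theta)\, p(\theta) \big)^{1/T} ,
\]

where T < 1 gives a "cold" posterior that is sharper than the untempered Bayes posterior at T = 1.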

In this talk, I will dispel each of these myths and discuss the success stories, future opportunities, and genuine challenges in Bayesian deep learning.


Authors who are presenting talks have a * after their name.
