
All Times EDT

Abstract Details

Activity Number: 309 - Interface Between Machine Learning and Uncertainty Quantification
Type: Topic Contributed
Date/Time: Wednesday, August 5, 2020, 10:00 AM to 11:50 AM
Sponsor: Uncertainty Quantification in Complex Systems Interest Group
Abstract #309711
Title: Calibrating Uncertainties in Deep Learning
Author(s): Bhavya Kailkhura* and Jize Zhang
Companies: Lawrence Livermore National Laboratory and Lawrence Livermore National Laboratory
Keywords: calibration; uncertainty; deep learning

Confidence calibration – the problem of producing probability estimates representative of the true correctness likelihood – is important for classification models in many applications. Modern neural networks, unlike those from a decade ago, are poorly calibrated. Post-processing calibration methods, such as temperature scaling and isotonic regression, are widely used in the community to calibrate deep learning models. In this talk, I will first introduce the desiderata for good calibration. Next, I will discuss the shortcomings of existing calibration and calibration-evaluation methods. Finally, I will discuss some general approaches to overcoming these shortcomings.
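As background for the methods named in the abstract, the sketch below shows temperature scaling and the standard binned Expected Calibration Error (ECE) metric in NumPy. It is an illustrative assumption, not the speakers' implementation: the synthetic "overconfident" validation set is made up for the demo, and the temperature is fit by a simple grid search on validation negative log-likelihood (in practice it is usually fit by gradient descent, but the objective is the same).

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax (T=1 recovers the ordinary softmax)."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    """Average negative log-likelihood of the labels at temperature T."""
    p = softmax(logits, T)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    """Pick the scalar T minimizing validation NLL (grid search for simplicity)."""
    return min(grid, key=lambda T: nll(logits, labels, T))

def ece(probs, labels, n_bins=10):
    """Expected Calibration Error: confidence-weighted |accuracy - confidence| gap,
    computed over equal-width confidence bins."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    err = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            err += mask.mean() * abs(acc - conf[mask].mean())
    return err

# Synthetic validation set: logits favor the true class, then are inflated 3x
# to mimic an overconfident modern network.
rng = np.random.default_rng(0)
n, k = 2000, 5
labels = rng.integers(0, k, size=n)
logits = rng.normal(size=(n, k))
logits[np.arange(n), labels] += 2.0
logits *= 3.0

T_hat = fit_temperature(logits, labels)
e_before = ece(softmax(logits), labels)
e_after = ece(softmax(logits, T_hat), labels)
```

On overconfident logits the fitted temperature comes out above 1, flattening the predicted distributions and shrinking the ECE, which is why temperature scaling is the usual post-processing baseline despite having a single parameter.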

Authors who are presenting talks have a * after their name.

Back to the full JSM 2020 program