
Abstract Details

Activity Number: 103 - Uncertainty Quantification for Machine Learning
Type: Topic Contributed
Date/Time: Monday, August 8, 2022, 8:30 AM to 10:20 AM EDT
Sponsor: Section on Physical and Engineering Sciences
Abstract #322278
Title: Conformal Prediction and Calibration Under Distribution Drift
Author(s): Aaditya Ramdas and Aleksandr Podkopaev*
Companies: Carnegie Mellon University and Carnegie Mellon University
Keywords: predictive uncertainty; conformal prediction; calibration; distribution drift
Abstract:

Quantifying the predictive uncertainty of machine learning algorithms is a topic of great theoretical and applied interest. Without additional post-processing, models often fail to represent uncertainty accurately and tend to make over-confident predictions. Conformal prediction and calibration have emerged as leading candidates for assumption-lean predictive inference: they provide rigorous guarantees without restrictive distributional assumptions on the data, beyond requiring that test data be i.i.d. or exchangeable with the training data. However, deployed machine learning models inevitably encounter changes in the input data-generating distribution. I will summarize some recent progress on this front, focusing on extensions of conformal prediction and calibration that handle distribution drift.
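
As a concrete illustration of the exchangeability-based guarantee mentioned above, here is a minimal sketch of split conformal prediction for regression. This is not the specific method of the talk; the function name, the synthetic residuals, and the 10% miscoverage level are illustrative assumptions.

import numpy as np

def split_conformal_interval(residuals_cal, y_pred_test, alpha=0.1):
    # residuals_cal: absolute residuals |y_i - f(x_i)| on a held-out
    # calibration set; y_pred_test: point prediction for a test input.
    n = len(residuals_cal)
    # Conformal quantile: the ceil((n + 1) * (1 - alpha))-th smallest
    # calibration residual. If (n + 1) * (1 - alpha) > n, the exact
    # interval is infinite; we clip to the largest residual here for
    # simplicity.
    k = int(np.ceil((n + 1) * (1 - alpha)))
    q = np.sort(residuals_cal)[min(k, n) - 1]
    # If calibration and test points are exchangeable, this interval
    # covers the true test label with probability >= 1 - alpha.
    return y_pred_test - q, y_pred_test + q

# Illustrative usage with synthetic residuals (hypothetical data):
rng = np.random.default_rng(0)
residuals = np.abs(rng.normal(size=500))
lo, hi = split_conformal_interval(residuals, y_pred_test=2.3, alpha=0.1)

Note that only a single empirical quantile of held-out residuals is needed, and no assumption is placed on the underlying model or the data distribution beyond exchangeability; it is precisely this exchangeability assumption that distribution drift breaks.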


Authors who are presenting talks have a * after their name.
