
Abstract Details

Activity Number: 254 - Contributed Poster Presentations: Section on Statistical Learning and Data Science
Type: Contributed
Date/Time: Monday, July 30, 2018 : 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #326980
Title: Empirical Evaluation for Platt Scaling and Isotonic Regression
Author(s): Weihua Shi*
Companies: SAS Institute, Inc.
Keywords: probability calibration; support vector machine; random forest; neural network; boosted decision tree

Classifiers are often expected to estimate classification probabilities (CP) in addition to predicting class labels. However, many classifiers output biased CP estimates due to misspecified models, problematic model fitting, or improper CP-estimating algorithms. Two popular post-fitting methods for CP calibration are Platt scaling and isotonic regression. This work evaluates the CP performance of the two methods for four classifiers: support vector machine (SVM), boosted decision tree (GB), random forest (RF), and neural network (NN). Overall CP performance is measured using the log-loss score, the Brier score, the area under the ROC curve, classification accuracy, precision, recall, and the F-measure. This study used SAS Software to fit all four classifiers and measure their CP performance. Our empirical results confirm that Platt scaling is more accurate than isotonic regression when the distortion between the classifier's output intensity and the true CP is a sigmoid function; otherwise, isotonic regression is more flexible, correcting any monotonic distortion.
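The comparison described above can be sketched outside of SAS as well. The following is a minimal, hypothetical illustration (not the author's code) using scikit-learn, whose `CalibratedClassifierCV` implements both calibration methods: `method="sigmoid"` corresponds to Platt scaling and `method="isotonic"` to isotonic regression. The dataset, classifier choice, and split are illustrative assumptions.

```python
# Hypothetical sketch of the Platt-scaling vs. isotonic-regression comparison
# using scikit-learn (the study itself used SAS Software).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.calibration import CalibratedClassifierCV
from sklearn.metrics import brier_score_loss, log_loss

# Synthetic binary-classification data (illustrative only).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

results = {}
for method in ("sigmoid", "isotonic"):  # Platt scaling vs. isotonic regression
    base = RandomForestClassifier(n_estimators=50, random_state=0)
    clf = CalibratedClassifierCV(base, method=method, cv=5)
    clf.fit(X_tr, y_tr)
    prob = clf.predict_proba(X_te)[:, 1]  # calibrated CP for the positive class
    results[method] = {
        "brier": brier_score_loss(y_te, prob),
        "logloss": log_loss(y_te, prob),
    }

for method, scores in results.items():
    print(method, scores)
```

Lower Brier and log-loss scores indicate better-calibrated probabilities; the same loop could be repeated with SVM, gradient-boosting, and neural-network base classifiers to mirror the four-model comparison in the abstract.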

Authors who are presenting talks have a * after their name.

Back to the full JSM 2018 program