
Abstract Details

Activity Number: 480 - Model Testing and Prediction
Type: Contributed
Date/Time: Wednesday, August 1, 2018 : 8:30 AM to 10:20 AM
Sponsor: Business and Economic Statistics Section
Abstract #328698
Title: Component-wise Discrete Asymmetric AdaBoost for High-dimensional Binary Quantile Regression
Author(s): Tae-Hwy Lee* and Jianghao Chu and Aman Ullah
Companies: University of California, Riverside (all authors)
Keywords: AdaBoost; exponential loss; Asymmetric AdaBoost; asymmetric exponential loss; binary quantiles; false negative or false positive

We generalize Adaptive Boosting (AdaBoost), introduced by Freund and Schapire (1996), to solve high-dimensional binary quantile regression problems. The existing AdaBoost may be understood as an algorithm for the high-dimensional binary median regression problem. We extend the theory of Friedman, Hastie, and Tibshirani (2000), who show that AdaBoost builds an additive logistic regression model by minimizing the "exponential loss". Generalizing the exponential loss function to an asymmetric exponential loss function, we introduce "Asymmetric AdaBoost". Whereas the existing (symmetric) AdaBoost penalizes false positive (FP) and false negative (FN) predictions equally, we show that the Asymmetric AdaBoost algorithm penalizes them differently (asymmetrically).
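To illustrate the idea, the following is a minimal sketch of component-wise discrete AdaBoost with an asymmetric exponential loss. It is not the authors' exact algorithm: the specific asymmetry scheme assumed here (initial observation weights of tau on positives and 1 - tau on negatives, with one-variable decision stumps as the component-wise base learner) is a common illustrative choice, and the function names are hypothetical.

```python
import numpy as np

def asymmetric_adaboost(X, y, tau=0.5, n_rounds=50):
    """Component-wise discrete AdaBoost with asymmetric exponential weighting.

    y must take values in {-1, +1}. tau in (0, 1) controls the asymmetry:
    positives (y = +1) receive loss weight tau and negatives 1 - tau, so
    tau > 0.5 penalizes false negatives more heavily than false positives.
    (This weighting scheme is an assumption for illustration, not the
    authors' exact loss.)
    """
    n, p = X.shape
    # Asymmetric initial weights reflect the asymmetric exponential loss.
    w = np.where(y == 1, tau, 1.0 - tau).astype(float)
    w /= w.sum()
    stumps = []  # each entry: (feature j, threshold s, sign d, vote alpha)

    for _ in range(n_rounds):
        best = None
        # Component-wise base learner: search one-variable decision stumps,
        # one coordinate of X at a time (suited to high-dimensional p).
        for j in range(p):
            for s in np.unique(X[:, j]):
                for d in (1, -1):
                    pred = np.where(X[:, j] <= s, d, -d)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, s, d, pred)
        err, j, s, d, pred = best
        err = min(max(err, 1e-12), 1 - 1e-12)  # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        stumps.append((j, s, d, alpha))
        # Exponential reweighting: misclassified observations gain weight.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return stumps

def predict(stumps, X):
    """Sign of the additive model built from the selected stumps."""
    F = np.zeros(X.shape[0])
    for j, s, d, alpha in stumps:
        F += alpha * np.where(X[:, j] <= s, d, -d)
    return np.sign(F)
```

With tau = 0.5 the initial weights are uniform and the sketch reduces to the usual (symmetric) discrete AdaBoost; moving tau away from 0.5 tilts the fitted classifier toward a binary quantile other than the median.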

Authors who are presenting talks have a * after their name.
