We generalize Adaptive Boosting (AdaBoost), introduced by Freund and Schapire (1996), to solve high-dimensional binary quantile regression problems. The existing AdaBoost may be understood as an algorithm for solving the high-dimensional binary median regression problem. We extend the theory of Friedman, Hastie, and Tibshirani (2000), who show that AdaBoost builds an additive logistic regression model by minimizing the "exponential loss". Generalizing the exponential loss function to an asymmetric exponential loss function, we introduce "Asymmetric AdaBoost". While the existing (symmetric) AdaBoost penalizes false positive (FP) and false negative (FN) predictions equally, we show that the Asymmetric AdaBoost algorithm penalizes them differently (asymmetrically).
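For concreteness, the following is a minimal sketch of one natural asymmetric generalization of the exponential loss; the quantile level $\tau \in (0,1)$ and the notation $f(x)$, $p(x)$ below are our illustrative assumptions, not necessarily the paper's exact formulation. The symmetric exponential loss of Friedman, Hastie, and Tibshirani (2000), with labels $y \in \{-1, +1\}$ and score function $f(x)$, is
\[
L(y, f(x)) = e^{-y f(x)},
\]
whose population minimizer $f^*(x) = \tfrac{1}{2}\log\frac{p(x)}{1-p(x)}$, with $p(x) = P(y = 1 \mid x)$, thresholds $p(x)$ at $1/2$ via $\operatorname{sign} f^*(x)$, i.e., the median case. Reweighting the two error types by $\tau$ gives an asymmetric version,
\[
L_\tau(y, f(x)) = \tau\, e^{-f(x)}\, \mathbf{1}\{y = 1\} + (1-\tau)\, e^{f(x)}\, \mathbf{1}\{y = -1\},
\]
whose minimizer $f_\tau^*(x) = \tfrac{1}{2}\log\frac{\tau\, p(x)}{(1-\tau)\,(1-p(x))}$ is positive exactly when $p(x) > 1 - \tau$, so FN and FP errors are penalized asymmetrically, and $\tau = 1/2$ recovers the symmetric (median) case.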