
Abstract Details

Activity Number: 498 - Modern Machine Learning
Type: Contributed
Date/Time: Thursday, August 6, 2020: 10:00 AM to 2:00 PM (EDT)
Sponsor: Section on Statistical Learning and Data Science
Abstract #309794
Title: Uniform Regret Bounds for Quantile Regression Tree Process in Offline and Online Settings
Author(s): Fei Fang* and Alexandre Belloni
Companies: Duke University and Duke University
Keywords: Quantile regression tree; Different quantile levels; Uniform regret bound; Online learning
Abstract:

Quantile regression analysis is widely used in applications such as economics and health care to obtain a more complete picture of the conditional distribution of a response. This work studies the learnability of quantile regression trees uniformly across quantile losses in several settings. Our main contribution is to obtain bounds that simultaneously control the continuum of regrets generated by all quantile levels. Using empirical process theory in the i.i.d. and sequential cases, we derive regret bounds that hold uniformly across quantile levels (i) for the empirical risk minimizer in the i.i.d. case, (ii) in the minimax sense, and (iii) for the exponential weights algorithm (EWA). Results (ii) and (iii) both cover time-dependent data under a mild assumption on the data distribution. The uniform minimax regret bound is sub-linear in the sample size up to a poly-logarithmic factor, and the uniform regret bound derived for EWA is similar to that in (ii).
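For readers unfamiliar with the loss behind these regrets, here is a minimal Python sketch of the quantile (pinball) loss and of an exponential weights update over a finite set of candidate predictions. The expert set, learning rate, and data stream are illustrative assumptions, and the sketch fixes a single quantile level; it does not implement the paper's tree construction or its uniform-over-levels guarantees.

import numpy as np

def pinball_loss(y, pred, tau):
    # Quantile ("pinball") loss at level tau in (0, 1):
    # rho_tau(u) = u * (tau - 1{u < 0}), where u = y - pred.
    u = y - pred
    return u * (tau - (u < 0))

def ewa_weights(cum_loss, eta):
    # Exponential weights given cumulative losses L_k:
    # w_k proportional to exp(-eta * L_k); shifted for numerical stability.
    w = np.exp(-eta * (cum_loss - cum_loss.min()))
    return w / w.sum()

# Illustrative online loop with K hypothetical constant "experts"
# standing in for candidate quantile predictors (an assumption,
# not the paper's construction).
rng = np.random.default_rng(0)
K, T, tau, eta = 5, 200, 0.5, 0.1
experts = np.linspace(-1.0, 1.0, K)
cum_loss = np.zeros(K)
learner_loss = 0.0
for t in range(T):
    y_t = rng.normal()
    w = ewa_weights(cum_loss, eta)
    prediction = float(w @ experts)  # EWA's aggregated prediction
    learner_loss += float(pinball_loss(y_t, prediction, tau))
    cum_loss += pinball_loss(y_t, experts, tau)
# Regret = learner_loss - cum_loss.min(), i.e., cumulative pinball loss
# relative to the best expert in hindsight at this fixed tau.

This fixes one tau for brevity, whereas the abstract's bounds control the regret at every quantile level simultaneously.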


Authors who are presenting talks have a * after their name.
