
All Times EDT

Abstract Details

Activity Number: 226 - Contributed Poster Presentations: SSC (Statistical Society of Canada)
Type: Contributed
Date/Time: Tuesday, August 4, 2020, 10:00 AM to 2:00 PM
Sponsor: SSC (Statistical Society of Canada)
Abstract #313492
Title: Improved Bounds on Minimax Regret Under Logarithmic Loss via Self-Concordance
Author(s): Blair Bilodeau* and Dylan J. Foster and Daniel M. Roy
Companies: University of Toronto and Massachusetts Institute of Technology and University of Toronto
Keywords: Forecasting; Logarithmic loss; Online learning; Minimax regret; Sequential prediction
Abstract:

We study the minimax regret for the classical problem of sequential probabilistic forecasting under logarithmic loss. At each time step, the player observes some covariates, forecasts a probability distribution, and incurs the negative log-likelihood of the observed outcome under the forecasted distribution. We allow the observations to be chosen adversarially rather than from a fixed distribution, so it is standard to study the regret; i.e., the difference between the player’s total loss and the total loss of the single best forecaster in some arbitrary, potentially nonparametric class of “experts”. This problem requires more refined techniques than, say, square loss or classification loss, as both the loss values and their gradient may now be unbounded.
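The protocol described above can be made concrete with a small sketch (the function names and the toy experts below are ours, for illustration only; the paper's setting is far more general): a player forecasts a probability for a binary outcome each round, incurs the negative log-likelihood of the realized outcome, and regret compares the player's cumulative loss to that of the single best expert in hindsight.

```python
import numpy as np

def log_loss(p, y):
    """Negative log-likelihood of binary outcome y under forecast Bernoulli(p)."""
    return -np.log(p if y == 1 else 1.0 - p)

def regret(player_forecasts, expert_forecasts, outcomes):
    """Cumulative player loss minus the loss of the single best expert in hindsight."""
    player_loss = sum(log_loss(p, y) for p, y in zip(player_forecasts, outcomes))
    expert_losses = [
        sum(log_loss(q, y) for q, y in zip(expert, outcomes))
        for expert in expert_forecasts
    ]
    return player_loss - min(expert_losses)

# Toy run: a uniform forecaster against two constant "experts" on an
# arbitrarily (possibly adversarially) chosen outcome sequence.
outcomes = [1, 0, 1, 1, 0]
player = [0.5] * 5
experts = [[0.6] * 5, [0.3] * 5]
r = regret(player, experts, outcomes)
```

Note that as a forecast approaches 0 or 1 on the wrong outcome, the loss (and its gradient) blows up, which is exactly the unboundedness that rules out the standard bounded-loss analyses.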

Exploiting the curvature of logarithmic loss in a novel way, we obtain upper bounds that depend on the complexity of the expert class through its (sequential) metric entropy, and that cannot be improved without additional assumptions on the expert class. As an application of our techniques, we resolve the minimax regret for nonparametric Lipschitz classes of experts.
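The curvature being exploited can be illustrated by a standard one-dimensional calculation (an illustrative sketch only; the paper's precise self-concordance machinery may differ): the map $f(x) = -\log x$ underlying the logarithmic loss is self-concordant, since

```latex
f''(x) = \frac{1}{x^2}, \qquad f'''(x) = -\frac{2}{x^3},
\qquad\text{so}\qquad
\lvert f'''(x) \rvert = \frac{2}{x^3} = 2\,\bigl(f''(x)\bigr)^{3/2}.
```

The third derivative is thus controlled by the second, so the local quadratic approximation to the loss remains trustworthy even though the loss and its gradient are unbounded.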


Authors who are presenting talks have a * after their name.

Back to the full JSM 2020 program