Abstract:
|
Trend filtering (Kim et al., 2009; Tibshirani, 2014) is motivated by applications in time series with heterogeneous smoothness, but existing methods are restricted to offline estimation tasks. The talk describes a recent line of work that enables online prediction (or forecasting) of future observations in adversarially-chosen (kth order) TV-bounded sequences, given only noisy observations from the past. We characterize the minimax forecasting rates in all parameter regimes and describe two families of algorithms that achieve these rates without knowing the radius of the TV-ball. The first family is based on a marriage of no-regret online learning and wavelet-based restarting rules. The second family is based on a highly nontrivial reduction to strongly adaptive online learning. The second approach in particular allows us to substantially strengthen the theoretical guarantees and to generalize beyond squared losses and Gaussian noise. In a nutshell, our algorithm optimally competes with the **omniscient** trend filtering estimator with **the best tuning parameter** for each individual sequence in **hindsight**, with no stochastic assumptions whatsoever.
|