
All Times EDT

Abstract Details

Activity Number: 114 - Time Series Methods and Applications
Type: Contributed
Date/Time: Monday, August 8, 2022, 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #323161
Title: Near-Optimal Inference in Adaptive Linear Regression
Author(s): Koulik Khamaru* and Lester Mackey and Martin J. Wainwright and Yash Deshpande
Companies: University of California, Berkeley and Microsoft Research New England and University of California, Berkeley and The Voleon Group
Keywords: Adaptive linear regression; Online debiasing; Minimax lower bound; Asymptotic normality; Multi-armed bandits; Autoregressive time series.
Abstract:

When data is collected adaptively, even simple methods like ordinary least squares can exhibit non-normal asymptotic behavior. As an undesirable consequence, hypothesis tests and confidence intervals based on asymptotic normality can lead to erroneous results. We propose an online debiasing estimator to correct these distributional anomalies in least squares estimation. Our method exploits the covariance structure present in the dataset and provides sharper estimates in directions for which more information has accrued. We establish asymptotic normality for the proposed online debiasing estimator under mild conditions on the data collection process, and provide asymptotically exact confidence intervals. We additionally prove a minimax lower bound for the adaptive linear regression problem, thereby providing a baseline against which to compare estimators. Under various conditions, our estimator achieves this lower bound up to logarithmic factors. We demonstrate the usefulness of our theory via applications to multi-armed bandits, autoregressive time series estimation, and active learning with exploration.
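The motivating phenomenon can be illustrated with a minimal simulation (this is a sketch for intuition, not the authors' code or their debiasing method): an epsilon-greedy two-armed bandit chooses each arm based on past rewards, so the design matrix is data-dependent, and the resulting least squares estimate of the arm means is computed from adaptively collected samples.

```python
# Illustrative sketch (assumed setup, not from the paper): adaptive data
# collection via an epsilon-greedy two-armed bandit, followed by ordinary
# least squares (OLS) estimation of the two arm means. Under such adaptive
# sampling, OLS need not be asymptotically normal, which is the anomaly the
# abstract's online debiasing estimator is designed to correct.
import numpy as np

def run_bandit_ols(n=2000, eps=0.1, true_means=(0.3, 0.3), seed=0):
    rng = np.random.default_rng(seed)
    counts = np.zeros(2)          # number of pulls per arm
    sums = np.zeros(2)            # cumulative reward per arm
    X = np.zeros((n, 2))          # design matrix: one-hot arm indicators
    y = np.zeros(n)               # observed rewards
    for t in range(n):
        if t < 2:
            a = t                 # pull each arm once to initialize
        elif rng.random() < eps:
            a = int(rng.integers(2))       # explore uniformly
        else:
            # exploit: the chosen arm depends on past data (adaptivity)
            a = int(np.argmax(sums / counts))
        r = true_means[a] + rng.normal()   # noisy reward
        counts[a] += 1
        sums[a] += r
        X[t, a] = 1.0
        y[t] = r
    # OLS estimate of the arm means (here this reduces to per-arm averages)
    theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta_hat, counts

theta_hat, counts = run_bandit_ols()
print("OLS arm-mean estimates:", theta_hat, "pull counts:", counts)
```

Repeating this over many seeds and inspecting the histogram of the (centered and scaled) estimates shows the heavy-tailed, non-normal behavior the abstract refers to, especially when the two true arm means are equal.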


Authors who are presenting talks have a * after their name.

Back to the full JSM 2022 program