Abstract Details

Activity Number: 288 - New Insights from Classical Wisdom—honoring Lawrence D. Brown’s Contributions to Graduate Student Education
Type: Topic Contributed
Date/Time: Tuesday, July 30, 2019 : 8:30 AM to 10:20 AM
Sponsor: IMS
Abstract #: 304487
Presentation Title: Regression Adjustment in Completely Randomized Experiments with a Diverging Number of Covariates
Author(s): Lihua Lei* and Peng Ding
Companies: University of California, Berkeley
Keywords: Neyman-Rubin model; ordinary least squares; average treatment effect; completely randomized experiment
Abstract:

Extending R. A. Fisher and D. A. Freedman's results on the analysis of covariance, Lin [2013] proposed an ordinary least squares (OLS) adjusted estimator of the average treatment effect in completely randomized experiments. In this article, we study its statistical properties under the potential outcomes model in asymptotic regimes that allow the number of covariates to diverge. We show that Lin [2013]'s estimator is consistent when $\kappa \log p \rightarrow 0$ and asymptotically normal when $\kappa p \rightarrow 0$ under mild moment conditions, where $\kappa$ is the maximum leverage score of the covariate matrix. In addition, we propose a bias-corrected estimator that is consistent when $\kappa \log p \rightarrow 0$ and is asymptotically normal, with the same asymptotic variance as in the fixed-$p$ regime, when $\kappa^2 p \log p \rightarrow 0$. In the favorable case where the leverage scores are all close to each other, the latter condition reduces to $p = o(n^{2/3}/(\log n)^{1/3})$. As in Lin [2013], our results do not rely on any model specification. This work is closely related to Professor Brown's work on sampling theory, inference under misspecified models, and causal inference.
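To make the quantities in the abstract concrete, the following Python sketch (an illustration written for this summary, not the authors' code) fits Lin [2013]'s interacted OLS adjustment, regressing the outcome on an intercept, the treatment indicator, centered covariates, and their interaction, and computes the maximum leverage score $\kappa$ of the (centered, intercept-augmented) covariate matrix. The function names, the convention for building the hat matrix, and the simulated data are illustrative assumptions.

import numpy as np

def lin_adjusted_ate(Y, T, X):
    # Lin (2013)-style OLS adjustment: regress Y on an intercept, the
    # treatment indicator, centered covariates, and their interaction;
    # the coefficient on the treatment indicator estimates the ATE.
    Xc = X - X.mean(axis=0)
    D = np.column_stack([np.ones(len(Y)), T, Xc, T[:, None] * Xc])
    coef, *_ = np.linalg.lstsq(D, Y, rcond=None)
    return coef[1]

def max_leverage(X):
    # kappa: largest diagonal entry of the hat matrix built from the
    # centered covariates plus an intercept (one plausible convention).
    Xc = X - X.mean(axis=0)
    Z = np.column_stack([np.ones(len(Xc)), Xc])
    Q, _ = np.linalg.qr(Z)                    # hat matrix H = Q Q'
    return float((Q ** 2).sum(axis=1).max())  # leverage_i = ||Q_i||^2

# Illustrative use on simulated data from a completely randomized experiment.
rng = np.random.default_rng(0)
n, p = 500, 20
X = rng.normal(size=(n, p))
T = np.zeros(n)
T[rng.choice(n, n // 2, replace=False)] = 1.0   # assign exactly n/2 units to treatment
Y = 1.0 * T + X @ rng.normal(size=p) + rng.normal(size=n)   # true ATE = 1
print(lin_adjusted_ate(Y, T, X), max_leverage(X))

For intuition on the favorable case mentioned above: the leverage scores sum to the trace of the hat matrix, so when they are all close to each other each is roughly $(p+1)/n$; plugging $\kappa \approx p/n$ into $\kappa^2 p \log p \rightarrow 0$ gives the stated reduction $p = o(n^{2/3}/(\log n)^{1/3})$.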


Authors who are presenting talks have a * after their name.
