
Abstract Details

Activity Number: 612 - New Challenges in High-Dimensional Statistical Inference
Type: Topic Contributed
Date/Time: Thursday, August 3, 2017 : 8:30 AM to 10:20 AM
Sponsor: IMS
Abstract #324529
Title: Pairwise Difference Estimation of High Dimensional Partially Linear Model
Author(s): Fang Han* and Zhao Ren and Yuxin Zhu
Companies: University of Washington and University of Pittsburgh and Johns Hopkins University
Keywords: partially linear model; pairwise difference approach; jump function; minimum sample size requirement; heavy-tailed noise
Abstract:

Consider the partially linear model (PLM) with random design: Y = X^T\beta^* + g(W) + u, where g(.) is an unknown nonlinear real function, X is p-dimensional, W is one-dimensional, and \beta^* is s-sparse. Our aim is to efficiently estimate \beta^* based on n i.i.d. observations of (Y,X,W) with n < p. For this, the best theoretical results to date rest on the following three assumptions: (i) g(.) belongs to some smooth enough function class G (e.g., globally Lipschitz); (ii) s^2\log p/n\to 0; and (iii) u is sub-Gaussian. These assumptions are related to long-standing problems in nonparametric statistics, and are arguably difficult to relax. In this paper, Honoré and Powell's pairwise difference approach (plus a lasso-type penalty) is shown to attain rate-optimality with all three assumptions relaxed. In particular, rate-optimality proves attainable in a certain regime even when g(.) is a sharply discontinuous piecewise Hölder function, a result new to both the statistics and econometrics communities. The proof rests on a general method for determining the estimation accuracy of a "contaminated" M-estimator, new U-statistics tools, and function perturbation theory.
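To illustrate the pairwise difference idea described above, here is a minimal NumPy sketch. It uses the simplest instance of differencing: sort observations by W and difference adjacent pairs, so that g(W_i) - g(W_j) is nearly cancelled when W_i ≈ W_j, leaving an approximate sparse linear model in the differenced data, which is then fit with an l1 penalty. The function name, the adjacent-pair scheme, the coordinate-descent lasso solver, and the penalty level are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def pairwise_difference_lasso(Y, X, W, lam, n_iter=200):
    """Sketch of a pairwise-difference lasso estimator for the PLM
    Y = X beta + g(W) + u (adjacent-pair differencing; illustrative)."""
    # Sort by W and difference consecutive observations:
    # Y_(i+1) - Y_(i) ≈ (X_(i+1) - X_(i))^T beta + noise,
    # since g(W_(i+1)) - g(W_(i)) is small for close W values.
    order = np.argsort(W)
    dY = np.diff(Y[order])
    dX = np.diff(X[order], axis=0)

    # Minimal cyclic coordinate-descent lasso on the differenced data:
    # minimize 0.5 * ||dY - dX b||^2 + lam * ||b||_1.
    p = dX.shape[1]
    beta = np.zeros(p)
    col_sq = (dX ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed, then soft-threshold.
            r_j = dY - dX @ beta + dX[:, j] * beta[j]
            rho = dX[:, j] @ r_j
            if col_sq[j] > 0:
                beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta
```

On simulated data with a smooth nuisance g and a 2-sparse \beta^*, this recovers the nonzero coefficients well even though g is never estimated, which is the appeal of the differencing approach.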


Authors who are presenting talks have a * after their name.


Copyright © American Statistical Association