Abstract:

We consider distributed estimation and inference for a general statistical problem with a convex loss that may be nondifferentiable. We develop a new multi-round distributed estimation procedure that approximates the Newton step using only mini-batch stochastic subgradients. The key component of our method is a computationally efficient estimator of the product of the inverse population Hessian matrix and a given vector. Instead of estimating the Hessian matrix, which usually requires second-order differentiability of the loss, our estimator, called the First-Order Newton-type Estimator (FONE), directly estimates the vector of interest as a whole and is therefore applicable to nondifferentiable losses. Moreover, our method kills two birds with one stone: the key term in the limiting covariance of our limiting distribution result takes a similar form, the product of the inverse population Hessian matrix and a given vector, and can therefore also be estimated by FONE. The proposed FONE has many other potential applications to statistical estimation problems, such as linear discriminant analysis (LDA).
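To make the first-order idea concrete, the following is a minimal sketch of how an inverse-Hessian-vector product H^{-1}v can be approximated from mini-batch stochastic subgradients alone: a finite difference of subgradients approximates the Hessian-vector product Hz, and a fixed-point iteration drives Hz toward v. This is an illustrative instantiation of the general principle, not necessarily the authors' exact FONE recursion; the step size, batch size, and finite-difference scale here are hypothetical choices.

```python
import numpy as np

def fone_sketch(subgrad, theta, v, X, y, eta=0.1, delta=1e-4, T=500,
                batch=32, seed=0):
    """Hedged sketch: estimate H^{-1} v, where H is the Hessian of a convex
    loss at theta, using only mini-batch (sub)gradients.

    The key trick: (g(theta + delta*z) - g(theta)) / delta approximates H z
    without ever forming H, so second-order differentiability is not needed.
    """
    rng = np.random.default_rng(seed)
    z = np.zeros_like(v, dtype=float)
    n = len(y)
    for _ in range(T):
        idx = rng.choice(n, size=min(batch, n), replace=False)
        Xb, yb = X[idx], y[idx]
        g0 = subgrad(theta, Xb, yb)
        g1 = subgrad(theta + delta * z, Xb, yb)
        hz = (g1 - g0) / delta        # stochastic estimate of H z
        z -= eta * (hz - v)           # fixed point solves H z = v
    return z

def quantile_subgrad(theta, X, y, tau=0.5):
    # Mini-batch subgradient of the quantile (check) loss, an example of a
    # nondifferentiable convex loss covered by the first-order approach.
    r = y - X @ theta
    return X.T @ np.where(r > 0, -tau, 1.0 - tau) / len(y)
```

For a smooth loss the subgradient difference reduces to an exact Hessian-vector product, and the iteration is a plain Richardson iteration for the linear system Hz = v; the nondifferentiable case replaces the gradient with any measurable subgradient selection, as in the quantile example above.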
