
Abstract Details

Activity Number: 228 - IMS Lawrence D. Brown PhD Student Award Session
Type: Invited
Date/Time: Wednesday, August 11, 2021, 10:00 AM to 11:50 AM (EDT)
Sponsor: IMS
Abstract #316839
Title: First-Order Newton-Type Estimator for Distributed Estimation and Inference
Author(s): Yichen Zhang* and Xi Chen and Weidong Liu
Companies: Purdue University and NYU and Shanghai Jiao Tong University
Keywords: distributed inference; stochastic gradient descent; divide-and-conquer; stochastic variance reduced gradient
Abstract:

This paper studies distributed estimation and inference for a general statistical problem with a convex loss that may be non-differentiable. For computational efficiency, we restrict ourselves to stochastic first-order optimization, which enjoys low per-iteration complexity. To motivate the proposed method, we first investigate the theoretical properties of a straightforward Divide-and-Conquer Stochastic Gradient Descent (DC-SGD) approach. Our theory shows that DC-SGD imposes a restriction on the number of machines, and this restriction becomes more stringent when the dimension p is large. To overcome this limitation, we propose a new multi-round distributed estimation procedure that approximates the Newton step using only stochastic subgradients. Instead of estimating the population Hessian matrix, which usually requires second-order differentiability of the loss, the proposed First-Order Newton-type Estimator (FONE) applies to non-differentiable losses. Our estimator also facilitates inference for the empirical risk minimizer.
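To make the two ideas concrete, below is a minimal Python/NumPy sketch on median regression (an absolute-deviation loss, hence non-differentiable): a DC-SGD baseline that averages per-machine SGD iterates, followed by a Newton-type refinement that approximates the Hessian-vector product by differencing stochastic subgradients, so no second derivatives are ever formed. All function names, step sizes, and the finite-difference recursion here are illustrative assumptions, not the paper's exact FONE algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def subgrad(theta, X, y):
    # Subgradient of the absolute-deviation loss |y - x'theta|,
    # averaged over the rows of (X, y).
    r = y - X @ theta
    return -(np.sign(r)[:, None] * X).mean(axis=0)

def sgd(X, y, theta0, eta0=0.5, steps=2000, batch=32):
    # Plain minibatch stochastic subgradient descent on one shard.
    theta = theta0.copy()
    n = len(y)
    for t in range(1, steps + 1):
        b = rng.integers(0, n, batch)
        theta -= (eta0 / np.sqrt(t)) * subgrad(theta, X[b], y[b])
    return theta

def dc_sgd(shards, theta0):
    # Divide-and-conquer SGD: run SGD independently on each machine's
    # shard, then average the resulting estimators.
    return np.mean([sgd(Xk, yk, theta0) for Xk, yk in shards], axis=0)

def newton_direction(theta, X, y, g_bar, delta=0.1, eta=0.05,
                     steps=500, batch=256):
    # Hypothetical first-order Newton-step approximation (not the paper's
    # exact recursion): since the *expected* subgradient is smooth even when
    # the loss is not, (E[subgrad(theta + delta*d)] - E[subgrad(theta)]) / delta
    # approximates H(theta) d. We run SGD on the quadratic
    # psi(d) = d'Hd/2 - g_bar'd, whose minimizer is H^{-1} g_bar;
    # delta trades finite-difference bias against variance.
    d = g_bar.copy()
    n = len(y)
    for _ in range(steps):
        b = rng.integers(0, n, batch)
        hv = (subgrad(theta + delta * d, X[b], y[b])
              - subgrad(theta, X[b], y[b])) / delta
        d -= eta * (hv - g_bar)
    return d

# Synthetic median regression: y = x'theta* + Laplace noise, split over m machines.
p, n, m = 5, 20000, 10
theta_star = rng.standard_normal(p)
X = rng.standard_normal((n, p))
y = X @ theta_star + rng.laplace(size=n)
shards = [(X[k::m], y[k::m]) for k in range(m)]

theta_dc = dc_sgd(shards, np.zeros(p))           # averaged one-shot estimator
g_bar = subgrad(theta_dc, X, y)                  # full-data subgradient at theta_dc
theta_refined = theta_dc - newton_direction(theta_dc, X, y, g_bar)

print("||DC-SGD - truth|| :", np.linalg.norm(theta_dc - theta_star))
print("||refined - truth||:", np.linalg.norm(theta_refined - theta_star))
```

In a real distributed setting, only the averaged iterates and subgradient estimates would cross the network; the point of the refinement step is that it needs nothing beyond the same first-order oracle DC-SGD already uses.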


Authors who are presenting talks have a * after their name.
