
Abstract Details

Activity Number: 220 - Uncertainty Quantification for Stochastic Optimization Methods in Machine Learning
Type: Invited
Date/Time: Monday, July 29, 2019 : 2:00 PM to 3:50 PM
Sponsor: IMS
Abstract #300016
Title: First-Order Newton-Type Estimator for Distributed Estimation and Inference
Author(s): Xi Chen* and Weidong Liu and Yichen Zhang
Companies: New York University and Shanghai Jiao Tong University and New York University
Keywords: distributed inference; stochastic gradient descent; Newton step; limiting distribution; non-differentiable

We consider distributed estimation and inference for a general statistical problem with a convex loss that can be non-differentiable. We develop a new multi-round distributed estimation procedure that approximates the Newton step using only mini-batch stochastic subgradients. The key component of our method is a computationally efficient estimator of the product of the inverse population Hessian matrix and a given vector. Instead of estimating the Hessian matrix, which usually requires second-order differentiability of the loss, our estimator, called the First-Order Newton-type Estimator (FONE), directly estimates the vector of interest as a whole and is therefore applicable to non-differentiable losses. Moreover, our method serves a dual purpose: the key term in the limiting covariance also takes the form of the inverse population Hessian matrix multiplied by a given vector, so it too can be estimated by FONE. The proposed FONE has many other potential applications to statistical estimation problems such as linear discriminant analysis (LDA).
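To make the core idea concrete, the sketch below illustrates a first-order estimator of an inverse-Hessian-vector product on a smooth toy problem (logistic regression). It is not the authors' exact FONE algorithm: the loss, the function name `fone_solve`, and all step-size and batch-size choices are our own illustrative assumptions. The only point it demonstrates is that H^{-1}v can be approximated with mini-batch gradients alone, by taking finite differences of gradients in place of Hessian-vector products and iterating a stochastic fixed-point update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic logistic-regression data (an illustrative stand-in for the
# paper's general convex loss).
n, d = 2000, 5
X = rng.standard_normal((n, d))
y = (rng.random(n) < 0.5).astype(float)
theta0 = np.zeros(d)  # evaluate at theta = 0 for simplicity


def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))


def minibatch_grad(theta, idx):
    """Mini-batch gradient of the logistic loss."""
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (sigmoid(Xb @ theta) - yb) / len(idx)


def fone_solve(v, theta, T=400, batch=500, eta=1.0, delta=1e-4):
    """Estimate H(theta)^{-1} v using only first-order information.

    Each step approximates the Hessian-vector product H z by a finite
    difference of two mini-batch gradients, then takes a gradient step
    on the quadratic 0.5 * z'Hz - v'z, whose minimizer is H^{-1} v.
    """
    z = np.zeros_like(v)
    avg, count = np.zeros_like(v), 0
    for t in range(T):
        idx = rng.choice(n, size=batch, replace=False)
        hz = (minibatch_grad(theta + delta * z, idx)
              - minibatch_grad(theta, idx)) / delta
        z = z - eta * (hz - v)
        if t >= T // 2:  # average the tail iterates to damp the noise
            avg += z
            count += 1
    return avg / count


v = np.ones(d) / np.sqrt(d)
z_hat = fone_solve(v, theta0)

# Sanity check against the exact Hessian, which is available here only
# because the toy loss is smooth; the iteration above never forms it.
H = 0.25 * X.T @ X / n  # logistic Hessian at theta = 0
z_star = np.linalg.solve(H, v)
rel_err = np.linalg.norm(z_hat - z_star) / np.linalg.norm(z_star)
print(rel_err < 0.2)
```

For a non-differentiable loss one would replace `minibatch_grad` with a subgradient oracle, which is the regime the abstract targets; the finite-difference step is what removes any need for second derivatives.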

Authors who are presenting talks have a * after their name.