We consider distributed estimation and inference for a general statistical problem with a convex loss that can be non-differentiable. We develop a new multi-round distributed estimation procedure that approximates the Newton step using only mini-batch stochastic subgradients. The key component of our method is a computationally efficient estimator of the product of the inverse population Hessian matrix and a given vector. Instead of estimating the Hessian matrix, which usually requires second-order differentiability of the loss, our estimator, called the First-Order Newton-type Estimator (FONE), directly estimates the vector of interest as a whole and is therefore applicable to non-differentiable losses. Moreover, our method kills two birds with one stone: in the limiting distribution result, the key term in the limiting covariance also takes the form of the inverse population Hessian matrix multiplied by a given vector, and thus can likewise be estimated by FONE. The proposed FONE has many other potential applications to statistical estimation problems, such as linear discriminant analysis (LDA).
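To make the core idea concrete, the following is a minimal, hypothetical sketch (not the paper's exact algorithm) of estimating a product of the form $H^{-1}g$ using only first-order mini-batch information: the Hessian-vector product $Hd$ is formed as a finite difference of mini-batch gradients, so no second derivatives are ever computed. The least-squares loss, the step size `eta`, the batch size, and the helper names `minibatch_grad` and `fone_like_direction` are all illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration: approximate d = H^{-1} g with only
# first-order (mini-batch gradient) information, where H is the
# Hessian of the loss at theta. Smooth least-squares loss is used
# purely for concreteness.

rng = np.random.default_rng(0)
n, p = 5000, 5
X = rng.normal(size=(n, p))
beta_true = np.ones(p)
y = X @ beta_true + rng.normal(size=n)

def minibatch_grad(theta, idx):
    # Mini-batch gradient of the least-squares loss.
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ theta - yb) / len(idx)

def fone_like_direction(theta, g, n_iter=200, batch=256, eta=0.5, delta=1e-4):
    """Iteratively solve H d = g using first-order information only."""
    d = np.zeros_like(g)
    for _ in range(n_iter):
        idx = rng.integers(0, n, size=batch)
        # Finite-difference approximation of the Hessian-vector product H d,
        # built from two mini-batch gradient evaluations.
        Hd = (minibatch_grad(theta + delta * d, idx)
              - minibatch_grad(theta, idx)) / delta
        d -= eta * (Hd - g)  # gradient step on 0.5 d'Hd - g'd
    return d

theta = np.zeros(p)
g = minibatch_grad(theta, np.arange(n))  # full gradient at theta
d = fone_like_direction(theta, g)
theta_new = theta - d                    # approximate Newton step
print(np.round(theta_new, 2))            # close to beta_true
```

In this sketch the direction `d` converges (up to mini-batch noise) to the solution of $Hd = g$, so `theta - d` performs one approximate Newton step without ever forming or inverting a Hessian, which is the spirit of a first-order Newton-type update.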