Keywords: Convex regression, nonparametric, ADMM, optimization, multivariate shape restrictions
We consider the nonparametric problem of estimating a multivariate convex regression function. Our focus is on computation, and we present a method based on the alternating direction method of multipliers (ADMM). We propose a formulation that regularizes the 2-norms of the fitted subgradients to control generalization error. The resulting objective is minimized via a three-operator splitting algorithm, and the proposed method exploits available strong convexity, leading to efficient updates. Our algorithm enjoys primal and dual convergence guarantees, in contrast with many ADMM methods applied to objectives that decompose into more than two blocks. The merits of this approach are showcased empirically on real and simulated data, demonstrating scalability to high dimensions and tens of thousands of observations.
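To make the estimation problem concrete, the following is a minimal sketch of the subgradient-regularized convex least-squares formulation described above: fit values θ_i and subgradients ξ_i subject to the pairwise convexity constraints θ_j ≥ θ_i + ⟨ξ_i, x_j − x_i⟩, with a squared 2-norm penalty on the ξ_i. This sketch uses a small instance and a generic SLSQP solver rather than the paper's ADMM/three-operator splitting scheme; the data, the regularization weight `rho`, and all variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d = 12, 2
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sum(X**2, axis=1)  # noise-free convex ground truth f(x) = ||x||^2

rho = 1e-4  # weight on the subgradient-norm penalty (hypothetical value)

def unpack(z):
    # z stacks the fitted values theta (n,) and subgradients xi (n, d)
    return z[:n], z[n:].reshape(n, d)

def obj(z):
    theta, xi = unpack(z)
    # least-squares fit plus squared 2-norm regularization of the subgradients
    return np.sum((y - theta) ** 2) + rho * np.sum(xi ** 2)

# Pairwise convexity constraints: theta_j - theta_i - <xi_i, x_j - x_i> >= 0
cons = [
    {"type": "ineq",
     "fun": lambda z, i=i, j=j: (
         unpack(z)[0][j] - unpack(z)[0][i]
         - unpack(z)[1][i] @ (X[j] - X[i]))}
    for i in range(n) for j in range(n) if i != j
]

# Feasible warm start: exact values and true gradients 2*x_i of ||x||^2
z0 = np.concatenate([y, (2.0 * X).ravel()])
res = minimize(obj, z0, method="SLSQP", constraints=cons,
               options={"maxiter": 200})
theta_hat, xi_hat = unpack(res.x)

# Largest violation of the convexity constraints at the solution
viol = max(theta_hat[i] + xi_hat[i] @ (X[j] - X[i]) - theta_hat[j]
           for i in range(n) for j in range(n) if i != j)
print(float(np.max(np.abs(theta_hat - y))), float(viol))
```

On this noise-free convex example the fitted values stay close to y, since the warm start is feasible and the penalty weight is small. A generic nonlinear solver like this scales poorly in n (the constraint set has O(n²) rows), which is precisely the bottleneck the ADMM-based splitting method in the paper is designed to overcome.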