Keywords: non-parametric regression, local adaptivity, manifold adaptivity, total variation, fused lasso
We extend the fused lasso over graphs to general nonparametric regression. The resulting approach, which we call the K-nearest neighbors (K-NN) fused lasso, involves (i) computing the K-NN graph of the design points, and (ii) performing the fused lasso over this K-NN graph. We show that this procedure has several theoretical advantages over competing approaches: specifically, it inherits local adaptivity from its connection to the fused lasso, and it inherits manifold adaptivity from its connection to the K-NN approach. Finally, I will discuss some recent developments of the fused lasso for graphon estimation.
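The two-step procedure above can be sketched in code. This is a minimal illustration, not the authors' implementation: it builds a symmetrized K-NN graph with NumPy and then solves the graph fused lasso problem, min over theta of (1/2)||y - theta||^2 + lambda * sum over edges (i,j) of |theta_i - theta_j|, with a basic ADMM loop. All function names, parameter choices, and solver details here are illustrative assumptions.

```python
import numpy as np

def knn_graph_edges(X, k):
    """Edge list (i, j), i < j, of the symmetrized K-NN graph of the rows of X.

    Illustrative brute-force construction; a real implementation would use a
    KD-tree or ball tree for large n.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances between design points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)  # exclude self-neighbors
    nbrs = np.argsort(d2, axis=1)[:, :k]
    edges = set()
    for i in range(n):
        for j in nbrs[i]:
            edges.add((min(i, int(j)), max(i, int(j))))
    return sorted(edges)

def fused_lasso_graph(y, edges, lam, rho=1.0, n_iter=500):
    """ADMM sketch for min_theta 0.5*||y - theta||^2 + lam * ||D theta||_1,

    where D is the edge incidence matrix of the graph. Dense linear algebra
    is used for clarity; specialized solvers exist for this problem.
    """
    n = len(y)
    m = len(edges)
    D = np.zeros((m, n))
    for e, (i, j) in enumerate(edges):
        D[e, i], D[e, j] = 1.0, -1.0
    A_inv = np.linalg.inv(np.eye(n) + rho * D.T @ D)
    theta = y.copy()
    z = D @ theta
    u = np.zeros(m)
    for _ in range(n_iter):
        # theta-update: ridge-type linear solve.
        theta = A_inv @ (y + rho * D.T @ (z - u))
        Dt = D @ theta
        # z-update: soft-thresholding of edge differences.
        z = np.sign(Dt + u) * np.maximum(np.abs(Dt + u) - lam / rho, 0.0)
        # Dual update.
        u = u + Dt - z

    return theta
```

For example, on one-dimensional design points with a piecewise-constant signal, the fit shrinks the differences across K-NN edges, recovering nearly constant pieces; the same code runs unchanged for design points in any dimension, since only the K-NN graph depends on the geometry.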