Abstract:
|
We propose a computational framework that simultaneously controls statistical error and algorithmic complexity when fitting sparse regression models with heavy-tailed and/or asymmetric errors. Statistically, we show that if the loss function is carefully calibrated to the noise level and the intrinsic structure of the model, the adverse effects of outliers induced by the noise can be removed, or at least dampened. Computationally, we propose a two-stage procedure, consisting of a contraction stage and a tightening stage, with controlled algorithmic complexity. The first stage solves a convex program to obtain a coarse initial estimator, which the second stage refines by iteratively solving a sequence of convex programs. Theoretically, we show that the resulting estimator achieves the optimal rate of convergence and enjoys oracle properties.
|
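
A minimal sketch of one concrete instance of such a contraction-tightening scheme, under illustrative assumptions not specified in the abstract: the contraction stage is taken to be an l1-penalized Huber regression (a convex program robust to heavy-tailed errors), and the tightening stage re-solves a short sequence of weighted-l1 convex programs whose weights come from the local linear approximation of a SCAD penalty. The loss, penalty, tuning choices `lam` and `tau`, and all function names are assumptions for illustration, not details given above.

```python
# Hypothetical sketch (assumed details, not the paper's exact algorithm):
# stage 1 (contraction) solves an l1-penalized Huber regression; stage 2
# (tightening) iteratively re-solves weighted-l1 convex programs with
# weights from the SCAD penalty derivative at the previous iterate.
import numpy as np

def huber_grad(r, tau):
    """Gradient of the Huber loss; caps the influence of large residuals."""
    return np.where(np.abs(r) <= tau, r, tau * np.sign(r))

def scad_deriv(beta, lam, a=3.7):
    """SCAD penalty derivative: small coefficients keep full weight lam,
    large ones are penalized less, reducing estimation bias."""
    b = np.abs(beta)
    return np.where(b <= lam, lam, np.maximum(a * lam - b, 0.0) / (a - 1.0))

def solve_weighted_l1(X, y, weights, tau, beta0, step, n_iter=500):
    """Proximal gradient descent for weighted-l1-penalized Huber regression;
    each such subproblem is convex."""
    n = X.shape[0]
    beta = beta0.copy()
    for _ in range(n_iter):
        grad = X.T @ huber_grad(X @ beta - y, tau) / n
        z = beta - step * grad
        # Soft-thresholding: proximal operator of the weighted l1 norm.
        beta = np.sign(z) * np.maximum(np.abs(z) - step * weights, 0.0)
    return beta

def contraction_tightening(X, y, lam, tau, n_tighten=3):
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2  # 1/L for the smooth Huber part
    # Stage 1 (contraction): plain l1 penalty gives a coarse convex estimate.
    beta = solve_weighted_l1(X, y, np.full(p, lam), tau, np.zeros(p), step)
    # Stage 2 (tightening): a short sequence of convex programs, each using
    # weights from the SCAD derivative evaluated at the previous iterate.
    for _ in range(n_tighten):
        beta = solve_weighted_l1(X, y, scad_deriv(beta, lam), tau, beta, step)
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, s = 200, 500, 5
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[:s] = 2.0
    y = X @ beta_true + rng.standard_t(df=2, size=n)  # heavy-tailed noise
    lam = 2.0 * np.sqrt(np.log(p) / n)   # assumed penalty level
    tau = np.sqrt(n / np.log(p))         # assumed robustification parameter
    beta_hat = contraction_tightening(X, y, lam, tau)
    print("support recovered:", np.flatnonzero(np.abs(beta_hat) > 0.5))
```

The design point this sketch illustrates: every subproblem in both stages is convex, so each solve has controlled complexity, while the tightening loop progressively removes the bias of the initial l1 estimate.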