Abstract:
|
We propose a two-step algorithm based on L1/L0 regularization for the detection and estimation of parameters of a high-dimensional change point regression model, and provide the corresponding rates of convergence for the change point as well as the regression parameter estimates. Importantly, the computational cost of our estimator is only 2·Lasso(n,p), where Lasso(n,p) represents the computational burden of a single Lasso optimization in a model of size (n,p). In comparison, existing grid-search-based approaches to this problem require at least n·Lasso(n,p) computations. Additionally, the proposed method is shown to consistently detect the case of 'no change', i.e., where no finite change point exists in the model. We allow the true change point parameter to possibly move to the boundaries of its parametric space, and the jump size to possibly diverge as n increases, and we characterize the corresponding effects on the rates of convergence of the change point and regression estimates. Simulations are performed to empirically evaluate the performance of the proposed estimators.
|