
Abstract Details

Activity Number: 350 - New Methods for Time Series and Longitudinal Data
Type: Contributed
Date/Time: Tuesday, July 30, 2019 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #302941 Presentation
Title: An Efficient Two Step Algorithm for High-Dimensional Change Point Regression Models Without Grid Search
Author(s): Abhishek Kaul* and Venkata K Jandhyala and Stergios B Fotopoulos
Companies: Washington State University
Keywords: Change point regression; High dimensions; L1/L0 regularization; Rate of convergence; Two phase regression

We propose a two step algorithm based on L1/L0 regularization for the detection and estimation of parameters of a high dimensional change point regression model, and we provide the corresponding rates of convergence for the change point as well as the regression parameter estimates. Importantly, the computational cost of our estimator is only 2·Lasso(n,p), where Lasso(n,p) represents the computational burden of one Lasso optimization in a model of size (n,p). In comparison, existing grid search based approaches to this problem require a computational cost of at least n·Lasso(n,p). Additionally, the proposed method is shown to consistently detect the case of 'no change', i.e., where no finite change point exists in the model. We allow the true change point parameter to possibly move to the boundaries of its parametric space, and the jump size to possibly diverge as n increases, and we characterize the corresponding effects on the rates of convergence of the change point and regression estimates. Simulations are performed to evaluate the empirical performance of the proposed estimators.
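The abstract does not spell out the two steps, so the following is only a plausible minimal sketch consistent with the stated 2·Lasso(n,p) cost, not the authors' exact procedure: fit one Lasso per segment at an arbitrary initial split (the only two Lasso optimizations), then update the change point by minimizing the residual sum of squares over all candidate splits with those coefficients held fixed, which costs O(n) via cumulative sums. All names below are illustrative, and the paper's 'no change' detection and any refinement or iteration are omitted.

```python
import numpy as np
from sklearn.linear_model import Lasso


def two_step_change_point(X, y, alpha=0.1, tau0=None):
    """Illustrative two-step change point Lasso (not the paper's exact algorithm).

    Step 1: at an initial guess tau0, fit one Lasso per segment
    (the only two Lasso optimizations performed).
    Step 2: holding the fitted coefficients fixed, pick the split that
    minimizes the residual sum of squares -- an O(n) scan, no refits.
    """
    n = len(y)
    if tau0 is None:
        tau0 = n // 2  # arbitrary initial split
    # Step 1: two Lasso fits, one per provisional segment.
    m1 = Lasso(alpha=alpha).fit(X[:tau0], y[:tau0])
    m2 = Lasso(alpha=alpha).fit(X[tau0:], y[tau0:])
    # Per-observation squared residuals under each segment's coefficients.
    r1 = (y - m1.predict(X)) ** 2
    r2 = (y - m2.predict(X)) ** 2
    # Step 2: cumulative sums give every split's RSS in one pass.
    c1 = np.cumsum(r1)                 # c1[j] = sum of r1[0..j]
    c2 = np.cumsum(r2[::-1])[::-1]     # c2[j] = sum of r2[j..n-1]
    taus = np.arange(1, n - 1)
    rss = c1[taus - 1] + c2[taus]      # RSS if the change occurs at tau
    tau_hat = int(taus[np.argmin(rss)])
    return tau_hat, m1.coef_, m2.coef_
```

The key point the sketch illustrates is the cost contrast: a grid search would re-run the two Lasso fits at every candidate split (n·Lasso(n,p)), whereas here the Lasso is solved twice and the split update reuses those fits.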

Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program