Abstract:
|
Retrospective estimation of one-dimensional change points is a well-studied problem in statistics, but multidimensional change points remain relatively less well understood. One particular scenario is the change-plane problem, in which a hyperplane in multidimensional Euclidean space separates two statistical regimes. In this presentation, we showcase new asymptotic results for retrospective change-plane estimation in growing dimensions (including the high-dimensional situation where the dimension exceeds the sample size). A striking feature is that the traditional L_2 loss is no longer rate optimal when the dimension increases with the sample size, especially in the presence of heavy-tailed errors, whereas optimizing an L_1 loss delivers near minimax optimal rates. We also demonstrate that, even in one dimension, L_1 loss based optimization is beneficial under heavy-tailed errors in terms of the asymptotic variance of the estimator.
|