Abstract:
|
It is well known that the ordinary least squares (OLS) estimator provides efficient estimates for linear regression with normal errors, yet it is highly sensitive to outliers and heavy-tailed errors. To achieve robustness while preserving asymptotic efficiency, we propose a new class of likelihood functions, called the tangent likelihood, which yields robust estimates termed maximum tangent likelihood estimators (MtLE). We show that the MtLE is root-n consistent and asymptotically normally distributed, and we prove that it can attain the highest asymptotic breakdown point of 1/2. Furthermore, we consider robust variable selection based on the proposed tangent likelihood and a Lasso-type penalty, called MtLE-Lasso. The proposed MtLE-Lasso performs robust estimation and variable selection simultaneously and consistently in the linear regression framework, and we show that, under mild regularity conditions, it enjoys the oracle property. We demonstrate the performance of MtLE through several simulation studies as well as real data examples. Finally, we extend our work from a fixed-dimensional predictor space to a diverging number of dimensions (p → ∞).
|
Copyright © American Statistical Association.