
Abstract Details

Activity Number: 82
Type: Contributed
Date/Time: Sunday, July 31, 2016: 4:00 PM to 5:50 PM
Sponsor: IMS
Abstract #318571
Title: Interquantile Shrinkage in Additive Models
Author(s): Zengyan Fan* and Heng Lian
Companies: Nanyang Technological University and University of New South Wales
Keywords: Additive models; Fused adaptive group LASSO; Interquantile shrinkage; Quantile regression
Abstract:

In this paper, we investigate the commonality of nonparametric component functions across different quantile levels in additive regression models. We propose two fused adaptive group LASSO penalties that shrink the differences between functions at neighboring quantile levels. The proposed methodology simultaneously estimates the nonparametric functions and identifies the quantile regions over which the functions are unvarying, and it is therefore expected to outperform standard additive quantile regression when such a region of quantile levels exists. Under some regularity conditions, the proposed penalized estimators achieve the optimal rate of convergence and consistently identify the true varying/unvarying regions. Simulation studies and a real data application show that the proposed methods yield good numerical results.
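To illustrate the penalty structure described above, a minimal sketch of a fused adaptive group LASSO criterion of the kind the abstract refers to is given below. The notation is an assumption, not taken from the paper: quantile levels tau_1 < ... < tau_K, component functions f_{j,tau_k} for covariate j at level tau_k, adaptive weights w_{j,k}, and a group norm on the spline coefficients of each component.

\[
\min_{\{f_{j,\tau_k}\}} \;
\sum_{k=1}^{K} \sum_{i=1}^{n}
\rho_{\tau_k}\!\Big( y_i - \sum_{j=1}^{p} f_{j,\tau_k}(x_{ij}) \Big)
\;+\;
\lambda \sum_{k=2}^{K} \sum_{j=1}^{p}
w_{j,k}\, \big\| f_{j,\tau_k} - f_{j,\tau_{k-1}} \big\|,
\]

where \(\rho_\tau(u) = u\,(\tau - \mathbf{1}\{u < 0\})\) is the quantile check loss and \(\|\cdot\|\) denotes a group norm applied to the difference of the two components' basis coefficients. When a group difference is shrunk exactly to zero, the corresponding component is estimated as identical at the two neighboring quantile levels, which is how a region of unvarying quantile levels is identified.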


Authors who are presenting talks have a * after their name.
