
All Times EDT

Abstract Details

Activity Number: 386 - Nonparametric Modeling II
Type: Contributed
Date/Time: Thursday, August 12, 2021, 12:00 PM to 1:50 PM
Sponsor: Section on Nonparametric Statistics
Abstract #317839
Title: Adaptive Estimation in High-Dimensional Additive Models with Multi-Resolution Group Lasso
Author(s): Yisha Yao* and Cun-Hui Zhang
Companies: Department of Statistics, Rutgers-New Brunswick and Rutgers University
Keywords: Additive model; Adaptive estimation; Sobolev space; Model selection; RKHS; High-dimensionality
Abstract:

In additive models with many nonparametric components, a number of regularized estimators have been proposed and shown to attain various error bounds under different combinations of sparsity and fixed-smoothness conditions. Some of these error bounds match the minimax rates in the corresponding settings, but some of the rate-minimax methods are non-convex and computationally costly. From this perspective, the existing solutions to the high-dimensional additive nonparametric regression problem are fragmented. In this paper, we propose a multi-resolution group Lasso (MR-GL) method as a unified approach that simultaneously achieves or improves upon existing error bounds, and provides new ones, without knowledge of the level of sparsity or the degree of smoothness of the unknown functions. Such adaptive convergence rates are established when a prediction factor can be treated as a constant. Furthermore, we prove that this prediction factor, which can be bounded in terms of a restricted eigenvalue or a compatibility coefficient, can indeed be treated as a constant for random designs under a nearly optimal sample-size condition.
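The abstract does not specify the MR-GL estimator in detail. As a rough illustration of the general idea only — a group Lasso applied to each covariate's basis expansion, with coefficient groups formed by dyadic resolution levels — the following Python sketch fits such an estimator by proximal gradient descent with block soft-thresholding. The cosine basis, the dyadic grouping, the sqrt-group-size weights, and all function names here are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def cosine_basis(x, max_freq):
    # Columns cos(pi*k*x), k = 1..max_freq, for x in [0, 1] (assumed basis choice).
    return np.column_stack([np.cos(np.pi * k * x) for k in range(1, max_freq + 1)])

def group_soft_threshold(b, t):
    # Block soft-thresholding: shrink the group toward zero by t in Euclidean norm.
    nrm = np.linalg.norm(b)
    return np.zeros_like(b) if nrm <= t else (1.0 - t / nrm) * b

def mr_group_lasso(X, y, levels=3, lam=0.05, n_iter=500):
    """Proximal-gradient (ISTA) group Lasso on a multi-resolution cosine basis.

    Each covariate j contributes one group per resolution level l,
    holding frequencies 2**(l-1) .. 2**l - 1 (dyadic blocks)."""
    n, p = X.shape
    max_freq = 2 ** levels - 1
    Z = np.hstack([cosine_basis(X[:, j], max_freq) for j in range(p)])
    groups = []                                 # (covariate, level) column slices
    for j in range(p):
        off = j * max_freq
        for l in range(1, levels + 1):
            groups.append(slice(off + 2 ** (l - 1) - 1, off + 2 ** l - 1))
    beta = np.zeros(Z.shape[1])
    step = n / np.linalg.norm(Z, 2) ** 2        # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        beta = beta - step * (Z.T @ (Z @ beta - y) / n)
        for g in groups:
            w = np.sqrt(g.stop - g.start)       # weight grows with group size
            beta[g] = group_soft_threshold(beta[g], step * lam * w)
    return Z, beta, groups
```

The penalty level `lam` would in practice be tuned (e.g. by cross-validation); the point of the grouping is that an entire resolution level of a covariate is kept or discarded as a unit, so smoother components use fewer active levels.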


Authors who are presenting talks have a * after their name.

Back to the full JSM 2021 program