Abstract Details
Activity Number: 322
Type: Topic Contributed
Date/Time: Tuesday, July 31, 2012, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Computing
Abstract #304692
Title: Global and Local Functional Sparsity in Nonparametric Regression
Author(s): Haonan Wang*+ and Yan Tu and Bo Kai
Companies: Colorado State University; Colorado State University; College of Charleston
Address: Department of Statistics, Fort Collins, CO 80523-0001, United States
Keywords: model selection; smoothing; sparsity
Abstract:
In this talk, we consider the problem of estimation and selection in nonparametric regression. The notion of functional sparsity is introduced as a generalization of parameter sparsity in classical parametric regression models. In particular, two types of sparsity are of interest: global sparsity and local sparsity. The goal is to produce a sparse estimate that assigns zero values over regions where the true underlying function is zero. Most classical smoothing techniques yield consistent estimates with no sparsity. Here, a penalized approach is proposed for simultaneous functional estimation and model selection. Asymptotic properties of the procedure, including both consistency in estimation and sparsity in model selection, are established. The approach is also applied to the varying-coefficient model and the functional dynamic model. The methodology is illustrated through simulation studies and real data analysis.
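The local-sparsity idea described in the abstract — a penalized fit that is exactly zero over regions where the true function is zero — can be illustrated with a toy sketch. This is not the authors' actual procedure: the piecewise-constant basis, the L1 soft-threshold, and the tuning value below are illustrative assumptions standing in for the penalized smoother.

```python
import numpy as np

rng = np.random.default_rng(0)

# True function: identically zero on [0, 0.5], a positive bump on (0.5, 1].
# This is a locally sparse function.
def f(x):
    return np.where(x > 0.5, 2.0 * np.sin(2 * np.pi * (x - 0.5)), 0.0)

n = 500
x = rng.uniform(0, 1, n)
y = f(x) + rng.normal(scale=0.3, size=n)

# Piecewise-constant basis: one indicator function per bin.
n_bins = 20
edges = np.linspace(0, 1, n_bins + 1)
bins = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)

# Unpenalized fit: per-bin means (a crude smoother, consistent but not sparse).
means = np.array([y[bins == j].mean() if np.any(bins == j) else 0.0
                  for j in range(n_bins)])

# L1-penalized fit: soft-threshold each coefficient. Coefficients over the
# region where f is truly zero are shrunk exactly to zero.
lam = 0.4  # illustrative tuning parameter
sparse = np.sign(means) * np.maximum(np.abs(means) - lam, 0.0)

zero_region = edges[:-1] + 0.5 / n_bins < 0.5  # bins with centers in [0, 0.5)
print("nonzero coefficients, plain fit: ", np.count_nonzero(means))
print("nonzero coefficients, sparse fit:", np.count_nonzero(sparse))
print("sparse fit zero on [0, 0.5)?     ", bool(np.all(sparse[zero_region] == 0)))
```

The unpenalized bin means are almost surely all nonzero, while the thresholded fit vanishes identically over the region where the true function is zero — the distinction between ordinary consistent smoothing and a sparse estimate.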
Authors who are presenting talks have a * after their name; the address information is for the authors who have a + after their name.