
Abstract Details

Activity Number: 147 - Bayesian Hierarchical Modeling: Obtaining Optimal Performance Through Theoretical Understanding
Type: Invited
Date/Time: Monday, July 31, 2017 : 10:30 AM to 12:20 PM
Sponsor: IMS
Abstract #322225
Title: Large-P Small-N Nonparametric Regression and Additive-Interactive Response Functions
Author(s): Surya T Tokdar*
Companies: Duke University
Keywords: Nonparametric regression ; Smoothing ; Large-p small-n ; Minimax L2 risk ; Asymptotics ; Gaussian process regression
Abstract:

Smoothing-based prediction faces many difficulties when the predictor dimension p is large but the sample size n is limited, a situation often referred to as the large-p small-n prediction problem. A fundamental challenge is that not all smooth functions can be consistently estimated from noisy data if p grows much faster than n. We demonstrate that additive-interactive function spaces offer an ideal modeling framework for consistent estimation. In the additive-interactive model, the mean response function is taken to be the sum of a modest number of smooth functions, each involving a small number of interacting predictors. We show that the minimax L2 estimation rate over such a function space decays to zero under reasonable assumptions on the relative growth rates of n, p, and the number of truly active predictors. The additive-interactive assumption naturally leads to a hierarchical Bayesian model for the response function. We introduce a Bayesian estimation method for this model by utilizing an "additive Gaussian process prior" on the model space. We investigate the adaptive asymptotic efficiency of this method in prediction under L2 loss and in recovery of the true predictors.
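The additive-interactive structure described above can be illustrated with a short numerical sketch. The snippet below is not the authors' method; it simply draws one realization from an additive Gaussian process prior in a large-p small-n setting, where the covariance is a sum of squared-exponential kernels, each acting on a small, hypothetical group of interacting predictors (the group choices and length scale are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, length_scale=1.0):
    """Squared-exponential kernel evaluated on the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

# Large-p small-n setting: p = 50 predictors, only n = 30 observations.
n, p = 30, 50
X = rng.standard_normal((n, p))

# Additive-interactive structure: a few components, each involving a
# small group of interacting predictors (groups chosen for illustration).
groups = [[0], [3, 7], [12, 13, 14]]

# Additive GP prior: the covariance is the sum of per-group kernels,
# so each component is a smooth function of its own few predictors.
K = sum(rbf_kernel(X[:, g]) for g in groups)

# One draw of the mean response function at the design points
# (jitter added for numerical stability of the covariance).
f = rng.multivariate_normal(np.zeros(n), K + 1e-8 * np.eye(n))
```

Because only the predictors appearing in some group enter the covariance, the prior concentrates on functions of a small number of active predictors, which is what makes consistent estimation plausible even when p greatly exceeds n.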


Authors who are presenting talks have a * after their name.

Back to the full JSM 2017 program

Copyright © American Statistical Association