
Abstract Details

Activity Number: 175 - Contributed Poster Presentations: Section on Statistical Learning and Data Science
Type: Contributed
Date/Time: Monday, July 31, 2017 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #324398
Title: Optimal Variable Selection in Regression Models
Author(s): Jie Ding* and Vahid Tarokh and Yuhong Yang
Companies: Harvard and Harvard and University of Minnesota
Keywords: Model Selection
Abstract:

We introduce a new criterion for variable selection in regression models and show its optimality, in terms of both loss and risk, under reasonable assumptions. The key idea is to impose a penalty that is heavy for models of small dimension and lighter for those of larger dimension. In contrast to state-of-the-art model selection criteria such as the $C_p$ method, delete-1 or delete-$d$ cross-validation, the Akaike information criterion, and the Bayesian information criterion, the proposed method achieves asymptotic loss and risk efficiency in both parametric and nonparametric regression settings, offering new insight into the reconciliation of two classes of classical criteria with different asymptotic behaviors. The adaptivity and wide applicability of the new criterion are demonstrated in several numerical experiments.
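The abstract contrasts the proposed criterion with classical penalized criteria such as AIC and BIC. The following is a minimal Python sketch of penalized selection over nested regression models; the `heavy_small_dim` penalty (a harmonic-sum form) is a hypothetical illustration of a penalty whose marginal cost is heavy at small dimensions and lighter at larger ones, in the spirit described above — it is not the authors' exact formula.

```python
import numpy as np

def fit_nested_rss(X, y, d):
    """Residual sum of squares of OLS on the first d columns of X."""
    beta, _, _, _ = np.linalg.lstsq(X[:, :d], y, rcond=None)
    resid = y - X[:, :d] @ beta
    return float(resid @ resid)

def selection_scores(X, y, max_dim, penalty):
    """Score(d) = n * log(RSS_d / n) + penalty(d); smaller is better."""
    n = len(y)
    return {d: n * np.log(fit_nested_rss(X, y, d) / n) + penalty(d)
            for d in range(1, max_dim + 1)}

# Classical penalties the abstract compares against.
def aic(d, n):
    return 2 * d            # Akaike information criterion

def bic(d, n):
    return np.log(n) * d    # Bayesian information criterion

def heavy_small_dim(d, n):
    # Hypothetical illustration: the increment from dimension d-1 to d is
    # c/d, so early dimensions are penalized heavily and later ones lightly.
    c = np.log(n)
    return c * sum(1.0 / k for k in range(1, d + 1))

# Synthetic example: true dimension 3 among 8 candidate regressors.
rng = np.random.default_rng(0)
n, p, d_true = 200, 8, 3
X = rng.standard_normal((n, p))
y = X[:, :d_true] @ np.array([2.0, -1.5, 1.0]) + rng.standard_normal(n)

for name, pen in [("AIC", aic), ("BIC", bic), ("harmonic", heavy_small_dim)]:
    scores = selection_scores(X, y, p, lambda d: pen(d, n))
    print(name, "selects dimension", min(scores, key=scores.get))
```

With strong signal coefficients, all three penalties retain the three true regressors; the criteria differ in how readily they admit spurious extra dimensions, which is the trade-off the abstract's loss/risk efficiency results address.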


Authors who are presenting talks have a * after their name.

