Abstract Details

Activity Number: 298 - Model/Variable Selection and Model Evaluation
Type: Contributed
Date/Time: Tuesday, July 30, 2019 : 8:30 AM to 10:20 AM
Sponsor: Biometrics Section
Abstract #305125
Title: Model Confidence Bounds for Variable Selection
Author(s): Yang Li*
Companies: Renmin University of China
Keywords: model selection; model confidence bounds; bootstrap

In this talk, we introduce the concept of model confidence bounds (MCB) for variable selection in the context of nested models. Similar to the endpoints of a familiar confidence interval for parameter estimation, the MCB identifies two nested models (the upper and lower confidence bound models) that contain the true model at a given level of confidence. Instead of trusting a single model obtained from a given model selection method, the MCB proposes a group of nested models as candidates, and its width and composition enable the practitioner to assess the overall model selection uncertainty. A new graphical tool, the model uncertainty curve (MUC), is introduced to visualize the variability of model selection and to compare different model selection procedures. The MCB methodology is implemented by a fast bootstrap algorithm that is shown to yield the correct asymptotic coverage under rather general conditions. Our Monte Carlo simulations and real data examples confirm the validity and illustrate the advantages of the proposed method.
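The bootstrap construction described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the simulated data, the thresholded-OLS selection rule (a stand-in for any variable selection method), and all parameter values are assumptions made for the example. The idea is to record the model selected on each bootstrap resample, order variables by selection frequency to form a nested candidate path, and then take the narrowest nested pair (LBM, UBM) whose bootstrap coverage reaches the nominal level.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for illustration: p = 6 predictors, true model = first 2
n, p = 200, 6
X = rng.standard_normal((n, p))
beta = np.array([2.0, -1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + rng.standard_normal(n)

def select(X, y, thresh=0.3):
    """Stand-in selection rule: keep variables with |OLS coefficient| > thresh.
    Any selection procedure (e.g. the Lasso) could be substituted here."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return frozenset(np.flatnonzero(np.abs(coef) > thresh))

# Pairs bootstrap: record the selected model on each resample
B = 500
models = [select(X[idx], y[idx])
          for idx in (rng.integers(0, n, n) for _ in range(B))]

# Order variables by bootstrap selection frequency -> nested candidate path
freq = np.zeros(p)
for m in models:
    for j in m:
        freq[j] += 1
order = np.argsort(-freq)

# For each width w, look for a nested pair (LBM, UBM) on the path whose
# bootstrap coverage reaches 1 - alpha; the MCB is the narrowest such pair
alpha = 0.05
best = None
for w in range(p + 1):
    for lo in range(p - w + 1):
        lbm = frozenset(order[:lo])
        ubm = frozenset(order[:lo + w])
        cover = np.mean([lbm <= m <= ubm for m in models])
        if cover >= 1 - alpha:
            best = (sorted(lbm), sorted(ubm), w, cover)
            break
    if best:
        break

print("LBM:", best[0], "UBM:", best[1], "width:", best[2])
```

Plotting the best achievable coverage against the width w, over a grid of confidence levels, traces out the model uncertainty curve (MUC) mentioned above: a narrow MCB at high coverage indicates a stable selection procedure.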

Authors who are presenting talks have a * after their name.