
Abstract Details

Activity Number: 28 - SPEED: A Mixture of Topics in Health, Computing, and Imaging
Type: Contributed
Date/Time: Sunday, July 29, 2018, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #328386
Title: Ranked Sparsity Methods for Transparent Model Selection
Author(s): Ryan Andrew Peterson* and Joseph Cavanaugh
Companies: University of Iowa
Keywords: interactions; LASSO; predictive modeling; variable selection
Abstract:

We explore and illustrate the use of ranked sparsity via the LASSO, in combination with a series of penalty-ranking functions, to impose a dynamically structured sparse model selection framework that favors models that are better specified and more transparent. Model selection methods for generalized linear models (GLMs) commonly presume that each potential parameter is equally worthy of entering into the final model. We call this rule "covariate equipoise". However, this assumption does not always hold, especially in the presence of derived variables. For instance, when all possible interactions are considered as candidate predictors, the presumption of covariate equipoise will often produce misspecified and opaque models. We suggest a modeling strategy that requires a stronger level of evidence before certain variables (e.g., interactions) are allowed to enter the final model. This ranked sparsity paradigm can be implemented with the LASSO and has broad applicability; we explore its performance in selecting interaction and polynomial effects in GLM frameworks, and in selecting local and seasonal effects in time-series modeling frameworks.
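One way to read the penalty-ranking idea is as a weighted LASSO: instead of a single penalty lambda * sum_j |beta_j|, each coefficient carries its own weight, lambda * sum_j w_j |beta_j|, with larger w_j assigned to derived terms such as interactions. The sketch below is not the authors' implementation; it is a minimal illustration using scikit-learn's Lasso, realizing per-coefficient weights through the standard column-rescaling trick, and the interaction weight of 2.0 is an arbitrary illustrative choice.

```python
# Minimal sketch of ranked sparsity via a weighted LASSO (illustrative only;
# not the authors' implementation). Interaction terms get a larger penalty
# weight than main effects, so they need stronger evidence to enter the model.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] - X[:, 1] + 1.5 * X[:, 0] * X[:, 1] + rng.normal(size=n)

# Candidate design: main effects plus all pairwise interactions.
poly = PolynomialFeatures(degree=2, interaction_only=True, include_bias=False)
Xd = StandardScaler().fit_transform(poly.fit_transform(X))
names = poly.get_feature_names_out([f"x{j}" for j in range(p)])

# Penalty ranking: weight 1 for main effects, a larger weight for interactions
# (interaction names contain a space, e.g. "x0 x1"; the factor 2.0 is arbitrary).
weights = np.array([1.0 if " " not in nm else 2.0 for nm in names])

# Weighted LASSO via column rescaling: penalizing w_j * |b_j| is equivalent
# to fitting an ordinary LASSO on the rescaled column x_j / w_j.
lam = 0.05
fit = Lasso(alpha=lam, max_iter=50_000).fit(Xd / weights, y)
coefs = fit.coef_ / weights  # back-transform to the original columns

for nm, b in zip(names, coefs):
    if abs(b) > 1e-8:
        print(f"{nm:>10s}: {b: .3f}")
```

Because the interaction columns carry a heavier penalty, they must explain substantially more variation than a main effect to earn a nonzero coefficient, which mirrors the "stronger level of evidence" described in the abstract.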


Authors who are presenting talks have a * after their name.
