
Abstract Details

Activity Number: 55 - Advances in Bayesian Sparse Regression
Type: Contributed
Date/Time: Monday, August 3, 2020, 10:00 AM to 2:00 PM EDT
Sponsor: Section on Bayesian Statistical Science
Abstract #309896
Title: The Reciprocal Bayesian LASSO
Author(s): Himel Mallick* and Rahim Alhamzawi and Vladimir Svetnik
Companies: Merck Research Laboratories and University of Al-Qadisiyah and Merck & Co., Inc.
Keywords: Bayesian Regularization; Variable Selection; Reciprocal LASSO; Nonlocal Priors; MCMC; Penalized Regression
Abstract:

Reciprocal LASSO (rLASSO) regularization employs a decreasing penalty function, in contrast to conventional penalization methods that place increasing penalties on the coefficients, leading to stronger parsimony and superior model selection relative to traditional shrinkage methods. Here we consider a fully Bayesian formulation of the rLASSO problem, based on the observation that the rLASSO estimate of the linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters are assigned independent inverse Laplace priors. Bayesian inference from this posterior is possible using an expanded hierarchy motivated by a scale mixture of double Pareto or truncated normal distributions. On simulated and real datasets, we show that the Bayesian formulation outperforms its classical cousin in estimation, prediction, and variable selection across a wide range of scenarios, while offering the advantage of posterior inference. Finally, we discuss other variants of this new approach and provide a unified framework for variable selection using flexible reciprocal penalties. All methods described in this paper are publicly available.
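To make the formulation concrete, a minimal sketch follows. One common way to write the rLASSO criterion and a matching inverse Laplace prior is shown below; the indicator convention at zero and the exact prior parameterization are assumptions made here for illustration and may differ in detail from the paper.

    \hat{\beta}_{\mathrm{rLASSO}} = \arg\min_{\beta} \; \tfrac{1}{2}\,\|y - X\beta\|_2^2 + \lambda \sum_{j=1}^{p} \frac{\mathbf{1}(\beta_j \neq 0)}{|\beta_j|},
    \qquad
    \pi(\beta_j \mid \lambda) = \frac{\lambda}{2\,\beta_j^{2}} \exp\!\left(-\frac{\lambda}{|\beta_j|}\right), \quad \beta_j \neq 0.

The penalty grows as a nonzero coefficient shrinks toward zero, which is the decreasing-penalty behavior described above, and the prior density vanishes at zero, making it a nonlocal prior; the rLASSO estimate then arises as a posterior mode under such priors, as stated in the abstract.

As a further illustration only, the sketch below targets the resulting posterior with a componentwise random-walk Metropolis sampler. It is not the paper's data-augmentation Gibbs sampler based on double Pareto or truncated normal scale mixtures; the fixed values of sigma2 and lam, the proposal step size, and all function names are assumptions for this sketch.

# Illustrative only: componentwise random-walk Metropolis for linear-regression
# coefficients under independent inverse Laplace priors,
#   pi(beta_j | lam) = (lam / (2 * beta_j**2)) * exp(-lam / |beta_j|).
# This is NOT the paper's scale-mixture Gibbs sampler; sigma2 and lam are
# held fixed here purely for illustration.
import numpy as np

def log_posterior(beta, X, y, lam, sigma2):
    """Unnormalized log posterior: Gaussian likelihood + inverse Laplace prior."""
    resid = y - X @ beta
    log_lik = -0.5 * resid @ resid / sigma2
    with np.errstate(divide="ignore"):
        # Log prior is -inf when a beta_j is exactly zero (the prior is nonlocal).
        log_prior = np.sum(-lam / np.abs(beta) - 2.0 * np.log(np.abs(beta)))
    return log_lik + log_prior

def rw_metropolis(X, y, lam=1.0, sigma2=1.0, n_iter=5000, step=0.05, seed=0):
    """Draw approximate posterior samples of beta; returns an (n_iter, p) array."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start at the least-squares fit
    curr_lp = log_posterior(beta, X, y, lam, sigma2)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        for j in range(p):
            prop = beta.copy()
            prop[j] += step * rng.standard_normal()  # symmetric proposal
            prop_lp = log_posterior(prop, X, y, lam, sigma2)
            if np.log(rng.uniform()) < prop_lp - curr_lp:
                beta, curr_lp = prop, prop_lp
        draws[t] = beta
    return draws

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 5))
    y = X @ np.array([2.0, 0.0, -1.5, 0.0, 1.0]) + rng.standard_normal(100)
    draws = rw_metropolis(X, y)
    print(np.round(draws[1000:].mean(axis=0), 2))  # posterior means after burn-in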


Authors who are presenting talks have a * after their name.
