
Abstract Details

Activity Number: 657 - Bayesian and Empirical Bayes
Type: Contributed
Date/Time: Thursday, August 1, 2019, 10:30 AM to 12:20 PM
Sponsor: IMS
Abstract #306536
Title: Posterior Inference Under Adaptive Penalization for Quantile Regression
Author(s): Yuanzhi Li* and Xuming He
Companies: University of Michigan and University of Michigan
Keywords: Quantile regression; Pseudo-Bayesian; Penalized method; Asymmetric Laplace distribution; Diverging covariates

Quantile regression (QR) is a useful tool for accommodating heterogeneity in data. In the Bayesian computational framework, it is common to adopt the Asymmetric Laplace working likelihood for QR. It is recognized that the resulting posterior variance asymptotically matches the sampling variance after a simple sandwich-form adjustment. In this talk, we investigate this pseudo-Bayesian method under a broad class of adaptive penalties, with the Smoothly Clipped Absolute Deviation (SCAD) and adaptive Lasso penalties as two special cases. By viewing the penalized pseudo-likelihood from a Bayesian perspective, we develop a valid inferential procedure under the assumption that the true model is sparse. Although the posterior does not provide a sparse solution, we show that posterior inference achieves second-order asymptotic efficiency for the zero coefficients and oracle efficiency for the non-zero coefficients. We further consider the scenario in which the number of covariates grows with the sample size, assuming a moving-parameter regime where some of the true coefficients are small and vanish toward 0 at different rates as functions of the sample size.
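The unpenalized building block of the abstract, the Asymmetric Laplace pseudo-posterior and its sandwich-form variance adjustment, can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: it uses a flat prior, a fixed ALD scale, and the adjustment Sigma_adj = tau(1-tau)/sigma^2 * Sigma_post (X'X) Sigma_post; all variable names, tuning values, and the simulated data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, tau, sigma = 200, 2, 0.5, 1.0   # tau: quantile level; sigma: fixed ALD scale (assumption)

# Simulated data: y = X @ beta_true + standard normal noise
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(size=n)

def check_loss(u, tau):
    """Quantile check loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def log_pseudo_lik(beta):
    """ALD working log-likelihood (up to a constant), with a flat prior."""
    return -check_loss(y - X @ beta, tau).sum() / sigma

# Random-walk Metropolis on beta under the pseudo-posterior.
draws = np.empty((5000, p))
beta = np.zeros(p)
cur = log_pseudo_lik(beta)
for t in range(draws.shape[0]):
    prop = beta + 0.1 * rng.normal(size=p)
    new = log_pseudo_lik(prop)
    if np.log(rng.uniform()) < new - cur:  # symmetric proposal: accept w.p. min(1, ratio)
        beta, cur = prop, new
    draws[t] = beta

post = draws[1000:]                      # discard burn-in
Sigma_post = np.cov(post, rowvar=False)  # raw posterior covariance (not valid as-is)

# Sandwich-form adjustment: the raw ALD posterior covariance does not
# estimate the sampling covariance; the adjusted version does.
Sigma_adj = tau * (1 - tau) / sigma**2 * Sigma_post @ (X.T @ X) @ Sigma_post

print(post.mean(axis=0))            # posterior mean, near beta_true
print(np.sqrt(np.diag(Sigma_adj)))  # adjusted standard errors
```

The talk's contribution layers adaptive penalties (SCAD, adaptive Lasso) on top of this pseudo-likelihood; the sketch covers only the unpenalized base case that motivates the adjustment.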

Authors who are presenting talks have a * after their name.
