
Abstract Details

Activity Number: 572 - Sparsity and Variable Selection in Posterior Inference
Type: Contributed
Date/Time: Wednesday, July 31, 2019 : 2:00 PM to 3:50 PM
Sponsor: Section on Nonparametric Statistics
Abstract #: 307280 (Presentation)
Title: A Random Neighborhood Method for Bayesian Semiparametric Conditional Density Estimation
Author(s): Nong Shang*
Companies: CDC
Keywords: Random neighborhood; Gibbs sampler; Non-parametric regression; Conditional density; Local partial sample

We consider the problem of estimating a conditional density that shares a common parametric form but has possibly many parameters, for example, a linear mixture of many distributions. While numerous regression methods have been developed to model a few parameters, or the overall distribution, of the conditional density functions, it is often difficult to extend these methods to model so many parameters simultaneously, especially when a non-parametric approach is preferred and a Bayesian approach becomes necessary. By introducing a local likelihood function that pivots on the overall unconditional density function, we extend the nearest-neighborhood method into a general approach for estimating conditional density functions. Under a Kullback-Leibler divergence interpretation, the pivoted local likelihood function can be used to select the best size of the nearest neighborhood without resorting to cross-validation or other resampling techniques. We further extend the idea to allow the neighborhood membership to be random, so that a full Bayesian framework can be developed. Extensive simulations are conducted to illustrate the advantages of this innovative approach.
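As a rough illustration of the nearest-neighborhood idea, the following Python sketch estimates a conditional density at a target covariate value from the y-values of its nearest neighbors and computes a simple pivoted local log-likelihood against the overall unconditional fit. The simulated data, the Normal working model, and the specific score are illustrative assumptions only; the abstract does not give the method's details, and in particular the KL-based neighborhood-size selection rule and the random-membership Gibbs sampler are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: y | x ~ Normal(2x, 1).  The goal is to estimate
# the conditional density of y at a target covariate value x0 from the
# y-values of x0's nearest neighbors.
n = 500
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)

def normal_logpdf(v, mu, sigma):
    """Log density of Normal(mu, sigma^2) evaluated at v."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (v - mu) ** 2 / (2 * sigma**2)

# Overall (unconditional) fit of y, used as the pivot density.
mu0, s0 = y.mean(), y.std(ddof=1)

def local_fit(x0, k):
    """Fit a Normal density to the y-values of the k nearest neighbors of x0."""
    idx = np.argsort(np.abs(x - x0))[:k]
    yk = y[idx]
    return yk.mean(), yk.std(ddof=1), yk

x0, k = 0.5, 100
mu_hat, s_hat, yk = local_fit(x0, k)

# Pivoted local log-likelihood: average gain of the local fit over the
# unconditional pivot on the neighborhood sample.  (Illustrative stand-in;
# the paper's KL-based rule for choosing k is not reproduced here.)
score = (normal_logpdf(yk, mu_hat, s_hat) - normal_logpdf(yk, mu0, s0)).mean()

print(f"local fit at x0={x0}: mean={mu_hat:.2f}, sd={s_hat:.2f}, gain={score:.3f}")
```

Because the local neighborhood concentrates y around its conditional mean, the local fit is tighter than the unconditional pivot and the average log-likelihood gain is positive, which is the quantity a KL-style selection rule would trade off against neighborhood size.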

Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program