Abstract:
|
We consider the problem of estimating conditional density functions that share a common parametric form but with possibly many parameters, for example, a linear mixture of many distributions. While numerous regression methods have been developed to model a few parameters, or the overall distribution, of the conditional density functions, these methods are often difficult to extend to many parameters simultaneously, especially when a non-parametric approach is preferred or a Bayesian approach becomes necessary. By introducing a local likelihood function that pivots on the overall unconditional density function, we extend the nearest neighborhood method into a general approach for estimating conditional density functions. Through a Kullback-Leibler divergence interpretation, the pivoted local likelihood function can be used to select the best size of the nearest neighborhood without resorting to cross-validation or other resampling techniques. We further extend the idea by allowing the neighborhood membership to be random, so that a fully Bayesian framework can be developed. Extensive simulations are conducted to illustrate the advantages of this innovative approach.
|
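For intuition only, the following is a minimal sketch of the nearest-neighborhood idea summarized in the abstract: estimating f(y | x0) by fitting a simple parametric working model to the responses of the k nearest neighbors of x0. The Gaussian working model, the toy data, and all names here are assumptions of this illustration; the paper's pivoted local likelihood, which also selects the neighborhood size k via its Kullback-Leibler interpretation, is not reproduced.

```python
# Hypothetical illustration -- NOT the authors' pivoted-likelihood method.
# A k-nearest-neighborhood conditional density estimate with a Gaussian
# working model (an assumption of this sketch).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the conditional distribution of y drifts with x
# (shifting mean and growing spread).
n = 500
x = rng.uniform(0.0, 1.0, size=n)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2 + 0.3 * x, size=n)

def knn_conditional_density(x0, k):
    """Return a Gaussian density estimate of f(. | x0) fitted to the
    k nearest neighbors of x0 in covariate space."""
    idx = np.argsort(np.abs(x - x0))[:k]           # k nearest neighbors of x0
    mu = y[idx].mean()                             # local parametric fit
    sigma = y[idx].std(ddof=1)
    return lambda t: np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (
        sigma * np.sqrt(2 * np.pi)
    )

# Estimated conditional density of y at a few points, given x = 0.5.
f_hat = knn_conditional_density(x0=0.5, k=50)
print(f_hat(np.array([-1.0, 0.0, 1.0])))
```

In this sketch k is fixed by hand; the abstract's point is that the pivoted local likelihood supplies a principled choice of k without cross-validation or other resampling.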