Abstract:
|
In a broad variety of settings, prior information takes the form of parameter restrictions. However, the usual Bayesian strategy of directly placing a prior measure on the constrained space and then conducting posterior computation is often intractable. An alternative is to initially conduct computation for an unconstrained or less constrained posterior, and then project draws from this initial posterior to the constrained space through a minimal-distance mapping. This approach has been successful in monotone function estimation but has not been considered in broader settings. In this article, we develop a general theory to justify posterior projections in standard Borel spaces. For tractability, we choose the constrained space to be a closed, convex subset of the original space. We then provide a general formulation of the projected posterior and show its validity on the constrained space for particular classes of priors and likelihood functions. We also show how asymptotic properties of the unconstrained posterior are transferred to the projected posterior. Posterior projections are then illustrated through multiple examples, in both simulation studies and real data applications.
|
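The abstract describes the basic computational recipe: sample from an unconstrained posterior, then map each draw to the constrained space via a minimal-distance projection. The sketch below is only an illustration of that idea under simple assumptions not taken from the article: a Gaussian unconstrained posterior over a Euclidean parameter and a nonnegativity constraint, whose Euclidean projection is a coordinatewise clip at zero; the numbers are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: draw from an unconstrained posterior.
# Assumed for illustration: a conjugate normal model, so the unconstrained
# posterior is Gaussian with the placeholder mean and covariance below.
posterior_mean = np.array([0.3, -0.2, 1.1])
posterior_cov = 0.05 * np.eye(3)
unconstrained_draws = rng.multivariate_normal(posterior_mean, posterior_cov, size=5000)

# Step 2: project each draw onto the constrained space through the
# minimal-distance mapping. For the closed, convex set {theta : theta >= 0},
# the Euclidean projection is a coordinatewise clip at zero.
projected_draws = np.clip(unconstrained_draws, 0.0, None)

# The empirical distribution of the projected draws approximates the
# projected posterior on the constrained space.
print(projected_draws.mean(axis=0))
```

For other closed, convex constraint sets (for example, monotonicity constraints on a function evaluated at a grid), the clipping step would be replaced by the corresponding minimal-distance projection, such as an isotonic regression of each draw.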