Abstract:
|
We consider the problem of estimating an unknown function defined on a compact subinterval of the real line, given discrete, noisy observations at a finite collection of points. We adopt a Bayesian framework, placing a Gaussian process prior on the unknown function. It is well known that general Gaussian process priors incur significant computational cost, stemming from the need to invert large covariance matrices and evaluate their determinants. To address this, we use a Gaussian Markov random field (GMRF) prior, for which a Markov chain Monte Carlo sampler is straightforward to design. We prove that the resulting Metropolis-within-Gibbs sampler is geometrically ergodic using a drift and minorization condition approach. Moreover, we prove that certain marginal samplers, which can be used to control the smoothness of the estimated function, are also geometrically ergodic. These results place our inference and convergence assessment on more solid ground.
|