Abstract:
|
Most well-known Bayesian inference methods for functional data perform smoothing using a limit of proper priors. Some priors must be estimated empirically during the procedure because of constraints, so the model uncertainty of these priors is not propagated into the posterior distribution of the other parameters. For example, the estimator of the constrained eigenfunctions is not incorporated into the Bayesian inference. We propose a fully Bayesian method for functional principal component analysis, with inference based on a variational algorithm. The Langevin-Bingham matrix variate distribution is used in a Gibbs sampler to draw the eigenfunctions. To determine the number of components, a reversible jump MCMC algorithm is implemented. The proposed generative model accommodates noisy and sparse observations of curves, and the fully Bayesian approach yields much richer inferences. The effectiveness of the proposed method is illustrated in simulations and in a real application to opioid dependence treatments.
|