Abstract:
|
Multi-output regression aims to exploit dependencies among multiple output variables and predict them simultaneously. In the machine learning community, vector-valued kernel methods or multi-output Gaussian processes (GPs) are usually employed to solve multi-output regression problems. However, the efficiency of parameter estimation is seldom investigated, and the computation can become intractable for high-dimensional outputs. From a Bayesian perspective, our work (a) relaxes the common assumption that the observations associated with different outputs are evaluated at the same inputs; (b) proposes a vector-valued GP with a separable kernel function and shows that this GP is equivalent to the solution of a vector-valued smoothing spline problem; (c) places a novel prior on the covariance matrix of the outputs and investigates posterior propriety; (d) introduces an efficient Markov chain Monte Carlo method for solving this problem. Finally, we demonstrate the performance of our methods on both simulated and real data and compare them with other existing methods.
|