
Abstract Details

Activity Number: 531 - SPEED: Statistical Computing: Methods, Implementation, and Application, Part 2
Type: Contributed
Date/Time: Wednesday, July 31, 2019 : 11:35 AM to 12:20 PM
Sponsor: Section on Statistical Computing
Abstract #307951
Title: Embarrassingly Parallel Inference for Gaussian Processes
Author(s): Michael Minyi Zhang* and Sinead Williamson
Companies: Princeton University and UT Austin
Keywords: Gaussian process; parallel inference; Bayesian non-parametrics
Abstract:

Training Gaussian process-based models typically involves an $O(N^3)$ computational bottleneck due to inverting the covariance matrix. Popular methods for overcoming this matrix inversion problem cannot adequately model all types of latent functions, and are often not parallelizable. However, judicious choice of model structure can ameliorate this problem. A mixture-of-experts model that uses a mixture of $K$ Gaussian processes offers modeling flexibility and opportunities for scalable inference. Our embarrassingly parallel algorithm combines low-dimensional matrix inversions with importance sampling to yield a flexible, scalable mixture-of-experts model that offers comparable performance to Gaussian process regression at a much lower computational cost.
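The scalability idea in the abstract can be illustrated with a simplified sketch (this is NOT the authors' exact algorithm, only an assumption-laden toy version): shard the data across $K$ independent GP "experts" so each covariance inversion costs $O((N/K)^3)$ instead of $O(N^3)$, then weight the experts' predictions by their (normalized) marginal likelihoods, in the spirit of importance weighting.

```python
# Toy sketch of embarrassingly parallel GP mixture-of-experts regression.
# Assumptions: 1-D inputs, a squared-exponential kernel with fixed
# hyperparameters, and marginal-likelihood weights; the published method
# differs in its model and its importance-sampling scheme.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def fit_expert(X, y, noise=0.1):
    """Fit one GP expert on a shard: only a shard-sized (N/K x N/K)
    matrix is factorized, and the log marginal likelihood is kept
    for use as this expert's weight."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    log_ml = (-0.5 * y @ alpha
              - np.sum(np.log(np.diag(L)))
              - 0.5 * len(X) * np.log(2 * np.pi))
    return {"X": X, "alpha": alpha, "log_ml": log_ml}

def predict(expert, Xs):
    """Posterior predictive mean of a single expert at test inputs Xs."""
    return rbf_kernel(Xs, expert["X"]) @ expert["alpha"]

rng = np.random.default_rng(0)
N, K_experts = 400, 4
X = np.sort(rng.uniform(-3, 3, N))
y = np.sin(X) + 0.1 * rng.standard_normal(N)

# Shard the data; each expert could be fit on a separate worker,
# which is what makes the scheme embarrassingly parallel.
shards = np.array_split(rng.permutation(N), K_experts)
experts = [fit_expert(X[idx], y[idx]) for idx in shards]

# Combine experts: normalized marginal-likelihood weights, then a
# weighted average of the per-expert predictive means.
log_w = np.array([e["log_ml"] for e in experts])
w = np.exp(log_w - log_w.max())
w /= w.sum()
Xs = np.linspace(-3, 3, 50)
mean = sum(wi * predict(e, Xs) for wi, e in zip(w, experts))
```

Each `fit_expert` call touches only its own shard, so the four fits are independent and could run concurrently; the combination step needs only the $K$ scalar weights and per-expert predictions.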


Authors who are presenting talks have a * after their name.
