
Abstract Details

Activity Number: 134 - Bayesian Modeling
Type: Contributed
Date/Time: Monday, August 9, 2021, 1:30 PM to 3:20 PM EDT
Sponsor: Section on Statistical Computing
Abstract #318496
Title: Scalable Bayesian Optimization Using Ordered Conditional Approximations of Gaussian Processes
Author(s): Felix Jimenez* and Matthias Katzfuss
Companies: Texas A&M University and Texas A&M University
Keywords: reinforcement learning; Bayesian optimization; Gaussian processes; Vecchia approximation; sparse inverse Cholesky factor
Abstract:

Bayesian optimization is a technique for optimizing black-box target functions. At the core of Bayesian optimization is a surrogate model that predicts the output of the target function at previously unseen inputs, facilitating the selection of promising input values. Gaussian processes (GPs) are a common surrogate model, but their computational cost scales poorly with the number of observations. To address this scaling issue, we propose the use of an ordered conditional GP approximation. Our approximation is well suited for extensions such as the selection of multiple input values in parallel. We showcase the advantages of our approximation relative to existing methods in several numerical comparisons. We conclude with a discussion of Bayesian optimization in higher dimensions.


Authors who are presenting talks have a * after their name.
