Abstract Details

Activity Number: 217 - High-Fidelity Gaussian Process Surrogate Modeling: Deep and Shallow
Type: Invited
Date/Time: Wednesday, August 11, 2021, 10:00 AM to 11:50 AM (EDT)
Sponsor: Section on Physical and Engineering Sciences
Abstract #316759
Title: Active Learning for Deep Gaussian Process Surrogates
Author(s): Annie Sauer*
Companies: Virginia Tech
Keywords: sequential design; kriging; deep learning; elliptical slice sampling; integrated mean-squared prediction error; computer model
Abstract:

Deep Gaussian processes (DGPs) are increasingly popular as predictive models in machine learning (ML) for their non-stationary flexibility and ability to cope with abrupt regime changes in the training data. Here we explore DGPs as surrogates for computer simulation experiments. In particular, we exploit DGPs' automatic warping of the input space and full uncertainty quantification (UQ) through Bayesian posterior inference to develop active learning (AL) strategies that distribute runs non-uniformly in the input space. Building up the design sequentially allows for smaller training sets, both limiting expensive evaluation of the simulator code and mitigating the cubic costs of DGP inference. To ensure that AL criteria synthesize all relevant posterior mean and variance elements, we depart from the ML preference for thrifty variational inference of DGPs and instead promote full MCMC. When training data sizes are kept small through careful acquisition, and with a parsimonious layout of latent layers, the framework can be both effective and computationally tractable. We provide an open-source implementation in the “deepgp” package on CRAN.
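For readers unfamiliar with the workflow the abstract describes, the sketch below (not part of the abstract) illustrates one way a DGP active learning loop might look in R with the deepgp package: fit a two-layer DGP by full MCMC, trim the chains, predict over a candidate grid, and acquire the next run via an IMSPE-style criterion. The toy simulator, design sizes, and MCMC settings are hypothetical, and the argument names and return fields used here (fit_two_layer, trim, predict, IMSE, $value) follow the package documentation but may differ across package versions.

    library(deepgp)

    ## Toy stand-in for an expensive computer model (hypothetical).
    f <- function(x) sin(8 * x[, 1]) + rnorm(nrow(x), sd = 0.05)

    x <- matrix(seq(0, 1, length.out = 10), ncol = 1)        # small initial design
    y <- f(x)
    xcand <- matrix(seq(0, 1, length.out = 100), ncol = 1)   # candidate locations

    for (i in 1:5) {
      ## Two-layer DGP fit by full MCMC (elliptical slice sampling of the latent layer).
      fit <- fit_two_layer(x, y, nmcmc = 5000, cov = "exp2")
      fit <- trim(fit, burn = 2000, thin = 2)                # drop burn-in, thin the chains
      fit <- predict(fit, xcand)                             # posterior mean/variance at candidates
      crit <- IMSE(fit, xcand)                               # IMSPE-style acquisition criterion
      xnew <- xcand[which.min(crit$value), , drop = FALSE]   # acquire where the criterion is smallest
      x <- rbind(x, xnew)
      y <- c(y, f(xnew))                                     # evaluate the simulator at the new input
    }

The package's ALC() criterion (maximized rather than minimized) could stand in for IMSE() above; either way, the loop re-fits and acquires one run at a time, keeping the training set small as the abstract describes.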


Authors who are presenting talks have a * after their name.
