
Abstract Details

Activity Number: 362 - Contributed Poster Presentations: Section on Physical and Engineering Sciences
Type: Contributed
Date/Time: Wednesday, August 5, 2020, 10:00 AM to 2:00 PM (EDT)
Sponsor: Section on Physical and Engineering Sciences
Abstract #313628
Title: Adversarial Surrogacy
Author(s): Nathan Wycoff* and Robert Gramacy
Companies: Virginia Tech and Virginia Tech
Keywords: Deep Learning; Neural Network; Surrogate Modeling; Stochastic Process; Physical Constraints; Bayesian Statistics
Abstract:

Arguably the gold standard of black-box surrogate modeling at present is the Gaussian Process (GP), in large part due to its excellent uncertainty quantification properties. GP-based surrogate models have been highly successful across a wide range of tasks, such as optimization, sensitivity analysis, and model calibration of expensive simulators. However, the conjugate GP model assumes a normal error structure, making it unrealistic for certain applications. There is extensive prior work on stochastic processes with non-Gaussian marginal distributions, though these are typically still unimodal. Beyond multimodal error structures, modern kernel-based stochastic process inference can struggle with high-dimensional inputs, multidimensional outputs with complex dependence, and the enforcement of physical constraints. We show how Generative Adversarial Networks (GANs), a deep learning framework best known for producing artificial natural images, can fill all of these gaps. However, we also note that off-the-shelf GANs fail to retain the GP's most important property, uncertainty quantification, which we remedy by enforcing a stochastic process prior.
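
For intuition only, below is a minimal sketch (not the authors' implementation; every architecture and training choice here is an illustrative assumption) of how a conditional GAN can serve as a stochastic surrogate: a generator maps a simulator input x and latent noise z to an output y, so repeated draws at a fixed x approximate the conditional response distribution, including multimodal error structures that a conjugate GP cannot capture.

# Hedged sketch of a conditional-GAN surrogate on a toy simulator (PyTorch).
# Assumed, hypothetical setup; not the method described in the abstract.
import torch
import torch.nn as nn

def simulator(x):
    """Toy 'expensive' simulator with a bimodal error structure."""
    branch = (torch.rand_like(x) < 0.5).float()   # randomly pick one of two modes
    return torch.sin(4 * x) + branch - 0.5 + 0.05 * torch.randn_like(x)

latent_dim = 4

# Generator G(x, z) -> y and discriminator D(x, y) -> logit.
gen = nn.Sequential(nn.Linear(1 + latent_dim, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))
disc = nn.Sequential(nn.Linear(1 + 1, 64), nn.ReLU(),
                     nn.Linear(64, 64), nn.ReLU(),
                     nn.Linear(64, 1))

opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    x = torch.rand(128, 1)                         # design points
    y_real = simulator(x)
    z = torch.randn(128, latent_dim)
    y_fake = gen(torch.cat([x, z], dim=1))

    # Discriminator: separate simulator output from generator output at the same x.
    d_loss = bce(disc(torch.cat([x, y_real], dim=1)), torch.ones(128, 1)) + \
             bce(disc(torch.cat([x, y_fake.detach()], dim=1)), torch.zeros(128, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: fool the discriminator.
    g_loss = bce(disc(torch.cat([x, y_fake], dim=1)), torch.ones(128, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Surrogate prediction: draw many y's at a new x to approximate the response distribution.
x_new = torch.full((500, 1), 0.3)
with torch.no_grad():
    samples = gen(torch.cat([x_new, torch.randn(500, latent_dim)], dim=1))
print(samples.mean().item(), samples.std().item())

Note that this plain GAN only produces samples; it carries no calibrated uncertainty quantification of the kind a GP provides, which is the gap the stochastic process prior mentioned in the abstract is meant to address.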


Authors who are presenting talks have a * after their name.
