All Times EDT

Abstract Details

Activity Number: 355 - Advanced Bayesian Topics (Part 4)
Type: Contributed
Date/Time: Thursday, August 12, 2021, 10:00 AM to 11:50 AM
Sponsor: Section on Bayesian Statistical Science
Abstract #319074
Title: Non-Smooth Bayesian Optimization in Tuning Problems
Author(s): Hengrui Luo* and Yonghyun Cho and James Demmel and Xiaoye Li and Yang Liu
Companies: Lawrence Berkeley National Laboratory and University of California, Berkeley and University of California, Berkeley and Lawrence Berkeley National Laboratory and Lawrence Berkeley National Laboratory
Keywords: Bayesian optimization; clustering; surrogate modeling; additive Gaussian models
Abstract:

Building a surrogate model is a common approach to learning an unknown black-box function. Bayesian optimization provides a framework for building such surrogate models from sequential samples drawn from the function and for locating its optimum. One important application is tuning the algorithmic parameters of large, complicated, black-box application codes, which amounts to finding the optima of a black-box function. Within the Bayesian optimization framework, the Gaussian process model produces smooth or continuous sample paths. The black-box function in a tuning problem, however, is often non-smooth, and this difficulty is worsened by the fact that we usually have only a limited number of sequential samples from the black-box function. Motivated by these issues encountered in tuning, we propose a novel additive Gaussian process model called the clustered Gaussian process (cGP), whose additive components are induced by clustering. With this surrogate model, we aim to capture the non-smoothness caused by changes of regime in the black-box function.
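The abstract does not give the cGP construction in detail, but the core idea — cluster the observed samples to separate regimes, then fit a separate Gaussian process component per cluster — can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the `black_box` test function, the choice of KMeans on the joint (x, y) points, the RBF kernel, and the nearest-cluster routing rule are all hypothetical stand-ins, not the authors' actual cGP algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical non-smooth black-box with a regime change (jump) at x = 0.5.
def black_box(x):
    return np.where(x < 0.5, np.sin(6 * x), np.sin(6 * x) + 2.0)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(40, 1))
y = black_box(X.ravel())

# Cluster the samples on (x, y) jointly so the jump separates the two
# regimes, then fit one GP surrogate per cluster -- a rough stand-in for
# the clustering-induced additive components described in the abstract.
k = 2
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(
    np.column_stack([X.ravel(), y]))
gps = []
for c in range(k):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                  normalize_y=True)
    gp.fit(X[labels == c], y[labels == c])
    gps.append(gp)

def predict(x_new):
    # Route each query point to the cluster whose training inputs are
    # closest (1-nearest-neighbor on x), then predict with that cluster's
    # GP. Smoothness is preserved within a cluster but not across clusters.
    x_new = np.atleast_2d(x_new)
    out = np.empty(len(x_new))
    for i, xq in enumerate(x_new):
        dists = [np.abs(X[labels == c] - xq).min() for c in range(k)]
        out[i] = gps[int(np.argmin(dists))].predict(xq.reshape(1, -1))[0]
    return out
```

Because each GP only sees samples from one regime, the piecewise surrogate can track the jump that a single stationary GP would smooth over; in a full Bayesian optimization loop this surrogate would replace the single GP when proposing the next sample.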


Authors who are presenting talks have a * after their name.
