
Abstract Details

Activity Number: 543 - SBSS Student Paper Competition I
Type: Topic Contributed
Date/Time: Thursday, August 6, 2020, 1:00 PM to 2:50 PM (EDT)
Sponsor: Section on Bayesian Statistical Science
Abstract #309774
Title: A Hierarchical Expected Improvement Method for Bayesian Optimization
Author(s): Zhehui Chen* and Simon Mak and Jeff Wu
Companies: Georgia Tech; Duke University; ISyE, Georgia Tech
Keywords: Black-box Optimization; Gaussian Process; Global Optimization; Hierarchical Modeling
Abstract:

Expected improvement (EI) is one of the most popular Bayesian optimization (BO) methods, due to its closed-form acquisition function, which allows for efficient optimization. However, one key drawback of EI is that it is overly greedy; this results in suboptimal solutions even for large sample sizes. To address this, we propose a new hierarchical EI (HEI) framework, which makes use of a hierarchical Gaussian process model. HEI preserves a closed-form acquisition function and corrects the over-greediness of EI by encouraging exploration of the optimization space. Under certain prior specifications, we prove the global convergence of HEI over a broad objective function space, and derive global convergence rates under smoothness assumptions on the objective function. We then introduce several hyperparameter estimation methods, which allow HEI to mimic a fully Bayesian optimization procedure while avoiding expensive Markov chain Monte Carlo sampling. Numerical experiments demonstrate the improvement of HEI over existing BO methods on synthetic functions as well as a semiconductor manufacturing optimization problem.
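For context, the closed-form acquisition function mentioned above is, in the standard (non-hierarchical) EI setting, the well-known expression below. The notation (Gaussian process posterior mean mu(x), posterior standard deviation sigma(x), current best observed value f_min for a minimization problem) is not taken from the abstract, and the exact HEI acquisition under the hierarchical Gaussian process model will differ:

\[
\mathrm{EI}(x) \;=\; \mathbb{E}\!\left[\max\{f_{\min} - f(x),\, 0\}\right]
\;=\; \bigl(f_{\min} - \mu(x)\bigr)\,\Phi(z) \;+\; \sigma(x)\,\phi(z),
\qquad z = \frac{f_{\min} - \mu(x)}{\sigma(x)},
\]

where \(\Phi\) and \(\phi\) are the standard normal distribution and density functions and \(\sigma(x) > 0\) is assumed. The second term is the exploration component of the criterion; the abstract's point is that plain EI nonetheless behaves too greedily, which HEI addresses through its hierarchical prior.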


Authors who are presenting talks have a * after their names.