Abstract:
|
Bayesian Optimization (BO) aims to optimize costly-to-evaluate functions by selecting a limited number of experiments that each evaluate the function at a specified input. BO has been shown to be a powerful black-box optimization approach for many low-dimensional applications. Unfortunately, scaling BO to higher dimensions remains very challenging and is often impractical without strong assumptions about the underlying function. Moreover, existing work on high-dimensional BO almost exclusively considers settings where all the explanatory variables are continuous. In this work, we tackle the problem of high-dimensional BO over both continuous and categorical variables using kernel methods. To approach this problem, we assume an unknown additive structure for the function and investigate how to discover it automatically. We propose an efficient greedy algorithm that finds an appropriate decomposition of the solution space and independently optimizes each sub-problem. We evaluate our approach on a number of diverse benchmark problems and show that it produces high-quality results compared to several existing approaches and natural baselines.
|