All Times EDT

Abstract Details

Activity Number: 319 - SLDS CSpeed 6
Type: Contributed
Date/Time: Wednesday, August 11, 2021, 3:30 PM to 5:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #318211
Title: Hyperparameter Optimization of Deep Neural Networks with Applications to Medical Device Manufacturing
Author(s): Gautham Sunder* and Christopher Nachtsheim and Thomas Albrecht
Companies: Carlson School of Management and Carlson School of Management and Boston Scientific
Keywords: Hyperparameter Optimization; Deep Neural Networks; Noisy Computer Experiments; Bayesian Optimization; Response Surface Optimization
Abstract:

Bayesian Optimization (BO), a class of Response Surface Optimization (RSO) methods for nonlinear functions, is a commonly adopted strategy for Hyperparameter Optimization (HO) of Deep Neural Networks. In this study, we empirically illustrate that the validation loss in HO problems can, in some cases, be well approximated by a quadratic function; when this is the case, classical RSO methods are demonstrably more efficient at estimating the optimal response. We propose a batch-sequential RSO strategy in which a new class of starting designs, called Compromise Uniform designs (a compound optimal design between D-optimal and Uniform designs), is used to estimate the complexity of the unknown response function. Based on the estimated complexity, the better-suited of two methods, BO or Adaptive-RSO (a novel exploration-exploitation-based RSO strategy proposed in this study), is adopted in the subsequent experiments. Our simulation studies on synthetic test functions of varying complexity and noise levels illustrate that the proposed RSO strategy is highly efficient: it is superior to BO when the true function is quadratic, and comparable to BO when the function is nonlinear.
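The classical RSO idea the abstract contrasts with BO can be sketched as follows: fit a full second-order (quadratic) model to noisy validation-loss observations over the hyperparameters, then solve for the stationary point of the fitted surface. This is a minimal illustrative sketch, not the authors' actual method; the two-hyperparameter setup, the quadratic "true" loss, and the noise level are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design: 20 points in two scaled hyperparameters
# (e.g., log-learning-rate and log-batch-size), coded to [-1, 1].
X = rng.uniform(-1, 1, size=(20, 2))

def true_loss(x):
    # Assumed quadratic ground truth with optimum at (0.3, -0.2).
    return 1.0 + 2.0 * (x[:, 0] - 0.3) ** 2 + 1.5 * (x[:, 1] + 0.2) ** 2

# Noisy "validation loss" observations at the design points.
y = true_loss(X) + rng.normal(scale=0.05, size=len(X))

# Full second-order model matrix: 1, x1, x2, x1*x2, x1^2, x2^2.
F = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2])
beta = np.linalg.lstsq(F, y, rcond=None)[0]

# Stationary point of the fitted quadratic: set the gradient to zero,
# i.e. solve  [[2*b4, b3], [b3, 2*b5]] @ x = [-b1, -b2].
b = beta[1:3]
B = np.array([[2 * beta[4], beta[3]],
              [beta[3], 2 * beta[5]]])
x_star = np.linalg.solve(B, -b)
print(x_star)  # estimated optimum, close to (0.3, -0.2) here
```

When the underlying response really is near-quadratic, a single least-squares fit like this locates the optimum from a handful of evaluations, which is the efficiency advantage over BO that the abstract refers to.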


Authors who are presenting talks have a * after their name.

Back to the full JSM 2021 program