Research Article

Hybrid Parameter Search and Dynamic Model Selection for Mixed-Variable Bayesian Optimization

Received 30 Oct 2022, Accepted 15 Jan 2024, Published online: 08 Mar 2024
 

Abstract

This article presents a new type of hybrid model for Bayesian optimization (BO) designed to handle mixed variables, encompassing both quantitative (continuous and integer) and qualitative (categorical) types. Our proposed hybrid model (named hybridM) merges a Monte Carlo Tree Search (MCTS) structure for the categorical variables with Gaussian processes (GPs) for the continuous ones. hybridM uses upper confidence bound tree search (UCTS) as its MCTS strategy, demonstrating how a tree architecture can be integrated into Bayesian optimization. Our innovations, including dynamic online kernel selection in the surrogate modeling phase and a unique UCTS search strategy, position the hybrid models as an advancement in mixed-variable surrogate modeling. Numerical experiments underscore the superiority of the hybrid models, highlighting their potential in Bayesian optimization. Supplementary materials for this article are available online.
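To make the structure of such a hybrid loop concrete, the sketch below pairs a UCT-style (upper confidence bound) selection over a single categorical variable with a GP surrogate and expected-improvement acquisition over the continuous variables. It is a minimal illustration only: the toy objective, the category names, and helpers such as ucb_select are hypothetical and do not reflect the actual hybridMinimization API, the paper's tree construction, or its dynamic kernel selection.

# Illustrative sketch of a mixed-variable BO loop: UCB-style selection over a
# categorical variable, plus a GP surrogate (Matern kernel) and expected
# improvement over the continuous variables of the chosen branch.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
categories = ["relu", "tanh", "sigmoid"]            # qualitative variable
bounds = np.array([[1e-4, 1e-1], [8.0, 256.0]])     # continuous variables

def objective(cat, x):
    # Toy black-box objective standing in for the real application.
    shift = {"relu": 0.0, "tanh": 0.3, "sigmoid": 0.6}[cat]
    return np.sin(5 * x[0]) + 0.01 * (x[1] - 64.0) ** 2 / 64.0 + shift

def ucb_select(stats, t, c=1.4):
    # UCT-style score: exploit the mean reward, explore rarely tried branches.
    scores = [s["mean"] + c * np.sqrt(np.log(t + 1) / (s["n"] + 1e-9))
              for s in stats.values()]
    return list(stats)[int(np.argmax(scores))]

stats = {c: {"n": 0, "mean": 0.0} for c in categories}
data = {c: {"X": [], "y": []} for c in categories}

for t in range(30):
    cat = ucb_select(stats, t)
    X, y = data[cat]["X"], data[cat]["y"]
    if len(X) < 3:
        # Too few points in this branch: sample the continuous space at random.
        x_next = rng.uniform(bounds[:, 0], bounds[:, 1])
    else:
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(np.array(X), np.array(y))
        y_best = min(y)

        def neg_ei(x):
            # Negative expected improvement (we minimize the objective).
            mu, sd = gp.predict(x.reshape(1, -1), return_std=True)
            sd = max(sd[0], 1e-9)
            z = (y_best - mu[0]) / sd
            return -((y_best - mu[0]) * norm.cdf(z) + sd * norm.pdf(z))

        x0 = rng.uniform(bounds[:, 0], bounds[:, 1])
        x_next = minimize(neg_ei, x0, bounds=bounds).x
    y_next = objective(cat, x_next)
    X.append(x_next)
    y.append(y_next)
    s = stats[cat]
    s["n"] += 1
    s["mean"] += (-y_next - s["mean"]) / s["n"]     # reward = negative loss

best_cat = min(data, key=lambda c: min(data[c]["y"], default=np.inf))
print(best_cat, min(data[best_cat]["y"]))

Each categorical branch keeps its own GP surrogate, and the bandit reward is the negative objective value, so well-performing branches are revisited while rarely tried ones retain an exploration bonus; the actual hybridM method organizes the categorical choices as a search tree rather than a flat bandit.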

Acknowledgments

We gratefully acknowledge the Exascale Computing Project (17-SC-20-SC), a collaborative effort of the U.S. Department of Energy Office of Science and the National Nuclear Security Administration. We used resources of the National Energy Research Scientific Computing Center (NERSC), a U.S. Department of Energy Office of Science User Facility operated under Contract No. DE-AC02-05CH11231. Our code is available at https://github.com/gptune/hybridMinimization. We sincerely thank Riley J. Murray and Rahul Jain for additional experiments on randomized Kaczmarz algorithms and for constructive suggestions on applying our hybrid model to various applications. We are grateful to the editor, the AE, and two anonymous reviewers for constructive comments and suggestions that have significantly improved the article.

Disclosure Statement

No potential conflict of interest was reported by the author(s).

Notes

1 This includes the implementation for round-robin MAB and random MAB.

2 The number of layers includes the input/output layer but not the dropout layer.

3 All dense layers share the same activation function.

4 All dense layers share the same size.

