High-dimensional BayesOpt: appropriate optimization algorithm for a large number of parameters (GPEI is very slow with 128 hyperparameters)
Hi everybody,
I tried to run Ax for my use case. I have 128 hyperparameters. Unfortunately, sampling new configurations takes a very long time. Here, I did a quick breakdown of what costs how much time:
from ax.modelbridge.registry import Models

# exp is an existing Ax experiment (with an evaluation function) over 128 hyperparameters.
sobol = Models.SOBOL(exp.search_space)
for i in range(5):
    exp.new_trial(generator_run=sobol.gen(1))
# => 0.23s

eval_data = exp.eval()
# => 1.46s

gpei = Models.GPEI(experiment=exp, data=eval_data)
# => 0.26s

batch = exp.new_trial(generator_run=gpei.gen(1))
# => 390.18s
I would be happy if any of you had some hints on how to speed up the sampling. Is there a better model or sampling strategy to speed things up here?
Thank you for your help.
Best regards, Felix
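Most of that 390 s is spent inside gpei.gen(1): the GP was already fit when Models.GPEI(...) was constructed (the 0.26 s step), so the gen call is dominated by numerically optimizing the acquisition function with a multi-start gradient search over the full 128-dimensional space. A rough BoTorch-level sketch of that step (Ax builds on BoTorch) is shown below; the data, the model options, and the num_restarts / raw_samples budget are placeholders rather than Ax's actual defaults, so treat it as an illustration of where the time goes:

import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll          # older BoTorch versions: fit_gpytorch_model
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

d = 128                                            # number of hyperparameters
train_X = torch.rand(5, d, dtype=torch.double)     # 5 Sobol points, as in the snippet above
train_Y = torch.randn(5, 1, dtype=torch.double)    # placeholder objective values

# Fitting the GP is cheap with only 5 observations.
gp = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

# Optimizing Expected Improvement over the 128-dimensional unit cube is the
# expensive part: num_restarts and raw_samples set the size of the multi-start
# L-BFGS search, and each restart runs in all 128 dimensions.
ei = ExpectedImprovement(gp, best_f=train_Y.max())
bounds = torch.stack([torch.zeros(d, dtype=torch.double),
                      torch.ones(d, dtype=torch.double)])
candidate, _ = optimize_acqf(ei, bounds=bounds, q=1, num_restarts=10, raw_samples=256)
print(candidate.shape)  # torch.Size([1, 128])

Shrinking that budget (fewer restarts and raw samples) trades candidate quality for generation time, but it does not change the fact that a vanilla GP with EI struggles in 128 dimensions.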
Issue Analytics
- Created: 2 years ago
- Comments: 5 (3 by maintainers)
Top Results From Across the Web

TuRBO support in Ax · Issue #474 · facebook/Ax - GitHub
"High-dimensional BayesOpt: appropriate optimization algorithm for large number of parameters (GPEI is very slow with 128 hyperparameters)" #579.

[1902.10675] High-dimensional Bayesian optimization using ...
Bayesian optimization (BO) is a powerful approach for seeking the global optimum of expensive black-box functions and has proven successful for fine-tuning ...

Why does Bayesian Optimization perform poorly in more than ...
To be completely honest, it's because everything performs poorly in more than 20 dimensions. Bayesian optimization isn't special here.

A Review of Bayesian Optimization
This review paper introduces Bayesian optimization, highlights some of its methodological aspects, and showcases a wide range of applications.

Hyperparameter Optimization - AutoML
... depend on a wide range of hyperparameter choices about the neural network's architecture ... Let A denote a machine learning algorithm with N ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Re TuRBO, we have #474 open - though we haven't found the time to work on this much recently.
Yeah, d=128 is much too high for usual BayesOpt. I see three possible options:
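One family of workarounds often suggested in this regime (shown here only as an illustration, not necessarily one of those three options) is to optimize in a low-dimensional box and map candidates back into the full 128-dimensional space through a fixed random linear embedding, in the spirit of REMBO. The GP and the acquisition search then only ever see d_low dimensions, which also makes candidate generation much faster. The sketch below uses BoTorch directly; the embedding dimension, the scaling of the embedding matrix, and the objective f are hypothetical placeholders:

import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

D, d_low = 128, 16                                         # full and embedded dimensionality (d_low is a guess)
A = torch.randn(D, d_low, dtype=torch.double) / d_low ** 0.5   # fixed random embedding matrix

def to_full_space(z):
    # Map points from the low-dimensional box [-1, 1]^d_low into [0, 1]^D.
    return (0.5 + 0.5 * (z @ A.T)).clamp(0.0, 1.0)

def f(x):
    # Hypothetical stand-in for the real 128-dimensional evaluation.
    return -(x - 0.3).pow(2).sum(dim=-1, keepdim=True)

bounds = torch.stack([-torch.ones(d_low, dtype=torch.double),
                      torch.ones(d_low, dtype=torch.double)])

Z = 2 * torch.rand(5, d_low, dtype=torch.double) - 1       # initial design in the embedded space
Y = f(to_full_space(Z))

for _ in range(10):
    gp = SingleTaskGP(Z, Y)
    fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))
    ei = ExpectedImprovement(gp, best_f=Y.max())
    # The acquisition search now runs over 16 dimensions instead of 128.
    z_next, _ = optimize_acqf(ei, bounds=bounds, q=1, num_restarts=10, raw_samples=256)
    Z = torch.cat([Z, z_next])
    Y = torch.cat([Y, f(to_full_space(z_next))])

Whether a 16-dimensional embedding captures the parameters that actually matter is problem-dependent; TuRBO (#474, linked above) targets the same high-dimensional regime from a different angle.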