
High-dimensional BayesOpt: appropriate optimization algorithm for large number of parameters (GPEI is very slow with 128 hyperparameters)


Hi everybody,

I tried to run Ax for my use case, which has 128 hyperparameters. Unfortunately, generating new configurations takes a very long time. Here is a quick breakdown of where the time goes:

sobol = Models.SOBOL(exp.search_space)
for i in range(5):
    exp.new_trial(generator_run=sobol.gen(1))
# => 0.23 s

eval_data = exp.eval()
# => 1.46 s

gpei = Models.GPEI(experiment=exp, data=eval_data)
# => 0.26 s

batch = exp.new_trial(generator_run=gpei.gen(1))
# => 390.18 s
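For context, quasi-random candidate generation is essentially free even in 128 dimensions; nearly all of the 390 s above goes into fitting the GP and optimizing the acquisition function. A standalone sketch using SciPy's Sobol sampler (not the Ax API, and using 8 points rather than 5 because the sampler prefers powers of two):

```python
import numpy as np
from scipy.stats import qmc

# Draw quasi-random points in a 128-dimensional unit cube,
# mirroring the cheap Sobol warm-up phase above.
sampler = qmc.Sobol(d=128, scramble=True)
points = sampler.random(n=8)

print(points.shape)  # (8, 128)
```

This step stays fast regardless of dimension; it is the model-based GPEI step whose cost blows up.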

I would be happy for any hints on how to speed up candidate generation. Is there a better model or sampling strategy for this many parameters?

Thank you for your help.

Best regards, Felix

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

Balandat commented, May 11, 2021 (3 reactions)

Re TuRBO: we have #474 open, though we haven’t found the time to work on it much recently.

bletham commented, May 11, 2021 (3 reactions)

Yeah, d=128 is much too high for standard BayesOpt. I see three possible options:

  • Try to identify the ~10 most important parameters and optimize only those (obviously not ideal if there are many interdependencies between parameters and they are all important to some degree, but it could get you most of the way there).
  • If the parameters are all continuous, you could try the ALEBO high-dimensional optimization method that is built into Ax (https://github.com/facebookresearch/alebo). It does assume that there is some low-dimensional structure to the search space (e.g., most of the parameters are unimportant), and its performance will really depend on the degree to which that is true. It will not work well with categorical / choice parameters.
  • You can use the TuRBO method for high-dimensional BayesOpt, which is not in Ax but has an open-source implementation here: https://github.com/uber-research/TuRBO . It is probably the best general-purpose high-dimensional BayesOpt method for problems without any particular structure.
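The core idea behind ALEBO (and the earlier REMBO method it builds on) can be sketched without Ax: optimize in a random low-dimensional subspace and project each candidate back up to the full space. A minimal, illustrative sketch in NumPy — the projection matrix, clipping, and choice of embedding dimension here are assumptions for illustration, not the actual ALEBO implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 128, 10  # ambient dimension and (assumed) embedding dimension

# Random linear embedding: the optimizer works on d-dimensional
# points y, which are mapped up to the full D-dimensional space.
B = rng.standard_normal((D, d))

def to_ambient(y):
    """Project a low-dimensional candidate into [-1, 1]^D via clipping."""
    return np.clip(B @ y, -1.0, 1.0)

y = rng.standard_normal(d)  # a candidate proposed in the embedding
x = to_ambient(y)           # the point actually evaluated

print(x.shape)  # (128,)
```

The BayesOpt loop then runs entirely in the 10-dimensional space, which is why these methods only help when most of the 128 parameters are (approximately) unimportant.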
