Hierarchical search spaces
Is there a way to define a hierarchy of parameters? For example, a parameter that chooses an architecture, where each architecture has its own parameters.
Example (pseudocode):
architecture = choice(["NeuralNetwork", "XGBoost"])
if architecture == "NeuralNetwork":
    n_layers = choice(range(1, 10))
    # more architecture-related params here
elif architecture == "XGBoost":
    max_depth = choice(range(1, 5))
    # more architecture-related params here
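To make the intent concrete, below is a minimal, library-agnostic sketch (an illustration of the idea, not a proposed API): the child parameters only exist when their parent architecture has been chosen, which is what makes the space hierarchical.

```python
import random

def sample_config():
    # Top-level parameter: which architecture to use.
    config = {"architecture": random.choice(["NeuralNetwork", "XGBoost"])}
    # Child parameters are conditioned on the architecture choice and
    # are only present for the branch that was selected.
    if config["architecture"] == "NeuralNetwork":
        config["n_layers"] = random.randint(1, 9)
    else:
        config["max_depth"] = random.randint(1, 4)
    return config

print(sample_config())  # e.g. {'architecture': 'XGBoost', 'max_depth': 3}
```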
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@yonatanMedan, @LyzhinIvan, @Tandon-A, @riyadparvez, BayesOpt mode is now supported (in alpha) and currently works through search-space flattening (so, under the hood, the Gaussian Process model is not aware of the hierarchical structure of the search space). cc @dme65 to say more about when BayesOpt over flattened search spaces is effective.
If you try it, please let us know how it goes for you (ideally in this issue)! Updated version of my example above that should let you run BayesOpt:
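(The snippet below is not the maintainer's original example, which is not reproduced here. It is a minimal sketch of what a flattened setup could look like, assuming the Ax Service API; train_nn and train_xgb are hypothetical stand-ins for real training code.)

```python
from ax.service.ax_client import AxClient

def train_nn(n_layers):
    # Hypothetical stand-in for real neural-network training.
    return 0.80 + 0.01 * n_layers

def train_xgb(max_depth):
    # Hypothetical stand-in for real XGBoost training.
    return 0.75 + 0.02 * max_depth

ax_client = AxClient()
ax_client.create_experiment(
    name="flattened_hierarchical_space",
    # Flattened space: parameters for both architectures live side by side.
    parameters=[
        {"name": "architecture", "type": "choice",
         "values": ["NeuralNetwork", "XGBoost"]},
        {"name": "n_layers", "type": "range", "bounds": [1, 10], "value_type": "int"},
        {"name": "max_depth", "type": "range", "bounds": [1, 5], "value_type": "int"},
    ],
    objective_name="accuracy",
    minimize=False,
)

def evaluate(params):
    # Only the parameters relevant to the sampled architecture are used;
    # the others are still sampled but ignored, which is what flattening implies.
    if params["architecture"] == "NeuralNetwork":
        acc = train_nn(params["n_layers"])
    else:
        acc = train_xgb(params["max_depth"])
    return {"accuracy": (acc, 0.0)}  # (mean, SEM)

for _ in range(20):
    params, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(trial_index=trial_index, raw_data=evaluate(params))
```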
Yes, this would be a great addition! I have a similar use case: choosing the right classification threshold after hyperparameter optimization.