
Hierarchical search spaces

See original GitHub issue

Is there a way to define a hierarchy of parameters? For example, a parameter that chooses the architecture, where each architecture has its own parameters.

Example (pseudocode):

import random

architecture = random.choice(["NeuralNetwork", "xgboost"])

if architecture == "NeuralNetwork":
    n_layers = random.choice(range(1, 10))
    # more architecture-related params here
elif architecture == "xgboost":
    max_depth = random.choice(range(1, 5))
    # more architecture-related params here

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 12
  • Comments: 12 (8 by maintainers)

Top GitHub Comments

4 reactions
lena-kashtelyan commented, Jan 10, 2022

@yonatanMedan, @LyzhinIvan, @Tandon-A, @riyadparvez, BayesOpt is now supported in alpha mode and currently works through search-space flattening (so the Gaussian process model is not aware of the hierarchical structure of the search space under the hood). cc @dme65 to say more about when BayesOpt over flattened search spaces is effective.
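
To make "flattening" concrete: the model sees every parameter as present on every trial, and the values of parameters that are inactive for the chosen branch simply carry no signal. A rough illustration of that flattened view, using the parameter names from the example below (a conceptual sketch only, not Ax's actual internals):

# Conceptual sketch of one flattened trial (not Ax internals): every
# parameter is assigned a value, even ones inactive for this "model".
flattened_trial = {
    "model": "Linear",
    "learning_rate": 0.01,
    "l2_reg_weight": 0.0005,
    "num_boost_rounds": 7,  # sampled, but inactive when model == "Linear"
}
# Only the active subset actually affects the evaluated objective:
ACTIVE = {"Linear": ["learning_rate", "l2_reg_weight"], "XGBoost": ["num_boost_rounds"]}
active_params = {k: flattened_trial[k] for k in ACTIVE[flattened_trial["model"]]}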

If you try it, please let us know how it goes for you (ideally in this issue)! Here is an updated version of my example above that should let you run BayesOpt:

from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.service.utils.report_utils import exp_to_df  # optional: inspect results as a DataFrame
from ax.utils.measurement.synthetic_functions import branin  # optional: synthetic test objective

ax_client = AxClient()
ax_client.create_experiment(
    parameters=[
        {
            "name": "model",
            "type": "choice",
            "values": ["Linear", "XGBoost"],
            "dependents": {
                "Linear": ["learning_rate", "l2_reg_weight"],
                "XGBoost": ["num_boost_rounds"],
            },
        },
        {
            "name": "learning_rate",
            "type": "range",
            "bounds": [0.001, 0.1],
            "log_scale": True,
        },
        {
            "name": "l2_reg_weight",
            "type": "range",
            "bounds": [0.00001, 0.001],
        },
        {
            "name": "num_boost_rounds",
            "type": "range",
            "bounds": [0, 15],
        },
    ],
    objectives={"objective": ObjectiveProperties(minimize=True)},
    # To force "Sobol" if BayesOpt does not work well (please post a repro into 
    # a GitHub issue to let us know, it will be great help in debugging this faster!)
    # choose_generation_strategy_kwargs={"no_bayesian_optimization": True},
)
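
For completeness, here is a minimal trial loop exercising the client above. This is a sketch, not part of the original comment: the objective computed from the parameters is a made-up toy, and it assumes each trial's parameterization contains only the parameters active for the chosen "model".

for _ in range(10):
    parameters, trial_index = ax_client.get_next_trial()
    if parameters["model"] == "Linear":
        # Only "learning_rate" and "l2_reg_weight" should be active here.
        value = parameters["learning_rate"] + parameters["l2_reg_weight"]
    else:
        # Only "num_boost_rounds" should be active for "XGBoost".
        value = float(parameters["num_boost_rounds"])
    # "objective" matches the metric name declared in create_experiment above.
    ax_client.complete_trial(trial_index=trial_index, raw_data={"objective": value})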
3 reactions
riyadparvez commented, Aug 20, 2019

Yes, this would be a great addition! I have a similar use case: after hyperparameter optimization, choosing the right threshold for classification.

