
Modeling a hierarchical search space using ParameterConstraint

See original GitHub issue

Hi everybody,

I saw that issue #140 already discusses an extension for arbitrary nesting. I was thinking about how to implement a hierarchical search space using parameter constraints, e.g.:

A = ChoiceParameter('A', parameter_type=ParameterType.BOOL, values=[True, False], is_ordered=False)
if A == True:
    # B only exists in the search space when A is True
    B = ChoiceParameter('B', parameter_type=ParameterType.BOOL, values=[True, False], is_ordered=False)

That means we do not allow the combination A = False and B = True. Treating the booleans as 0/1, this is the linear constraint -1*A + 1*B <= 0, i.e. B <= A. With a ParameterConstraint that would look like this:

SearchSpace(
    parameters=[
        ChoiceParameter('A', parameter_type=ParameterType.BOOL, values=[True, False], is_ordered=False),
        ChoiceParameter('B', parameter_type=ParameterType.BOOL, values=[True, False], is_ordered=False)
    ],
    parameter_constraints=[ParameterConstraint(constraint_dict={'A': -1.0, 'B': 1.0}, bound=0.0)]
)

However, it seems that ChoiceParameters are not supported in ParameterConstraints:

Traceback (most recent call last):
  File "/home/neutatz/Software/DeclarativeAutoML/fastsklearnfeature/declarative_automl/optuna_package/myautoml/cbays/hierarchical_space.py", line 35, in <module>
    sobol = Models.SOBOL(exp.search_space)
  File "/root/anaconda3/envs/DeclarativeAutoML/lib/python3.8/site-packages/ax/modelbridge/registry.py", line 261, in __call__
    model_bridge = bridge_class(
  File "/root/anaconda3/envs/DeclarativeAutoML/lib/python3.8/site-packages/ax/modelbridge/base.py", line 151, in __init__
    obs_feats, obs_data, search_space = self._transform_data(
  File "/root/anaconda3/envs/DeclarativeAutoML/lib/python3.8/site-packages/ax/modelbridge/base.py", line 196, in _transform_data
    search_space = t_instance.transform_search_space(search_space)
  File "/root/anaconda3/envs/DeclarativeAutoML/lib/python3.8/site-packages/ax/modelbridge/transforms/one_hot.py", line 138, in transform_search_space
    return SearchSpace(
  File "/root/anaconda3/envs/DeclarativeAutoML/lib/python3.8/site-packages/ax/core/search_space.py", line 48, in __init__
    self.set_parameter_constraints(parameter_constraints or [])
  File "/root/anaconda3/envs/DeclarativeAutoML/lib/python3.8/site-packages/ax/core/search_space.py", line 77, in set_parameter_constraints
    self._validate_parameter_constraints(parameter_constraints)
  File "/root/anaconda3/envs/DeclarativeAutoML/lib/python3.8/site-packages/ax/core/search_space.py", line 309, in _validate_parameter_constraints
    raise ValueError(
ValueError: `A` does not exist in search space.

I can work around it like this:

def eval(parameterization):
    # Recover the boolean flags from the relaxed float parameters.
    A = parameterization.get("A") > 0.5
    B = parameterization.get("B") > 0.5

SearchSpace(
    parameters=[
        RangeParameter(name="A", parameter_type=ParameterType.FLOAT, lower=0.0, upper=1.0),
        RangeParameter(name="B", parameter_type=ParameterType.FLOAT, lower=0.0, upper=1.0)
    ],
    parameter_constraints=[ParameterConstraint(constraint_dict={'A': -1.0, 'B': 1.0}, bound=0.0)]
)
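For what it's worth, here is a small self-contained check (plain Python, not Ax; the rejection loop and the decode helper are only illustrative) that the linear constraint B <= A on the relaxed floats still rules out the forbidden combination A = False, B = True after thresholding at 0.5:

import random

def decode(a_float, b_float):
    # Threshold the relaxed float parameters back to booleans.
    return a_float > 0.5, b_float > 0.5

# Rejection-sample points satisfying the constraint -1*A + 1*B <= 0, i.e. B <= A.
samples = []
while len(samples) < 10_000:
    a, b = random.random(), random.random()
    if -1.0 * a + 1.0 * b <= 0.0:
        samples.append((a, b))

# If B <= A and B > 0.5, then A > 0.5 as well, so decoding can never
# produce the forbidden combination (A=False, B=True).
assert not any(decode(a, b) == (False, True) for a, b in samples)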

However, this workaround doesn’t look so great. Does anybody have an idea how to make this nicer?

Best regards, Felix

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
sdaulton commented, May 11, 2021

The hit and run sampler should work decently for this - @sdaulton how far are we away from hooking this up?

The hit and run sampler is currently used by default in BO for generating initial conditions for acquisition optimization. Using the hit and run sampler as an Ax RandomModel does not really seem like a high priority at the moment, if the rejection sampling with Sobol is not prohibitively slow and we are not hitting the max sample limit.
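A minimal sketch of what rejection sampling with Sobol points could look like outside of Ax (this uses scipy.stats.qmc as a stand-in; it is not how Ax's internal Sobol generator is wired up):

from scipy.stats import qmc

# Draw quasi-random Sobol points in [0, 1]^2 for the relaxed parameters A and B.
sobol = qmc.Sobol(d=2, scramble=True)
candidates = sobol.random(1024)

# Rejection step: keep only the points satisfying -1*A + 1*B <= 0, i.e. B <= A.
a, b = candidates[:, 0], candidates[:, 1]
feasible = candidates[(-1.0 * a + 1.0 * b) <= 0.0]

print(f"{len(feasible)} of {len(candidates)} Sobol points satisfy B <= A")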

1 reaction
Balandat commented, May 4, 2021

The hit and run sampler should work decently for this - @sdaulton how far are we away from hooking this up?

Taking a step back, another option would be to just ignore the hierarchical structure on the modeling / optimization side and handle it in the evaluation code. Basically in your eval code you do B = B and A and on the Ax end you just don’t impose a constraint. The model should eventually be able to figure out that the value of B is irrelevant when A=False. This is of course not ideal, but it might be a better option than having to deal with rather brittle workarounds on the constraints.

cc @dme65 who has some experience with this and can probably also speak to it.
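A minimal sketch of that approach (the evaluation function body is a placeholder, not from the issue): keep both parameters unconstrained on the Ax side and collapse B inside the evaluation code.

from ax import ChoiceParameter, ParameterType, SearchSpace

# No parameter constraint on the Ax side; both choice parameters stay independent.
search_space = SearchSpace(
    parameters=[
        ChoiceParameter('A', parameter_type=ParameterType.BOOL, values=[True, False], is_ordered=False),
        ChoiceParameter('B', parameter_type=ParameterType.BOOL, values=[True, False], is_ordered=False),
    ]
)

def evaluate(parameterization):
    A = parameterization.get("A")
    # Collapse the hierarchy in the evaluation code: B is only active when A is True.
    B = parameterization.get("B") and A
    ...  # run the actual evaluation with A and B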
