Parameter Constraints in ALEBO
Hey,
I'm trying to implement some parameter constraints in my ALEBO optimization. When I add the constraint, the following ValueError occurs before the first trial evaluation starts. Without the constraint the optimization runs without any problems. The error occurs in optimize() of the managed loop.
File "C:\code\black-box-opt\black-box-opt-server\experiment_runner.py", line 365, in run
generation_strategy=alebo_strategy)
File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\managed_loop.py", line 246, in optimize
parameterization, values = loop.get_best_point()
File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\managed_loop.py", line 200, in get_best_point
experiment=self.experiment
File "C:\code\black-box-opt\bbo-env\lib\site-packages\ax\service\utils\best_point.py", line 52, in get_best_raw_objective_point
raise ValueError("Cannot identify best point if experiment contains no data.")
Is there any possibility to implement parameter constraints in ALEBO?
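Roughly, the setup looks like this (parameter names, bounds, the constraint, and the objective here are illustrative placeholders, not my actual code):

```python
from ax.modelbridge.strategies.alebo import ALEBOStrategy
from ax.service.managed_loop import optimize

def dummy_objective(parameterization):
    # Placeholder objective; returns a dict of metric name -> (mean, SEM).
    value = sum(v ** 2 for v in parameterization.values())
    return {"objective": (value, 0.0)}

parameters = [
    {"name": f"x{i}", "type": "range", "bounds": [-1.0, 1.0], "value_type": "float"}
    for i in range(10)
]

alebo_strategy = ALEBOStrategy(D=10, d=4, init_size=5)

best_parameters, values, experiment, model = optimize(
    parameters=parameters,
    evaluation_function=dummy_objective,
    objective_name="objective",
    minimize=True,
    parameter_constraints=["x0 + x1 >= 0.5"],  # adding this triggers the error
    total_trials=30,
    generation_strategy=alebo_strategy,
)
```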
Top GitHub Comments
@bletham Thank you! Your comment helped a lot! Maybe I'll implement it on my own later, but currently the constraint is not essential for my optimization, so I will focus on other things first.
@Pzmijewski as you have discovered, the ALEBO implementation does not currently support parameter constraints.
In principle it can: the acquisition function optimization is already done with a whole set of linear constraints representing the high-dimensional box bounds, so adding a few more parameter constraints really wouldn't be a big deal.
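For context, BoTorch expresses linear inequality constraints for the acquisition optimizer as (indices, coefficients, rhs) triples meaning sum_i coefficients[i] * X[indices[i]] >= rhs, so an extra parameter constraint like x_0 + x_1 >= 0.5 would just be one more entry in that list. A sketch only, not the ALEBO code path:

```python
import torch

# In BoTorch, a linear inequality constraint is a triple (indices, coefficients, rhs),
# interpreted as: sum_i coefficients[i] * X[..., indices[i]] >= rhs.
# A parameter constraint x_0 + x_1 >= 0.5 would therefore be:
extra_constraint = (
    torch.tensor([0, 1]),      # parameter indices
    torch.tensor([1.0, 1.0]),  # coefficients
    0.5,                       # right-hand side
)
# It would simply be appended to the list already used for the box-bound constraints,
# e.g. inequality_constraints=[...existing bound constraints..., extra_constraint],
# when calling botorch.optim.optimize_acqf.
```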
The challenge comes with generating the embedding. With parameter constraints, a totally random embedding might be a really bad choice; in fact, there is a chance of generating an embedding that is entirely infeasible with respect to the parameter constraint (something that can't happen with box bounds alone, since the embedding is centered with respect to the box bounds).
For example, consider a 2-d problem with a 1-d embedding. The box bounds are the usual (for embedding BO) [-1, 1], and there is a parameter constraint of x_1 + x_2 >= 0.5. Suppose I randomly generate the embedding B = [-1, 1]. Every point in that embedding has x_1 + x_2 = 0, so every point violates the constraint x_1 + x_2 >= 0.5. If you sketch this out, you'll see that with this parameter constraint there is actually a relatively high chance of generating an embedding that is entirely infeasible. That would obviously be bad; in the language of the paper, we know that P_opt is 0.
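A quick numerical check of this example (using a pseudo-inverse up-projection as a stand-in for the ALEBO projection details; illustrative only):

```python
import numpy as np

# 1-d embedding of a 2-d space: B is 1 x 2. Points in the embedding are
# mapped up to the original space via the pseudo-inverse, x = pinv(B) @ y.
B = np.array([[-1.0, 1.0]])
B_pinv = np.linalg.pinv(B)  # shape (2, 1)

# Sample points along the embedding and check the constraint x_1 + x_2 >= 0.5.
ys = np.linspace(-2.0, 2.0, 101).reshape(-1, 1)
xs = ys @ B_pinv.T  # shape (101, 2)
feasible = xs.sum(axis=1) >= 0.5
print(feasible.any())  # False: every point on the embedded line has x_1 + x_2 == 0
```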
So if we have parameter constraints, we need to ensure that the embedding is generated in a way that produces a high proportion of feasible volume. This is something I've thought about a little, but not enough to be able to say what the best approach would be, or to have implemented anything, which is why it is just disallowed in the current ALEBO implementation.
One natural thing that could be done would be to generate a whole bunch of random embeddings, estimate the feasible volume of each of them (for instance using Monte Carlo sampling), and then choose the embedding with the highest feasible volume (see the sketch after this paragraph). But this is a pretty big change in how the embedding is generated, so I think we'd want to check empirically that things are still working.
An alternative approach would be to find an interior point of the high-dimensional feasible space (e.g. something like [0.75, 0.75] in the example above; in general an approximate centroid seems ideal), and then generate an embedding that is guaranteed to contain that point but is otherwise random. There are d degrees of freedom in the d-dimensional embedding; we would use one of them to force the embedding to contain a point interior to the feasible set, and the other d-1 would be set randomly as usual.
In any case, I think there is a bit of work to be done to figure out what approach actually works well, and that work hasn't been done yet.
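A rough sketch of the Monte Carlo idea (the way B is drawn, the y-sampling range, and the pinv up-projection are all simplifications relative to what ALEBO actually does; function names are made up for illustration):

```python
import numpy as np

def feasible_fraction(B, A, b, n_samples=10_000, rng=None):
    """Estimate the fraction of the embedded region that satisfies A @ x <= b
    and the box bounds [-1, 1]^D, by sampling in the embedding and projecting up."""
    rng = np.random.default_rng() if rng is None else rng
    d, D = B.shape
    B_pinv = np.linalg.pinv(B)  # D x d up-projection
    ys = rng.uniform(-np.sqrt(d), np.sqrt(d), size=(n_samples, d))
    xs = ys @ B_pinv.T  # n_samples x D points in the original space
    in_box = np.all(np.abs(xs) <= 1.0, axis=1)
    feas = np.all(xs @ A.T <= b, axis=1)
    return np.mean(in_box & feas)

def pick_embedding(D, d, A, b, n_candidates=100, rng=None):
    """Draw random embeddings and keep the one with the largest estimated feasible volume."""
    rng = np.random.default_rng() if rng is None else rng
    best_B, best_frac = None, -1.0
    for _ in range(n_candidates):
        B = rng.standard_normal((d, D))
        frac = feasible_fraction(B, A, b, rng=rng)
        if frac > best_frac:
            best_B, best_frac = B, frac
    return best_B, best_frac

# Example: the 2-d problem with constraint x_1 + x_2 >= 0.5, i.e. -x_1 - x_2 <= -0.5.
A = np.array([[-1.0, -1.0]])
b = np.array([-0.5])
B, frac = pick_embedding(D=2, d=1, A=A, b=b)
print(frac)  # embeddings whose line passes through the feasible wedge score > 0
```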
From the point of view of the code, not much needs to be done to add support for parameter constraints.
The CenteredUnitX transform needs to add an operation to transform the constraints to the [-1, 1] hypercube (note that the constraints stay linear after this transformation). That would happen here:
https://github.com/facebook/Ax/blob/d5b7de1dd99a2a1662a832a20f51b4a81e3cf34c/ax/modelbridge/transforms/centered_unit_x.py#L65-L68
and would be very similar to the code we already have for transforming constraints to the [0, 1] hypercube here, just with minor algebraic differences:
https://github.com/facebook/Ax/blob/d5b7de1dd99a2a1662a832a20f51b4a81e3cf34c/ax/modelbridge/transforms/unit_x.py#L68-L78
After that, the parameter constraints would be passed along to the ALEBO model, which would then need minor modification to use them. Obviously this would have to go:
https://github.com/facebook/Ax/blob/d5b7de1dd99a2a1662a832a20f51b4a81e3cf34c/ax/models/torch/alebo.py#L640
and in its place we would need to convert the constraint from the high-dimensional space (on x) to a constraint in the embedding (on y); since it's a linear embedding, the constraint stays linear in the embedding. We would then add the linear constraint(s) to the set of constraints here:
https://github.com/facebook/Ax/blob/d5b7de1dd99a2a1662a832a20f51b4a81e3cf34c/ax/models/torch/alebo.py#L643-L646
and that's all that would be required on the ALEBO model side.
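To make the algebra concrete, here is a hedged sketch (not the actual Ax transform code; the x = pinv(B) @ y up-projection is again a simplification) of how a constraint w @ x <= b would be rewritten after mapping each parameter from [l_i, u_i] to [-1, 1], and then into the embedding:

```python
import numpy as np

def constraint_to_centered_unit_x(w, b, lower, upper):
    """Rewrite w @ x <= b, with x_i in [lower_i, upper_i], as w_c @ x_c <= b_c for
    centered variables x_c in [-1, 1], where x = center + x_c * half_range."""
    half_range = (upper - lower) / 2.0
    center = (upper + lower) / 2.0
    w_c = w * half_range
    b_c = b - w @ center
    return w_c, b_c

def constraint_to_embedding(w_c, b_c, B):
    """Rewrite w_c @ x <= b_c as a linear constraint on the embedding y, using x = pinv(B) @ y."""
    B_pinv = np.linalg.pinv(B)  # D x d
    w_y = B_pinv.T @ w_c        # d-dimensional coefficient vector
    return w_y, b_c
```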
The initializer would also need to correctly handle the constraint; this line would be removed:
https://github.com/facebook/Ax/blob/d5b7de1dd99a2a1662a832a20f51b4a81e3cf34c/ax/models/random/alebo_initializer.py#L63
The (high-dimensional) parameter constraint would be passed along here when generating Sobol points in the high-dimensional space:
https://github.com/facebook/Ax/blob/d5b7de1dd99a2a1662a832a20f51b4a81e3cf34c/ax/models/random/alebo_initializer.py#L65-L70
and then (a couple of steps later), when we filter to points that respect the box bounds, we would also apply the (high-dimensional) parameter constraint:
https://github.com/facebook/Ax/blob/d5b7de1dd99a2a1662a832a20f51b4a81e3cf34c/ax/models/random/alebo_initializer.py#L78-L79
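As a rough illustration of that filtering step (not the ALEBO initializer itself), the candidate points would be kept only if they satisfy both the box bounds and the high-dimensional constraint:

```python
import numpy as np

def filter_feasible(X, A, b):
    """Keep only rows of the (n, D) candidate array X that lie inside the
    [-1, 1]^D box and satisfy the linear constraint A @ x <= b."""
    in_box = np.all(np.abs(X) <= 1.0, axis=1)
    satisfies_constraint = np.all(X @ A.T <= b, axis=1)
    return X[in_box & satisfies_constraint]
```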
But given the open questions around generating the embedding in the presence of parameter constraints, we haven’t yet pushed on trying to add support for this (we also haven’t had a need yet on the application side).