How to force models to explore more
Hi,
I would like to thank everyone who contributed to this great library. It makes it easy to apply Bayesian optimization with state-of-the-art algorithms.
I have implemented Ax for my single-objective design optimization study. Here is the code snippet:

```python
from ax.service.ax_client import AxClient
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models

def objective_function(x):
    # region of f calculation
    # gives 'ErrorDesign' in case of error, otherwise float
    return {"f": (f, 0.0)}

gs = GenerationStrategy(
    steps=[
        GenerationStep(model=Models.SOBOL, num_trials=20),
        GenerationStep(model=Models.GPMES, num_trials=-1),
    ]
)

ax_client = AxClient(generation_strategy=gs)
ax_client.create_experiment(
    name="single_objective_design",
    parameters=[
        {"name": "x1",  "type": "range",  "bounds": [0.2, 1.0],  "value_type": "float"},
        {"name": "x2",  "type": "range",  "bounds": [2.0, 6.0],  "value_type": "float"},
        {"name": "x3",  "type": "range",  "bounds": [0.2, 1.0],  "value_type": "float"},
        {"name": "x4",  "type": "range",  "bounds": [1.7, 8.7],  "value_type": "float"},
        {"name": "x5",  "type": "range",  "bounds": [0, 25],     "value_type": "int"},
        {"name": "x6",  "type": "range",  "bounds": [4.0, 12.0], "value_type": "float"},
        {"name": "x7",  "type": "range",  "bounds": [2.0, 5.0],  "value_type": "float"},
        {"name": "x8",  "type": "range",  "bounds": [0.2, 1.0],  "value_type": "float"},
        {"name": "x9",  "type": "range",  "bounds": [80., 95.],  "value_type": "float"},
        {"name": "x10", "type": "range",  "bounds": [0, 25],     "value_type": "int"},
        {"name": "x11", "type": "choice", "values": ["4", "8", "12", "16"], "value_type": "str"},
        {"name": "x12", "type": "choice", "values": ["4", "8", "12", "16"], "value_type": "str"},
    ],
    objective_name="f",
    minimize=True,
)

for _ in range(200):
    trial_params, trial_index = ax_client.get_next_trial()
    data = objective_function(trial_params)
    if data["f"][0] == "ErrorDesign":
        ax_client.log_trial_failure(trial_index=trial_index)
    else:
        ax_client.complete_trial(trial_index=trial_index, raw_data=data["f"])
```
I have 12 design parameters (10 range, 2 choice) to optimize, and I use the Service API with the generation strategies ([sobol + gpmes], [sobol + gpei], [sobol + botorch], [sobol + gpkg]) as shown in the snippet. I am using Python 3.8 and the latest versions of the botorch, gpytorch, and torch libraries.

Below is the history plot showing objective values versus iteration number for the different models, after running the code with each strategy. I have also added the history of the design parameters for the GPEI model.

My question is about the non-explorative search behavior of the models after the 20 Sobol iterations. As you can see from the objective history figure, successive designs have very close objective values. I would expect the code to do more exploration, since the search space is quite large, but each model quickly converges to some local minimum and keeps searching around it. For reference, the global minimum of the objective function is around -3.6.
I have tried the following, but the behavior is not much affected:
- Repeated runs with different Sobol initializations
- Increasing the number of Sobol trials
- Increasing num_fantasies, num_mv_samples, num_y_samples, and candidate_size

Any help in pushing these generation strategies toward more exploration would be appreciated. Thanks in advance.
Issue Analytics
- Created 3 years ago
- Comments:14 (5 by maintainers)
Top GitHub Comments
Hi @samueljamesbell,

I recently updated my setup after the `BOTORCH_MODULAR` feature was added to Ax. I have added the new setup below and recommend using it.

To run the above code, an input constructor for the acquisition class `qNegIntegratedPosteriorVariance` must be registered in the `botorch/acquisition/input_constructors.py` file of the `botorch` library, so the code below should also be appended to that file. A dimension problem may occur because the `objective` variable is treated as multi-output in `qNegIntegratedPosteriorVariance` when it is not `None`; in the registration code it is set to `None` for the single-output case. Also, `mc_points` can be given in N x D format for this setup. I hope this helps, and the developers may correct me if anything is wrong.

Yeah, qNIPV is agnostic to the direction: the goal is to minimize a global measure of uncertainty of the model, so there is no better or worse w.r.t. the function values.