Multiple Equality Constraints
Hello,
I've been trying out Ax and I really like it. I was trying to create a three-stage regression model and have Ax infer the weight assigned to each regressor. I am not able to create a constraint that uses three parameters and ties their sum to a fixed value.
I have used the following code:
from ax.service.ax_client import AxClient
from ax.utils.measurement.synthetic_functions import branin
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from axtrainer.data import DATA_SET_DICT
from sklearn.linear_model import LinearRegression
# from axtrainer.logger import *
from ax import ChoiceParameter, ParameterType
import xgboost as xgb
import numpy as np
from sklearn.metrics import mean_squared_error, explained_variance_score
# from axtrainer.trainer import DATA_SET_DICT
import os

PROBLEM_TYPE = os.environ.get("PROBLEM_TYPE", "REGRESSION")

# Helper function for parameter handling
def make_parameter(name, ptype, bounds, value_type):
    '''Creates a parameter dictionary to be used in ax.create_experiment'''
    if ptype == "range":
        return dict(name=name, type=ptype, bounds=bounds, value_type=value_type)
    elif ptype == "choice":
        return dict(name=name, type=ptype, values=bounds, value_type=value_type)

# Target cost function whose parameters we optimize with AxClient
def train_and_return_score(w1=1 / 3.0, w2=1 / 3.0, w3=1 / 3.0, **kwargs):
    '''Convenience function to train the models and return the score'''
    if PROBLEM_TYPE == "REGRESSION":
        Model = xgb.XGBRegressor
    elif PROBLEM_TYPE == "CLASSIFICATION":
        Model = xgb.XGBClassifier
    X_train, X_test, y_train, y_test = (
        DATA_SET_DICT["X_train"],
        DATA_SET_DICT["X_test"],
        DATA_SET_DICT["y_train"],
        DATA_SET_DICT["y_test"],
    )
    # Instantiate the models with keyword arguments
    estimators = [
        RandomForestRegressor(n_estimators=30),
        Model(n_jobs=-1, gpu_id=0, **kwargs),
    ]
    for model in estimators:
        model.fit(X_train, y_train)
    # Stack each model's predictions into a (n_models, n_samples) array
    preds = np.array([model.predict(X_test) for model in estimators])
    # Weighted sum of the models' predictions
    preds = np.array((w1, w2)) @ preds
    _score = explained_variance_score(y_test, preds)
    # print("MODEL SCORE: %s " % _score)
    return 1 - _score

PARAMETERS = [
    make_parameter("w1", "range", [0, .99], "float"),
    make_parameter("w2", "range", [0, .99], "float"),
]

CONSTRAINTS = ["w1 + w2 = 1.0"]
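The parameters and the constraint are then passed to AxClient.create_experiment and the trials are run through the Service API loop, roughly like this (a sketch of the wiring only; the experiment name, objective name, and trial budget are illustrative, and the real call lives in weighted_model.py's main() referenced in the traceback below):

def main():
    ax_client = AxClient()
    ax_client.create_experiment(
        name="weighted_model",              # illustrative name
        parameters=PARAMETERS,
        parameter_constraints=CONSTRAINTS,  # this is where the constraint string is parsed
        objective_name="loss",              # illustrative objective name
        minimize=True,
    )
    for _ in range(20):                     # illustrative trial budget
        params, trial_index = ax_client.get_next_trial()
        ax_client.complete_trial(
            trial_index=trial_index,
            raw_data=train_and_return_score(**params),
        )
    return ax_client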
Let's say I wanted to have more than two parameters interact with each other, such as having three weights and three models. I know that there are other ways to do what I am specifically doing, but I am trying to get an understanding of tuning with Ax. Will the following be possible as a constraint: w1 + w2 + w3 == 1.0? Will this be possible using Ax anytime soon? Is there a limitation of Bayesian optimization that will not allow this functionality?
When I try to do something like this, I get the following error:
Traceback (most recent call last):
  File "run.py", line 6, in <module>
    ax, b , m = main()
  File "/home/david/Desktop/ax-container/app/axtrainer/weighted_model.py", line 74, in main
    minimize=True,
  File "/home/david/miniconda3/envs/threeseven/lib/python3.7/site-packages/ax/service/ax_client.py", line 115, in create_experiment
    outcome_constraints=outcome_constraints,
  File "/home/david/miniconda3/envs/threeseven/lib/python3.7/site-packages/ax/service/utils/instantiation.py", line 225, in make_experiment
    else [constraint_from_str(c, parameter_map) for c in parameter_constraints],
  File "/home/david/miniconda3/envs/threeseven/lib/python3.7/site-packages/ax/service/utils/instantiation.py", line 225, in <listcomp>
    else [constraint_from_str(c, parameter_map) for c in parameter_constraints],
  File "/home/david/miniconda3/envs/threeseven/lib/python3.7/site-packages/ax/service/utils/instantiation.py", line 160, in constraint_from_str
    "Parameter constraint should be of form `metric_name` >= `other_metric_name` "
AssertionError: Parameter constraint should be of form `metric_name` >= `other_metric_name` for order constraints or `metric_name` + `other_metric_name` >= x, where x is a float bound, and acceptable comparison operators are >= and <=.
I am using Python 3.7 on Ubuntu in an Anaconda environment.
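For reference, based on that assertion message the parser accepts order constraints and two-parameter sum constraints with >= or <= only, so something along these lines parses, while the equality form does not (a sketch of the grammar inferred from the error, not an exhaustive description):

# Accepted per the assertion message: order and sum constraints with >= or <=
parameter_constraints = ["w1 >= w2", "w1 + w2 <= 1.0"]
# Rejected: the "=" / equality form, which raises the AssertionError above
# parameter_constraints = ["w1 + w2 = 1.0"]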
Top GitHub Comments
Thank you for raising this, @dkatz23238! Currently working on this, actually : ) We’ll include 2+ parameter constraints in the upcoming version. It’s not at all a system limitation; it’s just how the string parsing is currently organized for the Service API.
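Once that lands, a sum constraint over more than two parameters should be expressible as a single string, along these lines (the exact accepted form here is an assumption, so check it against the release you install):

# Assumed form once 2+ parameter constraints are supported (inequality, not equality):
CONSTRAINTS = ["w1 + w2 + w3 <= 1.0"]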
As for an equality constraint (“x1 + x2 == 1.0” as opposed to “x1 + x2 >= 1.0”), that’s a bit of a different story, as tight constraints like that may pose a difficulty in some of the modeling. One solution would be to optimize over a subspace: only include x1 in your search space and set x2 = 1 - x1 in evaluation. Similarly, if you had 3 parameters (x1, x2, and x3), you could include x1 and x2 in the search space and set x3 = 1 - (x1 + x2) when evaluating trials.
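For example, here is a minimal sketch of that reparameterization on top of the snippet above (the evaluate wrapper is illustrative, and the w1 + w2 <= 1.0 constraint keeps the derived weight non-negative):

# Search over w1 and w2 only; derive w3 inside the evaluation function.
PARAMETERS = [
    make_parameter("w1", "range", [0, .99], "float"),
    make_parameter("w2", "range", [0, .99], "float"),
]
CONSTRAINTS = ["w1 + w2 <= 1.0"]  # keeps w3 = 1 - (w1 + w2) non-negative

def evaluate(parameters):
    w1 = parameters["w1"]
    w2 = parameters["w2"]
    w3 = 1.0 - (w1 + w2)  # the equality w1 + w2 + w3 == 1.0 now holds by construction
    # train_and_return_score would need a third estimator to actually use w3
    return train_and_return_score(w1=w1, w2=w2, w3=w3)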
Closing, since this should be in our current release (0.1.2)!