target = self._cache[_hashable(x)] KeyError
Hello,
I was trying to use the optimizer to find the best hyperparameters for XGBRegressor; however, I have encountered the following error, and I am not sure how to deal with it.
Traceback (most recent call last):
  File "C:\Users\User\Anaconda3\envs\datascience\lib\site-packages\bayes_opt\target_space.py", line 191, in probe
    target = self._cache[_hashable(x)]
KeyError: (0.449816047538945, 0.09507143064099162, 0.14907884894416698, 4.79597545259111, 1.7340279606636548, 1655.9945203362026, 0.05808361216819946, 0.8661761457749352, 0.8005575058716043)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm Community Edition with Anaconda plugin 2019.1.3\plugins\python-ce\helpers\pydev\pydevd.py", line 1434, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm Community Edition with Anaconda plugin 2019.1.3\plugins\python-ce\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "C:/Users/User/PycharmProjects/BayesOpt/asd.py", line 64, in <module>
    optimize_xgboost(data=X_arr, targets=y_arr)
  File "C:/Users/User/PycharmProjects/BayesOpt/asd.py", line 61, in optimize_xgboost
    optimizer.maximize(n_iter=10)
  File "C:\Users\User\Anaconda3\envs\datascience\lib\site-packages\bayes_opt\bayesian_optimization.py", line 185, in maximize
    self.probe(x_probe, lazy=False)
  File "C:\Users\User\Anaconda3\envs\datascience\lib\site-packages\bayes_opt\bayesian_optimization.py", line 116, in probe
    self._space.probe(params)
  File "C:\Users\User\Anaconda3\envs\datascience\lib\site-packages\bayes_opt\target_space.py", line 195, in probe
    self.register(x, target)
  File "C:\Users\User\Anaconda3\envs\datascience\lib\site-packages\bayes_opt\target_space.py", line 167, in register
    self._target = np.concatenate([self._target, [target]])
  File "<__array_function__ internals>", line 5, in concatenate
ValueError: all the input arrays must have same number of dimensions, but the array at index 0 has 1 dimension(s) and the array at index 1 has 2 dimension(s)
Here is the code I have used. I would be very grateful for any kind of help.
from bayes_opt import BayesianOptimization
from xgboost import XGBRegressor
from sklearn.model_selection import cross_val_score
import pandas as pd

SEED = 42

X = pd.read_pickle('X.pkl')
y = pd.read_pickle('y.pkl')
X_arr = X.to_numpy()
y_arr = y.to_numpy()


def xgboost_cv(colsample_bytree, gamma, learning_rate, max_depth, min_child_weight,
               n_estimators, subsample, reg_alpha, reg_lambda, data, targets):
    estimator = XGBRegressor(random_state=SEED, n_jobs=-1, colsample_bytree=colsample_bytree, gamma=gamma,
                             learning_rate=learning_rate, max_depth=max_depth, min_child_weight=min_child_weight,
                             n_estimators=n_estimators, subsample=subsample, reg_alpha=reg_alpha,
                             reg_lambda=reg_lambda)
    cval = cross_val_score(estimator, data, targets, scoring='neg_mean_squared_error', cv=4)
    return -cval


def optimize_xgboost(data, targets):
    def xgboost_crossval(colsample_bytree, gamma, learning_rate, max_depth, min_child_weight,
                         n_estimators, subsample, reg_alpha, reg_lambda):
        return xgboost_cv(
            colsample_bytree=colsample_bytree,
            gamma=gamma,
            learning_rate=learning_rate,
            max_depth=int(max_depth),
            min_child_weight=min_child_weight,
            n_estimators=int(n_estimators),
            subsample=subsample,
            reg_alpha=reg_alpha,
            reg_lambda=reg_lambda,
            data=data,
            targets=targets,
        )

    optimizer = BayesianOptimization(
        f=xgboost_crossval,
        pbounds={
            'colsample_bytree': (0.3, 0.7),
            'gamma': (0, 0.1),
            'learning_rate': (0.01, 0.2),
            'max_depth': (3, 6),
            'min_child_weight': (1.5, 3),
            'n_estimators': (1500, 2500),
            'subsample': (0.5, 1),
            'reg_alpha': (0, 1),
            'reg_lambda': (0, 1),
        },
        random_state=SEED,
        verbose=2
    )
    optimizer.maximize(n_iter=10)
    print(f'Final result: {optimizer.max}')


optimize_xgboost(data=X_arr, targets=y_arr)
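For what it's worth, the exception chain here is a little misleading: the `KeyError` is bayes_opt's normal cache-miss path (an unseen point is probed and then registered), and the actual failure is the `ValueError` raised while registering the result. That appears to happen because `xgboost_cv` returns the full array of per-fold scores from `cross_val_score` rather than a single scalar, so `TargetSpace.register` tries to concatenate a 2-D value onto its 1-D target history. A minimal numpy-only sketch of the mismatch and of the usual fix (reducing the fold scores to their mean); the variable names are illustrative, not from bayes_opt:

```python
import numpy as np

# cross_val_score with cv=4 returns one score per fold, shape (4,).
fold_scores = np.array([-10.2, -9.8, -11.0, -10.5])

# Returning the raw array makes the registered target 2-D once it is
# wrapped in a list, which is exactly what TargetSpace.register does:
history = np.array([1.0, 2.0])            # existing 1-D target history
try:
    np.concatenate([history, [-fold_scores]])  # [-fold_scores] has shape (1, 4)
except ValueError as exc:
    print("reproduces the crash:", exc)

# Reducing the fold scores to a single scalar keeps the history 1-D:
scalar_target = fold_scores.mean()
history = np.concatenate([history, [scalar_target]])
print(history.shape)                      # still a 1-D history
```

In the code above this would mean returning `cval.mean()` from `xgboost_cv`. Note also that `scoring='neg_mean_squared_error'` already negates the MSE and `BayesianOptimization` maximizes its objective, so the extra minus sign in `return -cval` would additionally invert the optimization direction.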
Issue Analytics
- Created: 3 years ago
- Comments: 7 (1 by maintainers)
Top GitHub Comments
Thank you very much for your reply. I have uploaded my question on GitHub, and I hope it can be solved. https://github.com/fmfn/BayesianOptimization/issues/352
Almost the same error occurs in my program. TAT