Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might look while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Batch sizes do not work in Bayesian Optimization Loop

See original GitHub issue

The code here throws an error for me when batch size > 1.

I placed the following code into a Python file and ran it.

import GPy
import numpy as np

from emukit.bayesian_optimization.acquisitions import ExpectedImprovement
from emukit.bayesian_optimization.loops import BayesianOptimizationLoop
from emukit.core.continuous_parameter import ContinuousParameter
from emukit.core.loop import FixedIterationsStoppingCondition, UserFunctionWrapper
from emukit.core.parameter_space import ParameterSpace
from emukit.model_wrappers.gpy_model_wrappers import GPyModelWrapper

def f(x):
    return x ** 2

batch_size = 3
n_init = 5
n_iterations = 5

x_init = np.random.rand(n_init, 1)
y_init = np.random.rand(n_init, 1)

# Make GPy model
gpy_model = GPy.models.GPRegression(x_init, y_init)
model = GPyModelWrapper(gpy_model)

space = ParameterSpace([ContinuousParameter("x", 0, 1)])
acquisition = ExpectedImprovement(model)

# Make loop and collect points
bo = BayesianOptimizationLoop(model=model, space=space, acquisition=acquisition, batch_size=batch_size)
bo.run_loop(UserFunctionWrapper(f), FixedIterationsStoppingCondition(n_iterations))

# Check we got the correct number of points
assert bo.loop_state.X.shape[0] == n_iterations * batch_size + n_init

# Check the obtained results
results = bo.get_results()

assert results.minimum_location.shape[0] == 1
assert results.best_found_value_per_iteration.shape[0] == n_iterations * batch_size + n_init

It gives me the following error:

~/opt/miniconda3/envs/delivery-sim/lib/python3.9/site-packages/emukit/bayesian_optimization/acquisitions/ RuntimeWarning: divide by zero encountered in log

Traceback (most recent call last):
  File "~/Dev/eco-deliveries/Delivery_problem/", line 31, in <module>
    bo.run_loop(UserFunctionWrapper(f), FixedIterationsStoppingCondition(n_iterations))
  File "/Users/Alex/opt/miniconda3/envs/delivery-sim/lib/python3.9/site-packages/emukit/core/loop/", line 92, in run_loop
    new_x = self.candidate_point_calculator.compute_next_points(self.loop_state, context)
  File "~/opt/miniconda3/envs/delivery-sim/lib/python3.9/site-packages/emukit/bayesian_optimization/", line 85, in compute_next_points
    lipschitz_constant = _estimate_lipschitz_constant(self.parameter_space, self.model)
  File "~/opt/miniconda3/envs/delivery-sim/lib/python3.9/site-packages/emukit/bayesian_optimization/", line 112, in _estimate_lipschitz_constant
    lipschitz_constant =[0]
TypeError: 'float' object is not subscriptable

The error is thrown at this line because the value being indexed is a plain float, which is not subscriptable. I’m running SciPy version 1.8.0.
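The mismatch is easy to see in isolation: SciPy 1.8.0 made some optimizers report the objective value as a bare float rather than a 1-element array, and indexing a float with `[0]` raises exactly the `TypeError` shown in the traceback. A minimal, SciPy-free illustration (the variable names are mine):

```python
# SciPy >= 1.8 style: the objective value is a bare float.
fun_new = 2.5
# Older SciPy style: a 1-element array-like.
fun_old = [2.5]

print(fun_old[0])  # indexing a 1-element sequence works fine: 2.5
try:
    fun_new[0]     # indexing a float fails
except TypeError as err:
    print(err)     # 'float' object is not subscriptable
```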

Any thoughts?

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 6

Top GitHub Comments

apaleyes commented, Feb 18, 2022

Thanks @iRove108, managed to reproduce now. Indeed the bug only concerns SciPy 1.8.0; the code works with 1.7.3.
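Based on the two versions named in the thread (1.8.0 broken, 1.7.3 working), a stdlib-only check like the following can flag an affected environment up front instead of failing mid-loop. The helper name is mine, and later SciPy releases may be fine once the emukit fix ships, so treat this as a sketch:

```python
def scipy_batch_bug_affected(version: str) -> bool:
    """Return True for SciPy versions where emukit batch sizes > 1
    are known to fail (1.8.0 per the thread; 1.7.3 works)."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= (1, 8, 0)

print(scipy_batch_bug_affected("1.7.3"))  # False
print(scipy_batch_bug_affected("1.8.0"))  # True
```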

apaleyes commented, Feb 26, 2022

The fix is now merged, thanks @iRove108.
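The merged patch itself isn't reproduced here, but a version-robust way to handle SciPy's changed return type is to coerce the objective value to an array before indexing. A sketch, where `objective_value_as_scalar` is a hypothetical helper rather than emukit's actual code:

```python
import numpy as np

def objective_value_as_scalar(fun):
    """Coerce an optimizer objective value to a Python float, whether
    SciPy returned a bare float (>= 1.8) or a 1-element array (older)."""
    return float(np.atleast_1d(fun)[0])

print(objective_value_as_scalar(2.5))              # 2.5
print(objective_value_as_scalar(np.array([2.5])))  # 2.5
```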

