
Gradients for Acquisition are wrong when `normalizer=True`

See original GitHub issue

Hi,

It appears that the gradients of the acquisition functions are wrong when normalizer=True is used in the model definition. This is because model.predictive_gradients in GPy (which is called by model.get_prediction_gradients in Emukit) does not account for normalization. I raised this issue here and made a pull request to fix it.
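The chain-rule issue described above can be illustrated without GPy at all. In this plain-numpy sketch (a toy stand-in, not Emukit's or GPy's API), a "normalized-space" mean function is denormalized as m(x) = mu + sigma * m_norm(x); its true gradient therefore picks up a factor of sigma, which is exactly what an un-rescaled predictive gradient misses:

```python
import numpy as np

# Toy illustration (not GPy): if predictions are denormalized as
#   m(x) = mu + sigma * m_norm(x),
# then dm/dx = sigma * dm_norm/dx. Returning dm_norm/dx unscaled,
# as an implementation that ignores the normalizer would, is wrong.
mu, sigma = 2.0, 3.0

def m_norm(x):
    # Some smooth "normalized-space" mean function.
    return np.sin(x)

def dm_norm(x):
    return np.cos(x)

def m(x):
    # Denormalized prediction.
    return mu + sigma * m_norm(x)

x0 = 0.7
eps = 1e-6
# Finite-difference gradient of the denormalized prediction.
fd = (m(x0 + eps) - m(x0 - eps)) / (2 * eps)

wrong = dm_norm(x0)          # gradient without rescaling (the bug)
right = sigma * dm_norm(x0)  # correctly rescaled gradient
```

Here `fd` agrees with `right` to finite-difference precision, while `wrong` is off by the factor `sigma`.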

I don’t think Emukit enforces or recommends using normalizer=False anywhere. This is problematic because it is up to the user to define their own model “upstream” of the optimization loop. I suspect that many people are tempted to use normalizer=True without knowing that the gradients of their acquisition function will be wrong.
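Until a fix lands, one possible workaround (a sketch under the assumption that the bug only affects normalizer=True) is to standardize Y yourself and keep normalizer=False, so GPy's predictions and predictive gradients stay consistent. Shown here with plain numpy; the commented-out GPRegression call marks where the real model construction would go:

```python
import numpy as np

# Workaround sketch: standardize the targets manually instead of relying
# on GPy's normalizer, then build the model with normalizer=False.
Y = np.random.rand(15, 1)
mean, std = Y.mean(), Y.std()
Y_norm = (Y - mean) / std

# model = GPy.models.GPRegression(X, Y_norm, normalizer=False)  # hypothetical usage
```

Note that any quantities derived from the model's output (e.g. the incumbent best value) must then be interpreted in the standardized scale, or mapped back with mean + std * prediction.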

If the pull request I made is accepted, then there is nothing to do except to tell people that they should use the latest (devel?) version of GPy.

If I am missing anything, please let me know.

Thanks, Antoine


import numpy as np
import GPy
from GPy.models import GradientChecker
from emukit.model_wrappers.gpy_model_wrappers import GPyModelWrapper
from emukit.bayesian_optimization.acquisitions import ProbabilityOfImprovement


M, Q = 15, 3
X = np.random.rand(M, Q)
Y = np.random.rand(M, 1)
x = np.random.rand(1, Q)

# Build a GPy model with normalization enabled, then wrap it for Emukit.
model = GPy.models.GPRegression(X=X, Y=Y, normalizer=True)
emukit_model = GPyModelWrapper(model)
acq = ProbabilityOfImprovement(emukit_model)

# Compare the analytic acquisition gradients against finite differences.
# This check fails with normalizer=True because predictive_gradients
# does not account for the normalization.
g = GradientChecker(lambda x: acq.evaluate_with_gradients(x)[0],
                    lambda x: acq.evaluate_with_gradients(x)[1],
                    x, 'x')
assert g.checkgrad()

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Comments: 8

Top GitHub Comments

AnthonyLarroque commented on Mar 9, 2020 (1 reaction)

Hi Antoine,

Thank you for advising us and for the fix!

ablancha commented on Mar 9, 2020 (0 reactions)

FYI: #806 was merged to GPy’s devel.


