
Matern vs RBF input_dim and active-dimensions: operands could not be broadcasted together

See original GitHub issue

I am currently trying to use GPy to implement additive kernels. I have two different versions.

This is the first version:

import numpy as np
from GPy.kern.src.rbf import RBF

active_kernel = RBF(
    input_dim=active_dimensions,
    variance=1. if k_variance is None else k_variance,
    lengthscale=1.5 if k_lengthscales is None else k_lengthscales,  # 0.5,
    ARD=True,
    active_dims=np.arange(active_dimensions),
    name="active_subspace_kernel"
)
self.kernel = active_kernel

self.kernel += RBF(
    input_dim=1,
    variance=2.,
    lengthscale=0.5, # 0.5,
    ARD=True,
    active_dims=[active_dimensions + 1],
    name="passive_subspace_kernel_dim_" + str(i)
)

This is the second version. The ONLY difference is that the first kernel is a Matern32 instead of an RBF:

import numpy as np
from GPy.kern.src.rbf import RBF
from GPy.kern.src.sde_matern import Matern32

active_kernel = Matern32(
    input_dim=active_dimensions,
    variance=1. if k_variance is None else k_variance,
    lengthscale=1.5 if k_lengthscales is None else k_lengthscales,  # 0.5,
    ARD=True,
    active_dims=np.arange(active_dimensions),
    name="active_subspace_kernel"
)
self.kernel = active_kernel

self.kernel += RBF(
    input_dim=1,
    variance=2.,
    lengthscale=0.5, # 0.5,
    ARD=True,
    active_dims=[active_dimensions + 1],
    name="passive_subspace_kernel_dim_" + str(i)
)

However, when I set active_dimensions=2, the RBF version runs without any problems, whereas the Matern version gives the following error:

  File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 181, in add_data
    self.set_data(x, y, append=True)
  File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 371, in set_data
    k_lengthscales=l)
  File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 109, in create_gp_and_kernels
    self.create_gp()
  File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 95, in create_gp
    calculate_gradients=True
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/parameterized.py", line 53, in __call__
    self.initialize_parameter()
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/parameter_core.py", line 337, in initialize_parameter
    self.trigger_update()
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/updateable.py", line 79, in trigger_update
    self._trigger_params_changed(trigger_parent)
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/parameter_core.py", line 134, in _trigger_params_changed
    self.notify_observers(None, None if trigger_parent else -np.inf)
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/observable.py", line 91, in notify_observers
    [callble(self, which=which) for _, _, callble in self.observers]
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/observable.py", line 91, in <listcomp>
    [callble(self, which=which) for _, _, callble in self.observers]
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/parameter_core.py", line 508, in _parameters_changed_notification
    self.parameters_changed()
  File "/Users/davidal/febo/febo/models/gpy.py", line 124, in parameters_changed
    self.kern.update_gradients_full(self.grad_dict['dL_dK'], self.X)
  File "/Users/davidal/febo/febo/models/gpy.py", line 76, in update_gradients_full
    self._dK_dr_via_X = self.dK_dr_via_X(X, X2)
  File "<decorator-gen-138>", line 2, in dK_dr_via_X
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 283, in g
    return cacher(*args, **kw)
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 179, in __call__
    new_output = self.operation(*args, **kw)
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/GPy/kern/src/stationary.py", line 122, in dK_dr_via_X
    return self.dK_dr(self._scaled_dist(X, X2))
  File "<decorator-gen-140>", line 2, in _scaled_dist
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 283, in g
    return cacher(*args, **kw)
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 179, in __call__
    new_output = self.operation(*args, **kw)
  File "/Users/davidal/miniconda3/lib/python3.6/site-packages/GPy/kern/src/stationary.py", line 165, in _scaled_dist
    return self._unscaled_dist(X/self.lengthscale, X2)
ValueError: operands could not be broadcast together with shapes (1,3) (2,) 

Do I interpret all parameters correctly? I also get the following warning:

/Users/davidal/miniconda3/lib/python3.6/site-packages/GPy/core/gp.py:87: UserWarning:Your kernel has a different input dimension 2 then the given X dimension 3. Be very sure this is what you want and you have not forgotten to set the right input dimenion in your kernel

which makes me think that I have not fully understood the input shape. However, it should be correct: the input has 3 dimensions overall, and internally these are split between the two kernels, right?
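For context (an editorial note, not part of the original issue): the ValueError at the bottom of the traceback comes from NumPy broadcasting. In `_scaled_dist`, GPy divides `X` by `self.lengthscale` elementwise, which fails when the number of columns actually handed to the kernel does not match the number of ARD lengthscales. A minimal sketch of the failing operation, assuming a (1, 3) slice of X meeting a 2-element lengthscale:

```python
import numpy as np

X = np.ones((1, 3))          # a row of X with 3 columns
lengthscale = np.ones(2)     # ARD lengthscales of a 2-dimensional kernel

# Elementwise division aligns the trailing axes: 3 vs 2 is incompatible.
try:
    X / lengthscale
    err_msg = None
except ValueError as e:
    err_msg = str(e)
print(err_msg)  # operands could not be broadcast together with shapes (1,3) (2,)

# With one lengthscale per column, the division succeeds:
scaled = X / np.ones(3)
print(scaled.shape)  # (1, 3)
```

So the shapes `(1,3) (2,)` in the error say that the full 3-column X reached a kernel that only has 2 lengthscales, i.e. the active_dims slicing did not happen on that code path.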

Any ideas and help are appreciated! 😃

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 12 (7 by maintainers)

Top GitHub Comments

2 reactions
mzwiessele commented, Jun 26, 2018

What are you trying to achieve exactly?

input_dim is the number of dimensions the kernel works on; active_dims specifies which dimensions those are. If the kernel works on the first 3 dimensions, input_dim=3 and active_dims=range(3). You should be able to mix and match: kernels can overlap on dimensions, or each work on their own part of X.
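To illustrate the maintainer's point with plain NumPy (an editorial sketch, not from the thread): with a 3-column X, zero-based column indices run 0..2, so `active_dims=[active_dimensions + 1]` with `active_dimensions=2` points at a nonexistent column 3. The passive kernel likely wants `active_dims=[active_dimensions]`, i.e. `[2]`:

```python
import numpy as np

X = np.ones((5, 3))                 # 5 points, 3 input dimensions (columns 0, 1, 2)
active_dimensions = 2

active_dims_kernel1 = np.arange(active_dimensions)  # columns [0, 1]
active_dims_kernel2 = [active_dimensions]           # column [2] -- NOT [active_dimensions + 1]

print(X[:, active_dims_kernel1].shape)  # (5, 2): what the 2-dim ARD kernel sees
print(X[:, active_dims_kernel2].shape)  # (5, 1): what the 1-dim kernel sees

# The original active_dims=[active_dimensions + 1] = [3] indexes past the last column:
try:
    X[:, [active_dimensions + 1]]
    out_of_bounds = False
except IndexError as e:
    out_of_bounds = True
    print("IndexError:", e)
```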

On 26. Jun 2018, at 22:33, David notifications@github.com wrote:

Thanks a lot, but it does not solve it 😕


1 reaction
mzwiessele commented, Jun 26, 2018

What is the value of active_dimensions?

On 26. Jun 2018, at 22:48, David notifications@github.com wrote:

I want the first two dimensions of the X to be calculated by the first kernel (Matern32), and the third dimension of X to be calculated by the second kernel (RBF). That should be correctly implemented, no?
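The split described above can be sketched without GPy (an editorial illustration; the Matern32 and RBF formulas below are the standard stationary-kernel definitions, and the column split is the one the poster describes — first two columns to the Matern32, third column to the RBF, kernels summed):

```python
import numpy as np

def sq_dists(A, B):
    """Pairwise squared Euclidean distances between rows of A and B."""
    d = A[:, None, :] - B[None, :, :]
    return np.sum(d ** 2, axis=-1)

def rbf(A, B, variance=2.0, lengthscale=0.5):
    return variance * np.exp(-0.5 * sq_dists(A / lengthscale, B / lengthscale))

def matern32(A, B, variance=1.0, lengthscale=1.5):
    r = np.sqrt(sq_dists(A / lengthscale, B / lengthscale))
    return variance * (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

X = np.random.default_rng(0).normal(size=(5, 3))

# Matern32 on columns 0-1, RBF on column 2, summed into one additive kernel:
K = matern32(X[:, :2], X[:, :2]) + rbf(X[:, 2:], X[:, 2:])
print(K.shape)  # (5, 5)
```

Note that each sub-kernel's lengthscale count matches the number of columns it receives; the broadcast error in the traceback indicates that this slicing never happened on the gradient path.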


Read more comments on GitHub >

Top Results From Across the Web

python numpy ValueError: operands could not be broadcast ...
This operation is called broadcasting. Dimensions, where size is 1 or which are missing, can be used in broadcasting. In the example above...
sklearn.gaussian_process.kernels.Matern
The class of Matern kernels is a generalization of the RBF. It has an additional parameter ν which controls the smoothness of...
