Matern vs RBF input_dim and active-dimensions: operands could not be broadcast together
See original GitHub issue

I am currently trying to use GPy to implement additive kernels. I have two different versions.

This is the first version:
```python
import numpy as np
from GPy.kern.src.rbf import RBF

active_kernel = RBF(
    input_dim=active_dimensions,
    variance=1. if k_variance is None else k_variance,
    lengthscale=1.5 if k_lengthscales is None else k_lengthscales,
    ARD=True,
    active_dims=np.arange(active_dimensions),
    name="active_subspace_kernel"
)
self.kernel = active_kernel
self.kernel += RBF(
    input_dim=1,
    variance=2.,
    lengthscale=0.5,
    ARD=True,
    active_dims=[active_dimensions + 1],
    name="passive_subspace_kernel_dim_" + str(i)
)
```
This is the second version. The ONLY difference is that the active kernel uses Matern32 instead of RBF:
```python
import numpy as np
from GPy.kern.src.rbf import RBF
from GPy.kern.src.sde_matern import Matern32

active_kernel = Matern32(
    input_dim=active_dimensions,
    variance=1. if k_variance is None else k_variance,
    lengthscale=1.5 if k_lengthscales is None else k_lengthscales,
    ARD=True,
    active_dims=np.arange(active_dimensions),
    name="active_subspace_kernel"
)
self.kernel = active_kernel
self.kernel += RBF(
    input_dim=1,
    variance=2.,
    lengthscale=0.5,
    ARD=True,
    active_dims=[active_dimensions + 1],
    name="passive_subspace_kernel_dim_" + str(i)
)
```
However, when I set active_dimensions=2, the RBF version runs without any problems, while the Matern version fails with the following error:
File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 181, in add_data
self.set_data(x, y, append=True)
File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 371, in set_data
k_lengthscales=l)
File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 109, in create_gp_and_kernels
self.create_gp()
File "/Users/davidal/GoogleDrive/stuff/bacode/tripathy/src/boring/boring_model.py", line 95, in create_gp
calculate_gradients=True
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/parameterized.py", line 53, in __call__
self.initialize_parameter()
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/parameter_core.py", line 337, in initialize_parameter
self.trigger_update()
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/updateable.py", line 79, in trigger_update
self._trigger_params_changed(trigger_parent)
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/parameter_core.py", line 134, in _trigger_params_changed
self.notify_observers(None, None if trigger_parent else -np.inf)
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/observable.py", line 91, in notify_observers
[callble(self, which=which) for _, _, callble in self.observers]
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/observable.py", line 91, in <listcomp>
[callble(self, which=which) for _, _, callble in self.observers]
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/core/parameter_core.py", line 508, in _parameters_changed_notification
self.parameters_changed()
File "/Users/davidal/febo/febo/models/gpy.py", line 124, in parameters_changed
self.kern.update_gradients_full(self.grad_dict['dL_dK'], self.X)
File "/Users/davidal/febo/febo/models/gpy.py", line 76, in update_gradients_full
self._dK_dr_via_X = self.dK_dr_via_X(X, X2)
File "<decorator-gen-138>", line 2, in dK_dr_via_X
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 283, in g
return cacher(*args, **kw)
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 179, in __call__
new_output = self.operation(*args, **kw)
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/GPy/kern/src/stationary.py", line 122, in dK_dr_via_X
return self.dK_dr(self._scaled_dist(X, X2))
File "<decorator-gen-140>", line 2, in _scaled_dist
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 283, in g
return cacher(*args, **kw)
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/paramz/caching.py", line 179, in __call__
new_output = self.operation(*args, **kw)
File "/Users/davidal/miniconda3/lib/python3.6/site-packages/GPy/kern/src/stationary.py", line 165, in _scaled_dist
return self._unscaled_dist(X/self.lengthscale, X2)
ValueError: operands could not be broadcast together with shapes (1,3) (2,)
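For reference, the underlying NumPy failure can be reproduced directly, without GPy at all (a minimal sketch: the array shapes are taken from the error message above, mimicking the `X / self.lengthscale` division in `stationary.py`):

```python
import numpy as np

X = np.ones((1, 3))       # one sample with 3 input columns
lengthscale = np.ones(2)  # ARD lengthscales covering only 2 dimensions

try:
    X / lengthscale       # same shapes as in the traceback: (1,3) vs (2,)
except ValueError as e:
    print(e)  # operands could not be broadcast together ...
```

The division only succeeds when the lengthscale vector's length matches the number of columns the kernel actually receives, which is why the mismatch between the kernel's dimensions and the sliced input surfaces here.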
Do I interpret all the parameters correctly? I also get the following warning:

```
/Users/davidal/miniconda3/lib/python3.6/site-packages/GPy/core/gp.py:87: UserWarning:Your kernel has a different input dimension 2 then the given X dimension 3. Be very sure this is what you want and you have not forgotten to set the right input dimenion in your kernel
```

which makes me think that I have not fully understood the input shape. However, it should be correct: the input shape for both kernels combined is 3, and internally the input is split between the two kernels, right?

Any ideas and help are appreciated! 😃
Issue Analytics
- Created 5 years ago
- Comments: 12 (7 by maintainers)
Top GitHub Comments
What are you trying to achieve exactly?
input_dim is the number of dimensions the kernel works on; active_dims specifies which dimensions of X those are. If the kernel works on the first 3 dimensions, input_dim=3 and active_dims=range(3). You should be able to mix and match: your kernels can overlap on dimensions, or work on their own parts of X.
What is the value of active_dimensions?