Using batch-GP for learning a single common GP over multiple experiments
Howdy folks,
Reading the docs, I understand that batch-GP is meant to learn k independent GPs from k independent sets of labels y over a common data set x:
y1 = f1(x), y2 = f2(x), ..., yk = fk(x), for k independent GPs.
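For concreteness, a minimal GPyTorch sketch of that batched setup (shapes, toy data, and the RBF kernel choice are illustrative assumptions): passing batch_shape=torch.Size([k]) to the mean, kernel, and likelihood gives each of the k outputs its own hyperparameters.

```python
import torch
import gpytorch

k, n = 4, 50  # hypothetical number of GPs and data points per GP

class IndependentBatchGP(gpytorch.models.ExactGP):
    """k independent GPs: batch_shape gives every batch entry its own hyperparameters."""
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean(batch_shape=torch.Size([k]))
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(batch_shape=torch.Size([k])),
            batch_shape=torch.Size([k]),
        )

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

# A common input set x, with k label vectors y1..yk stacked along the batch dimension.
x = torch.linspace(0, 1, n)
train_x = x.view(1, n, 1).expand(k, n, 1)                          # (k, n, 1): same x for every GP
train_y = torch.stack([torch.sin((i + 1) * x) for i in range(k)])  # (k, n): one label set per GP

likelihood = gpytorch.likelihoods.GaussianLikelihood(batch_shape=torch.Size([k]))
model = IndependentBatchGP(train_x, train_y, likelihood)
```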
But how would one go about using batch-GP to learn a single common GP from k independent experiments of the same underlying process?
y1 = f(x1), y2 = f(x2), ..., yk = f(xk), for one and the same GP.
For instance, I have k sets of data and labels (y) representing measurements of how the temperature changes with altitude (x), e.g. from weather balloons launched at k different geographical locations, and I want to induce a GP prior that represents the temperature change over altitude between mean sea level and some maximum altitude, marginalized over all geographical areas.
Thanks in advance
Galto
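One way to set up this second scenario in GPyTorch (an illustrative sketch under assumptions, not an answer taken from this thread): keep the k experiments stacked along a batch dimension, but define the mean, kernel, and likelihood without a batch_shape, so a single set of hyperparameters is broadcast over, and learned from, all experiments jointly. The data below are toy stand-ins.

```python
import torch
import gpytorch

class SharedHyperparamGP(gpytorch.models.ExactGP):
    """One common GP prior: with no batch_shape, a single set of mean/kernel
    hyperparameters is shared (broadcast) across all k experiments."""
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

# k experiments, each with its own inputs x_i (e.g. altitudes) and labels y_i (toy data).
k, n = 8, 60
train_x = torch.rand(k, n, 1)                                              # (k, n, 1)
train_y = torch.sin(4 * train_x).squeeze(-1) + 0.05 * torch.randn(k, n)    # (k, n)

likelihood = gpytorch.likelihoods.GaussianLikelihood()   # shared noise level
model = SharedHyperparamGP(train_x, train_y, likelihood)

model.train()
likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)

for _ in range(100):
    optimizer.zero_grad()
    output = model(train_x)                 # batch of k MVNs sharing one set of hyperparameters
    loss = -mll(output, train_y).sum()      # sum the per-experiment marginal log likelihoods
    loss.backward()
    optimizer.step()
```

Under this setup each experiment is treated as an independent realization of the same prior, and the learned kernel hyperparameters are exactly the "marginalized over all experiments" quantity the question asks about.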
Issue Analytics
- Created 4 years ago
- Comments:25 (9 by maintainers)
Top GitHub Comments
Hi, I also have the same issue: my input data has a dimension of N*10, and I want to have the same Gaussian process for all the features. I used this tutorial as a reference: https://docs.gpytorch.ai/en/v1.1.1/examples/06_PyTorch_NN_Integration_DKL/Deep_Kernel_Learning_DenseNet_CIFAR_Tutorial.html but it learns one GP per feature. Does anyone know how to do that? Thanks
Any update regarding training a single GP on different datasets?
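No resolution is recorded in this excerpt, but when every dataset is believed to be a noisy measurement of the same underlying function, one common workaround (sketched here as an assumption, not as a reply from the maintainers) is simply to concatenate the datasets and fit a single exact GP to the pooled data.

```python
import torch
import gpytorch

# Toy stand-ins for k datasets (x_i, y_i); pool them into one training set.
xs = [torch.rand(30, 1) for _ in range(5)]
ys = [torch.sin(4 * x).squeeze(-1) + 0.05 * torch.randn(30) for x in xs]
train_x = torch.cat(xs, dim=0)   # (sum_i n_i, 1)
train_y = torch.cat(ys, dim=0)   # (sum_i n_i,)

class PooledGP(gpytorch.models.ExactGP):
    """A single exact GP fit to the concatenated data from all experiments."""
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = PooledGP(train_x, train_y, likelihood)
# Training then follows the standard single-GP regression loop with
# ExactMarginalLogLikelihood, as in the basic GPyTorch regression tutorial.
```

Note the modeling difference: pooling treats all points as observations of one function realization, whereas the batched, shared-hyperparameter setup sketched earlier treats each experiment as an independent draw from a common prior.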