
Using batch-GP for learning a single common GP over multiple experiments

See original GitHub issue

Howdy folks,

Reading the docs, I understand that batch-GP is meant to learn k independent GPs from k independent sets of labels y over a common data set x:

y1 = f1(x), y2 = f2(x), ..., yk = fk(x), for k independent GPs.

But how would one go about using batch-GP to learn a single common GP from k independent experiments of the same underlying process?

y1 = f(x1), y2 = f(x2), ..., yk = f(xk), for one and the same GP.

For instance, I have k sets of data and labels (y) representing measurements of how the temperature changes over altitude (x) (e.g. from weather balloons launched at k different geographical locations), and I want to induce a GP prior that represents the temperature change over altitude between mean sea level and some maximum altitude, marginalized over all geographical areas.

Thanks in advance

Galto

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 25 (9 by maintainers)

Top GitHub Comments

3 reactions
jilsamia commented, Jun 11, 2021

Hi, I also have the same issue: my input data has a dimension of N*10 and I want to use the same Gaussian process for all the features. I used this tutorial as a reference https://docs.gpytorch.ai/en/v1.1.1/examples/06_PyTorch_NN_Integration_DKL/Deep_Kernel_Learning_DenseNet_CIFAR_Tutorial.html but it learns one GP for every feature. Does anyone know how to do that? Thanks

0 reactions
edebrouwer commented, Jul 16, 2021

Any update regarding training a single GP on different datasets?
