predict_y and predict_log_density support for full_cov or full_output_cov
With the latest version of GPflow (2.0.2), calling predict_y on an SVGP model with full_cov=True results in a covariance matrix where the likelihood noise is added to every element of the covariance matrix instead of only to the diagonal, as expected.
From looking at the code, this happens because SVGP does not implement predict_y but inherits it from the base class. However, the base class implementation does not appear to handle full_cov=True correctly.
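For illustration, here is a minimal NumPy sketch (not GPflow code; the covariance values and noise level are made up) contrasting the expected behaviour with the broadcasting behaviour described above:
import numpy as np
f_cov = np.array([[1.0, 0.3],
                  [0.3, 1.0]])  # full latent covariance, as returned by predict_f(..., full_cov=True)
noise_variance = 0.1  # Gaussian likelihood variance
y_cov_expected = f_cov + noise_variance * np.eye(2)  # noise added to the diagonal only
y_cov_observed = f_cov + noise_variance  # broadcasting adds noise to every element
print(y_cov_expected)
print(y_cov_observed)  # off-diagonals no longer match f_cov, which is the reported bug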
To reproduce
import gpflow
import numpy as np
rng = np.random.RandomState(123)
N = 100 # Number of training observations
X = rng.rand(N, 1) * 2 - 1 # X values
M = 50 # Number of inducing locations
kernel = gpflow.kernels.SquaredExponential()
Z = X[:M, :].copy() # Initialize inducing locations to the first M inputs in the dataset
m = gpflow.models.SVGP(kernel, gpflow.likelihoods.Gaussian(), Z, num_data=N)
pX = np.linspace(-1, 1, 100)[:, None] # Test locations
_, pYv = m.predict_y(pX, full_cov=True)  # Predict Y values at test locations
_, pFv = m.predict_f(pX, full_cov=True)
off_diag = 1.0 - np.eye(pYv.shape[-1])  # mask that zeros the diagonal
pYv_offdiag = pYv * off_diag  # off-diagonal elements of the predict_y covariance
pFv_offdiag = pFv * off_diag  # off-diagonal elements of the predict_f covariance
print(np.linalg.norm(pYv_offdiag - pFv_offdiag))  # should be 0
Expected behavior
The expected behavior is for the off-diagonal elements of the predict_y covariance to match those of predict_f; however, this is not the case. The last line of the above snippet should output 0.
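Concretely, for a Gaussian likelihood the full predict_y covariance should equal the predict_f covariance plus the noise variance on the diagonal only. Continuing the snippet above, and assuming the likelihood variance can be read off via .numpy(), a check along these lines should pass once the bug is fixed:
sigma2 = m.likelihood.variance.numpy()  # Gaussian likelihood noise variance (assumed accessor)
expected = pFv + sigma2 * np.eye(pFv.shape[-1])  # noise added to the diagonal only
print(np.allclose(pYv, expected))  # should print True once the bug is fixed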
System information
- GPflow version: 2.0.2
- GPflow installed from: pip
- TensorFlow version: 2.1
- Python version: 3.6
- Operating system: Ubuntu 16.04
Hi @mohitrajpal1
Thanks for your excellent bug report. You are completely right: the likelihood variance is incorrectly added to all the elements of the covariance matrix, whereas it should only be added to the diagonal. This is caused, for example, by unintended broadcasting in _predict_mean_and_var in the Gaussian likelihood.
As GPflow is an open-source project, we would very much appreciate it if you could help us create a fix for this bug by submitting a PR. My initial plan would be to pass a keyword argument full_cov: bool = False to predict_mean_and_var, which would allow the likelihood variance to be added correctly depending on the kwarg. If you start a PR I'm happy to review it and make sure it gets merged. Would you be up for that?
Feel free to join our Slack workspace (see the README for details on how to join) if you want to discuss this in more detail.
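For concreteness, here is a rough sketch of the kind of change being proposed, as a standalone illustration under the assumption stated above (a full_cov keyword on the mean/variance prediction), not the actual GPflow implementation; the real method signature and broadcasting details may differ:
import tensorflow as tf
def predict_mean_and_var_sketch(Fmu, Fvar, noise_variance, full_cov: bool = False):
    # Hypothetical Gaussian-likelihood helper: Fvar holds marginal variances
    # [..., N] when full_cov=False, or a full covariance [..., N, N] when full_cov=True.
    if full_cov:
        # Add the likelihood noise to the diagonal of the covariance only.
        eye = tf.eye(tf.shape(Fvar)[-1], dtype=Fvar.dtype)
        return Fmu, Fvar + noise_variance * eye
    # Marginal variances: plain elementwise addition is already correct here.
    return Fmu, Fvar + noise_variance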
Great! Looking forward to it.