GP implementation of a precomputed covariance matrix
PyMC3 supports multiplying a GP covariance function by a precomputed covariance matrix, as in the example below, copied from the PyMC3 docs (https://docs.pymc.io/notebooks/GP-MeansAndCovs.html). Is it possible to do something similar directly with exoplanet?
Thanks!
import matplotlib.pyplot as plt
import numpy as np
import pymc3 as pm
import theano

# data vector
X = np.linspace(0, 2, 200)[:, None]

# first evaluate a covariance function into a matrix
period = 0.2
cov_cos = pm.gp.cov.Cosine(1, period)
K_cos = theano.function([], cov_cos(X))()

# now multiply it with a covariance *function*
cov = pm.gp.cov.Matern32(1, 0.5) * K_cos
K = cov(X).eval()

# draw and plot three samples from the GP prior defined by K
plt.figure(figsize=(14, 4))
plt.plot(X, pm.MvNormal.dist(mu=np.zeros(K.shape[0]), cov=K).random(size=3).T)
plt.title("Samples from the GP prior")
plt.ylabel("y")
plt.xlabel("X")
Issue Analytics
- State: closed
- Created 4 years ago
- Comments: 10 (4 by maintainers)
You definitely don’t want to fit the ACF - just fit the GP directly to the out-of-transit points and then fix the hyperparameters. In general, I find that this doesn’t make much of a difference compared to marginalizing over the hyperparameters simultaneously with the transit fit (especially if the correlations are on a transit timescale), but maybe it’ll matter more in your case!
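For concreteness, here is a minimal sketch of that workflow using PyMC3’s marginal GP. The variable names (t, y, the oot_mask for out-of-transit points) and the Matern-3/2 kernel with its priors are hypothetical placeholders, not from the original discussion:

import numpy as np
import pymc3 as pm

# t, y: full light curve; oot_mask selects the out-of-transit points
# (hypothetical names -- substitute your own data)
with pm.Model() as gp_model:
    eta = pm.HalfNormal("eta", sigma=1.0)      # GP amplitude
    ell = pm.HalfNormal("ell", sigma=1.0)      # correlation length
    sigma = pm.HalfNormal("sigma", sigma=0.1)  # white-noise level
    cov = eta ** 2 * pm.gp.cov.Matern32(1, ell)
    gp = pm.gp.Marginal(cov_func=cov)
    gp.marginal_likelihood("obs", X=t[oot_mask][:, None], y=y[oot_mask], noise=sigma)
    map_soln = pm.find_MAP()

# map_soln["eta"], map_soln["ell"], map_soln["sigma"] can then be held fixed
# when fitting the transit model to the full light curve.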
What I meant by that is that PyMC3’s GP implementation uses a standard Cholesky factorization to evaluate the GP likelihood, so it doesn’t care about the details of the kernel! If PyMC3’s GP is fast enough, then you can just use that.
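As a rough illustration of that point (a sketch, not the exoplanet API): since the marginal GP likelihood is just a multivariate normal evaluated via a Cholesky factorization, a precomputed covariance matrix can be dropped straight into a PyMC3 model. K and y below are hypothetical names for the precomputed matrix and the observed data:

import numpy as np
import pymc3 as pm

# K: precomputed (N, N) covariance matrix; y: observed data vector
N = K.shape[0]
with pm.Model() as model:
    sigma = pm.HalfNormal("sigma", sigma=0.1)  # white-noise term
    # The GP marginal likelihood is a multivariate normal with covariance
    # K + sigma^2 I, which PyMC3 evaluates with a standard O(N^3) Cholesky.
    pm.MvNormal("obs", mu=np.zeros(N), cov=K + sigma ** 2 * np.eye(N), observed=y)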
I think you probably mean “banded” rather than “block diagonal” here. In that case, there are fast things that you can do, but I’m not sure that those are implemented in Theano.
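If the matrix really is banded, one option outside of Theano (so this is a NumPy/SciPy sketch you could not differentiate through, under the assumption that the covariance has been packed into banded storage) is SciPy’s banded Cholesky routines, which reduce the likelihood cost from O(N^3) to roughly O(N b^2) for bandwidth b:

import numpy as np
from scipy.linalg import cholesky_banded, cho_solve_banded

def banded_gp_loglike(ab, y):
    # ab: (b + 1, N) array holding K + sigma^2 I in lower banded storage,
    #     i.e. ab[i - j, j] = K[i, j] for 0 <= i - j <= b
    # y:  (N,) data vector
    cb = cholesky_banded(ab, lower=True)     # banded Cholesky factor L
    alpha = cho_solve_banded((cb, True), y)  # solves (K + sigma^2 I) alpha = y
    logdet = 2.0 * np.sum(np.log(cb[0]))     # cb[0] holds the diagonal of L
    return -0.5 * (y @ alpha + logdet + len(y) * np.log(2.0 * np.pi))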
Since this is specific to your use case, I’m going to close this; perhaps you can email me directly if you have more questions about this one? I’ll set up a mailing list one of these days 😃
Indeed; I did that before Sivaram developed the faster method, and I didn’t think I could optimize the kernel parameters since I was applying it to an entire Kepler light curve, so the likelihood computation was quite slow.
Eric Agol, Astronomy Professor, University of Washington