[Question] Make predictions about derivatives (mean *and* variance)

The example gpytorch/examples/08_Advanced_Usage/Simple_GP_Regression_Derivative_Information_1d.ipynb shows how to make predictions about derivatives, and also how to do inference based on derivative information.

What is the best way to make predictions about derivatives when derivative information isn’t available? In other words, I would like to be able to draw samples not only from the trained GP but also from its first derivatives.

I tried passing the first gradient block (K[..., :n1, n2:]) returned by gpytorch.kernels.RBFKernelGrad to gpytorch.distributions.MultivariateNormal, but without success.

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
KeAWang commented, Apr 14, 2020

Is #772 what you’re looking for?
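For context, the snippet in the next comment (adapted from that issue) gets the derivative of the predictive mean by differentiating through the test inputs with autograd. A minimal sketch of just that step, assuming a trained model and likelihood as in the 1-D regression example:

model.eval()
likelihood.eval()

# Test inputs must require gradients so autograd can differentiate through them
test_x = torch.linspace(0, 1, 51, requires_grad=True)
observed_pred = likelihood(model(test_x))

# Derivative of the predictive mean at each test point
dydtest_x = torch.autograd.grad(observed_pred.mean.sum(), test_x)[0]

The full snippet below extends this by also differentiating posterior samples to obtain an empirical variance.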

0 reactions
KeAWang commented, Apr 19, 2020

The easiest way to get the variance is to estimate it empirically by sampling the derivatives.

Modifying the same code as https://github.com/cornellius-gp/gpytorch/issues/772#issuecomment-508124687:

# Assumes the setup from the linked example: `model`, `likelihood`, `train_x`,
# and `train_y` are already defined and trained; imports shown for completeness
import math
import torch
import gpytorch
import matplotlib.pyplot as plt

# Get into evaluation (predictive posterior) mode
model.eval()
likelihood.eval()

# Test points are regularly spaced along [0,1]
# Make predictions by feeding model through likelihood
with gpytorch.settings.fast_pred_var():
    # requires_grad=True so the predictions can be differentiated w.r.t. test_x
    test_x = torch.linspace(0, 1, 51, requires_grad=True)
    observed_pred = likelihood(model(test_x))
    dydtest_x = torch.autograd.grad(observed_pred.mean.sum(), test_x, retain_graph=True)[0]

    n_samples = 100
    sampled_pred = observed_pred.rsample(torch.Size([n_samples]))
    sampled_dydtest_x = torch.stack([
        torch.autograd.grad(pred.sum(), test_x, retain_graph=True)[0]
        for pred in sampled_pred
    ])
    dydtest_x_sample_std = sampled_dydtest_x.std(0)
    lower = dydtest_x - 2 * dydtest_x_sample_std
    upper = dydtest_x + 2 * dydtest_x_sample_std
    

with torch.no_grad():
    # Initialize plot
    f, ax = plt.subplots(1, 1, figsize=(4, 3))

    # Plot training data as black stars
    ax.plot(train_x.numpy(), train_y.numpy(), 'k*')
    # Plot predictive means as blue line
    ax.plot(test_x.detach().numpy(), observed_pred.mean.detach().numpy(), 'b')
    # Plot expected derivative
    ax.plot(test_x.detach().numpy(), dydtest_x.numpy(), 'r')
    # Plot 2 times empirical standard deviations about expected derivative
    ax.fill_between(test_x.detach().numpy(), lower.detach().numpy(), upper.detach().numpy(), alpha=0.1, color="r")
    # Plot real derivative
    ax.plot(test_x.detach().numpy(), 2 * math.pi * torch.cos(2 * math.pi * test_x).detach().numpy(), 'g')
    ax.set_ylim([-7, 7])
    ax.legend(['Observed Data', 'Mean',  'Estimated Derivative', 'True Derivative'])

[Resulting plot: training data (black stars), predictive mean (blue), estimated derivative (red) with a 2-standard-deviation empirical band, and the true derivative (green).]
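The shaded band is a Monte Carlo estimate from 100 posterior samples, so it will vary a little from run to run; increasing n_samples tightens it. If you would rather not assume the sampled derivatives are symmetric about the mean, one possible variation (not from the original comment) is to use empirical quantiles instead of ±2 standard deviations:

# Hypothetical variation: 95% empirical interval over the sampled derivatives
lower_q = sampled_dydtest_x.quantile(0.025, dim=0)
upper_q = sampled_dydtest_x.quantile(0.975, dim=0)

With only 100 samples the tail quantiles are coarse, so a few thousand samples give a smoother band.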
