
Time-efficient higher-order forward mode?

See original GitHub issue

Have you guys given any thought to how to efficiently compute higher-order derivatives of scalar-input functions? I have a use for 4th- and 5th-order derivatives of a scalar-input, vector-output function, namely, regularizing ODEs to be easy to solve.

I’m not sure, but I think that in principle, the Nth-order derivative of a scalar-input function can be computed for only about N times the cost of evaluating the original function, if intermediate values are cached. We think we have a partial solution using https://github.com/JuliaDiff/TaylorSeries.jl, but I’d rather do it in JAX.
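For comparison, here is a minimal pure-Python sketch (not JAX, and not the author's Julia code) of the truncated-Taylor-coefficient arithmetic that TaylorSeries.jl-style packages use: every intermediate value is a list of Taylor coefficients, so nothing is recomputed across orders, and the total cost is polynomial in the order N (each truncated product is O(N²)) rather than exponential. The function names here are illustrative, not from any library:

```python
import math

def taylor_mul(a, b):
    # Truncated product of two coefficient lists of equal length:
    # c_k = sum_j a_j * b_{k-j}, dropping terms above the truncation order.
    K = len(a)
    return [sum(a[j] * b[k - j] for j in range(k + 1)) for k in range(K)]

def taylor_sin(a):
    # Taylor coefficients of sin(a(t)), computed jointly with cos(a(t))
    # via the standard recurrence from s' = a' c and c' = -a' s.
    K = len(a)
    s, c = [0.0] * K, [0.0] * K
    s[0], c[0] = math.sin(a[0]), math.cos(a[0])
    for k in range(1, K):
        s[k] = sum(j * a[j] * c[k - j] for j in range(1, k + 1)) / k
        c[k] = -sum(j * a[j] * s[k - j] for j in range(1, k + 1)) / k
    return s

def taylor_var(x, K):
    # Coefficients of the identity path t -> x + t, truncated at order K.
    return [x, 1.0] + [0.0] * (K - 1)

def f_taylor(p):
    # f(t) = 0.3 * sin(t) * t**10, evaluated in truncated Taylor arithmetic.
    tpow = [1.0] + [0.0] * (len(p) - 1)  # t**0
    for _ in range(10):
        tpow = taylor_mul(tpow, p)
    return [0.3 * c for c in taylor_mul(taylor_sin(p), tpow)]

K = 10
coeffs = f_taylor(taylor_var(1.1, K))
d10 = math.factorial(10) * coeffs[10]  # f^(10)(1.1), since coeffs[k] = f^(k)(x)/k!
```

The key point is that one pass through `f_taylor` yields all derivatives up to order K at once, because each elementary operation propagates the whole coefficient list.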

The following toy example takes time exponential in the order of the derivative:

from jax import jvp
import jax.numpy as np

def fwd_deriv(f):
    # Differentiate a scalar-input function by pushing the tangent 1.0
    # through forward mode; jvp returns (primal_out, tangent_out).
    def df(x):
        return jvp(f, (x,), (1.0,))[1]
    return df

def f(t):
    return 0.3 * np.sin(t) * t**10

g = f
for i in range(10):
    g = fwd_deriv(g)
    print(g(1.1))  # (i+1)-th derivative of f at 1.1; each level roughly doubles the work

Is there a simple way to set things up with jvps and vjps to be more time-efficient, or do you think it would require a different type of autodiff entirely?
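The blow-up in the toy example can be seen directly in the traced program: each application of `fwd_deriv` roughly doubles the jaxpr, because the sin/cos subexpressions shared between a function and its derivative are recomputed rather than reused. A quick way to watch the growth, using `jax.make_jaxpr` to count primitive equations (the sketch below uses plain `sin` to keep the traces small):

```python
from jax import jvp, make_jaxpr
import jax.numpy as jnp

def fwd_deriv(f):
    def df(x):
        return jvp(f, (x,), (1.0,))[1]
    return df

g = jnp.sin
sizes = []
for _ in range(4):
    g = fwd_deriv(g)
    # Number of primitive equations in the traced program at this order.
    sizes.append(len(make_jaxpr(g)(1.1).jaxpr.eqns))
print(sizes)  # grows roughly geometrically with derivative order
```

A CSE pass (as discussed in the comments below on this page) would collapse much of this duplication, since the repeated `sin`/`cos` evaluations are syntactically identical subterms.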

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 2
  • Comments: 18 (18 by maintainers)

Top GitHub Comments

2 reactions
jessebett commented, Mar 18, 2020

This was definitely a bit of a rabbit hole we stumbled into!

1 reaction
mattjj commented, Mar 21, 2019

We discussed this a bit in our chat. We think the answer is to add a CSE pass that runs after every level of differentiation. But we might also need some other simplifications, like collecting terms (x + x + x = 3x).

CSE is easy enough to add. We’ll try it out and report back!
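For readers finding this thread later: JAX did eventually ship an experimental Taylor-mode transform, `jax.experimental.jet`, which computes all derivatives up to a given order in a single pass instead of nesting `jvp`. A hedged sketch, assuming the `jet` API as documented (the input series specifies a path `t(eps) = 1.1 + eps`, and the k-th output term is the k-th derivative with respect to `eps` at 0):

```python
from jax.experimental.jet import jet
import jax.numpy as jnp

def f(t):
    return 0.3 * jnp.sin(t) * t**10

order = 5
# Series coefficients of the input path t(eps) = 1.1 + eps.
series_in = ((1.0,) + (0.0,) * (order - 1),)
primal_out, terms = jet(f, (1.1,), series_in)
# terms[k-1] is the k-th derivative of f at 1.1, all computed in one pass.
```

Whether `jet` existed at the time of this issue it did not; it grew out of exactly this discussion, so it is the natural pointer for anyone hitting the same exponential-cost wall today.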

Read more comments on GitHub >

Top Results From Across the Web

  • Forward Mode Automatic Differentiation & Dual Numbers
  • Provably Correct, Asymptotically Efficient, Higher-Order ...
  • Continuous-Time Meta-Learning with Forward Mode ... - arXiv
  • Lazy Multivariate Higher-Order Forward-Mode AD
