has_aux for forward-mode differentiation functions
See original GitHub issue. Related to PR https://github.com/google/jax/pull/484
We should have a corresponding has_aux option for passing side information through the forward-mode functions such as jvp, jacfwd, etc. My use case is a scenario in which forward mode is much more efficient than reverse mode.
Issue Analytics
- State:
- Created: 5 years ago
- Comments: 7 (5 by maintainers)
It would indeed be very useful if jacfwd and jacrev had the has_aux option. Can we bump this up?
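For reference, newer JAX releases did eventually add a has_aux argument to jacfwd, jacrev, and jvp. Assuming a JAX version recent enough to include it, usage would look roughly like this (the function and aux payload below are illustrative):

```python
import jax
import jax.numpy as jnp

def f(x):
    y = x ** 2
    aux = {"input_copy": x}  # side information, not differentiated
    return y, aux

# Forward-mode Jacobian that also returns the auxiliary output.
jac, aux = jax.jacfwd(f, has_aux=True)(3.0)  # jac = 2*x

# jvp with has_aux: primal and tangent outputs plus the aux data.
primal, tangent, aux2 = jax.jvp(f, (3.0,), (1.0,), has_aux=True)
```

In both cases the auxiliary output is carried through the trace untouched, so no second evaluation of `f` is needed.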