Adjoint Differentiation
While not really viable for NISQ circuits (outside of simulation), the request has come up a few times to implement the adjoint method of gradient calculation for differentiation. This would roughly entail:
- Writing a new C++ op that computes gradients via the adjoint method (the math is sketched in simulation below).
- Writing a new differentiator that wraps this op and defines it only for `differentiate_analytic`.

Going to leave this open for discussion for anyone who's interested.
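For concreteness, here is a minimal NumPy sketch of what the adjoint method computes for a chain of single-qubit Pauli rotations. This is illustrative only, not the proposed C++ op or any existing TFQ API; `rot` and `adjoint_gradients` are made-up names. The point is that one forward pass plus one backward sweep yields every gradient with O(P) gate applications, versus the O(P²) a simulated parameter-shift evaluation would need:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rot(pauli, theta):
    """Single-qubit rotation exp(-i * theta / 2 * pauli)."""
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * pauli

def adjoint_gradients(paulis, thetas, observable):
    """d<psi|O|psi>/d theta_k for all k via one backward sweep."""
    # Forward pass: |psi> = U_P ... U_1 |0>.
    psi = np.array([1, 0], dtype=complex)
    for p, t in zip(paulis, thetas):
        psi = rot(p, t) @ psi
    # lam carries the "bra" side, starting as O|psi>.
    lam = observable @ psi
    grads = np.zeros(len(thetas))
    for k in reversed(range(len(thetas))):
        U = rot(paulis[k], thetas[k])
        psi = U.conj().T @ psi           # undo gate k: psi = U_{k-1}...U_1|0>
        # For U = exp(-i theta P / 2), dU/dtheta = (-i/2) P U.
        mu = -0.5j * (paulis[k] @ (U @ psi))
        grads[k] = 2 * np.real(np.vdot(lam, mu))
        lam = U.conj().T @ lam           # move the bra one gate earlier
    return grads

# Sanity check against central finite differences.
paulis = [X, Z, X]
thetas = np.array([0.3, 1.1, -0.7])

def energy(ts):
    psi = np.array([1, 0], dtype=complex)
    for p, t in zip(paulis, ts):
        psi = rot(p, t) @ psi
    return np.real(np.vdot(psi, Z @ psi))

eps = 1e-6
fd = [(energy(thetas + eps * np.eye(3)[k]) -
       energy(thetas - eps * np.eye(3)[k])) / (2 * eps) for k in range(3)]
print(adjoint_gradients(paulis, thetas, Z))  # agrees with fd below
print(fd)                                    # up to finite-difference error
```

Note why this only works on a simulator: the backward sweep needs the full state vector and applies `U.conj().T` directly, neither of which is available on hardware.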

Oops, that was an initial misunderstanding of mine. Thank you for pointing it out, @refraction-ray. We are now well on our way to finishing the adjoint differentiator 😃
You can also refer to Section 3, “Reversible Computing and Automatic Differentiation”, of the Yao.jl paper. This is the key technique that allows Yao.jl to train, say, a 10,000-layer variational circuit on a laptop.
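For intuition about why reversibility is what makes that depth tractable: a generic reverse-mode AD tape caches every intermediate state vector, while a reversible sweep reconstructs them by applying inverse gates, so only a constant number of states are alive at once. A back-of-envelope estimate (the 20-qubit size and the three live states are my assumptions, not figures from the Yao.jl paper; dense complex128 state vectors assumed):

```python
n_qubits, n_layers = 20, 10_000
state_bytes = (2 ** n_qubits) * 16     # complex128: 16 bytes per amplitude
tape = n_layers * state_bytes          # caching every intermediate state
reversible = 3 * state_bytes           # e.g. psi, lam, mu in the adjoint sweep
print(f"tape-based backprop: {tape / 2**30:,.0f} GiB")      # ~156 GiB
print(f"reversible sweep:    {reversible / 2**20:.0f} MiB")  # 48 MiB
```

At 10,000 layers the tape simply doesn't fit in laptop RAM, while the reversible sweep stays at a few state vectors regardless of depth.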