
jacobians and hessians for an application involving complex numbers

See original GitHub issue

The imaginary components of the derivatives of the following complex-valued function are always zero in JAX. In contrast, TensorFlow returns a non-zero imaginary component. I think this is a bug on the JAX side of things.

from jax.config import config; config.update("jax_enable_x64", True)
import jax
import jax.numpy as np
import numpy as onp

import tensorflow as tf

# Complex input: real parts 0..4, imaginary parts 0..2.
zs = 0.5j * np.arange(5) + np.arange(5)

print("input", zs)

def fn(z):
    # Non-holomorphic: the norm depends on |z_i|^2 = z_i * conj(z_i).
    return np.cos(np.linalg.norm(z * 2))

grad = jax.jacfwd(fn)
print("jax", fn(zs), grad(zs))

def tf_fn(z):
    return tf.cos(tf.norm(z * 2))

tf_zs = tf.convert_to_tensor(0.5j * onp.arange(5) + onp.arange(5))
tf_res = tf_fn(tf_zs)

sess = tf.Session()

grad_ys = tf.ones_like(tf_res)
grad_op = tf.gradients(tf_res, tf_zs, grad_ys=grad_ys)
print("tf", sess.run([tf_res, grad_op, grad_ys]))
Output:

input [0.+0.j  1.+0.5j 2.+1.j  3.+1.5j 4.+2.j ]
jax 0.9495740004388323 [0.        +0.j 0.10240272+0.j 0.20480544+0.j 0.30720815+0.j
 0.40961087+0.j]
tf [(0.9495740004388323+0j), [array([0.        +0.j        , 0.10240272+0.05120136j,
       0.20480544+0.10240272j, 0.30720815+0.15360408j,
       0.40961087+0.20480544j])], (1+0j)]
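
As a sanity check, the function can be differenced along the real and imaginary axes separately. The sketch below is illustrative only (the fd_jacobian helper is hypothetical, not from the original report); it reuses fn and zs from the snippet above:

import numpy as onp

delta = 1e-7

def fd_jacobian(f, z, step):
    # One-sided finite difference of the scalar function f along the
    # direction `step`, applied to each coordinate of z in turn.
    out = onp.zeros(len(z))
    for k in range(len(z)):
        dz = onp.zeros(len(z), dtype=onp.complex128)
        dz[k] = step
        out[k] = (f(z + dz) - f(z)) / abs(step)
    return out

d_real = fd_jacobian(fn, onp.asarray(zs), delta)       # d fn / d Re(z_k)
d_imag = fd_jacobian(fn, onp.asarray(zs), 1j * delta)  # d fn / d Im(z_k)
print("fd real-axis", d_real)
print("fd imag-axis", d_imag)

Comparing with the printed results above, JAX's entries correspond to the real-axis differences, while TensorFlow's imaginary components correspond to the imaginary-axis differences, so the disagreement appears to be about which convention each library reports for a non-holomorphic function.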

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 29 (29 by maintainers)

Top GitHub Comments

1 reaction
proteneer commented, Apr 18, 2019

Again, thanks for all the help on this issue!

1 reaction
proteneer commented, Apr 15, 2019

Confirmed bug in TensorFlow:

import numpy as onp
import autograd as ag
import autograd.numpy as anp
import tensorflow as tf

inp = anp.array(2.0)

print("input", inp)

def ag_fn(x):
    real = anp.cos(x+2)
    imag = anp.sin(x-1)
    return anp.abs(real+1j*imag)

ag_hess = ag.hessian(ag_fn)

print("ag val:", ag_fn(inp))
print("ag hess:", ag_hess(inp))

def tf_fn(x):
    real = tf.cos(x+2)
    imag = tf.sin(x-1)
    return tf.abs(tf.complex(real, imag))

# tf_inp = tf.convert_to_tensor(inp)
tf_inp = tf.placeholder(shape=tuple(), dtype=onp.float64)

out_op = tf_fn(tf_inp)

tf_grad = tf.gradients(out_op, tf_inp)[0]
tf_hess = tf.hessians(out_op, tf_inp)[0]

sess = tf.Session()
delta = 1e-7

# Evaluate the gradient at inp and inp + delta; differencing the two gives a
# numerical estimate of the second derivative to compare with tf.hessians.
_, d0, tf_ad = sess.run([out_op, tf_grad, tf_hess], feed_dict={tf_inp: inp})
_, d1, _ = sess.run([out_op, tf_grad, tf_hess], feed_dict={tf_inp: inp+delta})

print("tf_numerical derivative:", (d1-d0)/delta)
print("tf_autodiff derivative:", tf_ad)
Output:

input 2.0
ag val: 1.0655155566059393
ag hess: -0.25533014019223726
2019-04-14 22:55:43.481283: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
tf_numerical derivative: -0.25533013481293665
tf_autodiff derivative: -1.0655155566059389
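
The numerical estimate matches autograd's Hessian, while tf.hessians returns a different value (numerically equal to minus the function value), which is what pins the bug on TensorFlow. For a third data point, the same scalar can be cross-checked with jax.hessian; this sketch is illustrative and was not part of the original comment:

from jax.config import config; config.update("jax_enable_x64", True)
import jax
import jax.numpy as jnp

def jax_fn(x):
    # Real input and real output, with complex intermediates, as above.
    return jnp.abs(jnp.cos(x + 2) + 1j * jnp.sin(x - 1))

x0 = 2.0
print("jax val: ", jax_fn(x0))
# jax.hessian composes forward- over reverse-mode differentiation; its value
# can be compared directly against the autograd Hessian printed above.
print("jax hess:", jax.hessian(jax_fn)(x0))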