
An approximate solution to compute mutual information

See original GitHub issue

@adalca @balakg Hi, I found an approximate solution to compute mutual information in TensorFlow, but I cannot test its correctness as I don't have enough data to train the model. Can anyone help? The code is here:

import numpy as np
import tensorflow as tf  # the code below uses the TensorFlow 1.x API


def nmi_gaussian(R, T, win=20, eps=1e-5):
    '''
    Parzen-window approximation of mutual information.

    Params:
        R:   reference (fixed) image, shape should be N * H * W * Z * 1
        T:   test (moving) image, shape should be the same as R
        win: number of bins used in histogram counting
        eps: small offset so tf.range does not emit an extra bin center
    '''
    N, H, W, Z, C = R.shape
    assert C == 1, 'image should have only one channel'
    im_size = N.value * H.value * W.value * Z.value

    # intensity range of each image
    R_min = tf.reduce_min(R)
    R_max = tf.reduce_max(R)
    T_min = tf.reduce_min(T)
    T_max = tf.reduce_max(T)

    R_bin_size = (R_max - R_min) / win
    T_bin_size = (T_max - T_min) / win

    # bin centers: win evenly spaced points, starting half a bin above the minimum
    R_bin_window = tf.range(R_min + 0.5 * R_bin_size,
                            R_min + 0.5 * R_bin_size + R_bin_size * win - eps,
                            delta=R_bin_size)
    T_bin_window = tf.range(T_min + 0.5 * T_bin_size,
                            T_min + 0.5 * T_bin_size + T_bin_size * win - eps,
                            delta=T_bin_size)

    # grid of all (R bin, T bin) center pairs, flattened to shape (win * win, 2)
    R_mesh = tf.tile(tf.reshape(R_bin_window, (-1, 1)), multiples=[1, win])
    T_mesh = tf.tile(tf.reshape(T_bin_window, (1, -1)), multiples=[win, 1])
    R_T_mesh = tf.concat([tf.reshape(R_mesh, (-1, 1)), tf.reshape(T_mesh, (-1, 1))], axis=-1)
    R_T_mesh = R_T_mesh[tf.newaxis, tf.newaxis, tf.newaxis, :, :]

    # Gaussian Parzen window: every voxel pair contributes to every joint bin,
    # weighted by its distance to the bin centers; broadcasts to N*H*W*Z*win^2.
    # The 1/sqrt(2*pi) constant cancels in the renormalization below.
    p_l_k = 1. / np.sqrt(2. * np.pi) * tf.exp(
        -0.5 * (tf.square((R - R_T_mesh[..., 0]) / R_bin_size) +
                tf.square((T - R_T_mesh[..., 1]) / T_bin_size)))

    # accumulate over all voxels and renormalize into a joint distribution;
    # rows of p_l_k index R bins, columns index T bins
    p_l_k = tf.reduce_sum(p_l_k, axis=(0, 1, 2, 3)) / im_size
    p_l_k = p_l_k / tf.reduce_sum(p_l_k)
    p_l_k = tf.reshape(p_l_k, (win, win))

    # marginal distributions, aligned with the axes of p_l_k
    p_l = tf.reduce_sum(p_l_k, axis=1)  # marginal over R bins
    p_k = tf.reduce_sum(p_l_k, axis=0)  # marginal over T bins
    pl_pk = p_l[:, tf.newaxis] * p_k[tf.newaxis, :]

    # MI = sum_{l,k} p(l,k) * log(p(l,k) / (p(l) * p(k))); empty bins produce
    # non-finite terms and are zeroed out. The sign is flipped so the result
    # can be minimized as a loss.
    mi = p_l_k * tf.log(p_l_k / pl_pk)
    mi = tf.where(tf.is_finite(mi), mi, tf.zeros_like(mi))
    mi = -tf.reduce_sum(mi)
    return mi

The idea is to use a Parzen-window estimate of mutual information, as used in ITK's MattesMutualInformation metric, but with the B-spline window function replaced by a Gaussian window.

_Originally posted by @argman in https://github.com/voxelmorph/voxelmorph/issues/25#issuecomment-477512229_
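
Since the original poster could not verify the code, one quick sanity check (a hypothetical sketch, not part of the issue) is to compare the loss for a volume against itself with the loss for two independent noise volumes. Because nmi_gaussian returns the negative of the mutual information, the self-comparison should come out strongly negative while the independent-noise comparison should sit near zero:

import numpy as np
import tensorflow as tf  # TensorFlow 1.x, matching the snippet above

# nmi_gaussian is assumed to be defined as in the post above
vol = np.random.rand(1, 32, 32, 32, 1).astype(np.float32)
noise = np.random.rand(1, 32, 32, 32, 1).astype(np.float32)

R = tf.placeholder(tf.float32, shape=(1, 32, 32, 32, 1))
T = tf.placeholder(tf.float32, shape=(1, 32, 32, 32, 1))
loss = nmi_gaussian(R, T)

with tf.Session() as sess:
    print('self :', sess.run(loss, {R: vol, T: vol}))    # strongly negative (high MI)
    print('noise:', sess.run(loss, {R: vol, T: noise}))  # near zero (low MI)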

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 24

Top GitHub Comments

2 reactions
adalca commented, Feb 9, 2020

@soanduong and everyone,

I’ve added an experimental NMI loss to losses.py, please take a look.
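
Despite its name, the nmi_gaussian function in this thread returns unnormalized (negative) mutual information. If a normalized variant is wanted, one common choice, sketched here as an assumption rather than as the loss that was added to losses.py, is Studholme's NMI, (H(R) + H(T)) / H(R, T), computed from the same joint histogram:

import tensorflow as tf  # TensorFlow 1.x, matching the snippet above

def studholme_nmi(p_l_k, eps=1e-10):
    # p_l_k: (win, win) joint histogram, as built inside nmi_gaussian above
    p_l = tf.reduce_sum(p_l_k, axis=1)  # marginal over R bins
    p_k = tf.reduce_sum(p_l_k, axis=0)  # marginal over T bins
    H_R = -tf.reduce_sum(p_l * tf.log(p_l + eps))        # entropy of R
    H_T = -tf.reduce_sum(p_k * tf.log(p_k + eps))        # entropy of T
    H_RT = -tf.reduce_sum(p_l_k * tf.log(p_l_k + eps))   # joint entropy
    return (H_R + H_T) / H_RT  # 1 when independent, 2 when identical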

0 reactions
adalca commented, Mar 5, 2021

@lucasestini95 the mutual information is available directly in the GitHub code.

We have one implemented in voxelmorph: https://github.com/voxelmorph/voxelmorph/blob/dev/voxelmorph/tf/losses.py

We also have a newer, cleaner one implemented in neurite, which will eventually take over: https://github.com/adalca/neurite/blob/dev/neurite/tf/losses.py
