
GlobalMutualInformationLoss with different bin distribution

See original GitHub issue

Is your feature request related to a problem? Please describe. The current implementation of GlobalMutualInformationLoss can only apply Parzen windowing with bins distributed evenly between 0 and 1, rather than customising the bins according to the input distribution. This leads to two problems:

  1. The inputs (target and pred) must range between 0 and 1
  2. It is hard to benchmark this loss against the implementations in other packages (e.g. antspyx)
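Both problems above stem from the fixed bin placement. As a minimal NumPy sketch (not MONAI's actual API; the function name and parameters here are hypothetical), a Parzen-window MI estimate whose bin centres are derived from the input range rather than fixed on [0, 1] could look like this:

```python
import numpy as np

def parzen_mutual_information(pred, target, bins=None, sigma_ratio=0.5):
    """Hypothetical sketch: Parzen-window MI with configurable bin centres.

    Instead of bins fixed evenly on [0, 1], the centres default to the
    joint intensity range of the inputs, so unnormalised images work too.
    """
    pred = np.asarray(pred, dtype=float).ravel()
    target = np.asarray(target, dtype=float).ravel()
    if bins is None:
        # spread 32 bin centres over the combined range of both inputs
        lo = min(pred.min(), target.min())
        hi = max(pred.max(), target.max())
        bins = np.linspace(lo, hi, 32)
    sigma = sigma_ratio * (bins[1] - bins[0])

    def weights(x):
        # Gaussian Parzen weights, one row per sample, normalised per sample
        w = np.exp(-0.5 * ((x[:, None] - bins[None, :]) / sigma) ** 2)
        return w / w.sum(axis=1, keepdims=True)

    wa, wb = weights(pred), weights(target)
    pab = wa.T @ wb / len(pred)                  # smoothed joint histogram
    pa, pb = pab.sum(axis=1), pab.sum(axis=0)    # marginals
    nz = pab > 0
    ratio = pab[nz] / (pa[:, None] * pb[None, :])[nz]
    return float((pab[nz] * np.log(ratio)).sum())
```

Because `pab` is a proper joint distribution, the estimate is a KL divergence and therefore non-negative; identical inputs should score much higher than independent ones.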

Describe the solution you’d like Reimplement GlobalMutualInformationLoss.

Describe alternatives you’ve considered N/A

Additional context N/A

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Reactions: 1
  • Comments: 16 (4 by maintainers)

Top GitHub Comments

2 reactions
kate-sann5100 commented, Sep 9, 2021

(image attachment; content not recoverable from this mirror)

2 reactions
kate-sann5100 commented, Aug 25, 2021

The following steps are planned:

  1. Implement the B-spline kernel as an option, alongside the Gaussian probability kernel (our original implementation).
  2. Benchmark our B-spline-kernel implementation against ants.image_mutual_information. This could be achieved either by obtaining the same result for the same input, or by showing that the loss changes in the same direction as the input changes.
  3. Benchmark our Gaussian-probability implementation against the B-spline kernel by adjusting the sigma parameter.
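Step 1 above contrasts two Parzen windows. As a rough illustration (not MONAI's or ANTs' actual code), the cubic B-spline window and the Gaussian window could be sketched like this; the cubic B-spline has compact support (|u| < 2) and integrates to 1, whereas the Gaussian's effective width is controlled by sigma:

```python
import numpy as np

def cubic_bspline_kernel(u):
    """Cubic B-spline Parzen window: nonzero only for |u| < 2, integrates to 1."""
    u = np.abs(np.asarray(u, dtype=float))
    out = np.zeros_like(u)
    inner = u < 1
    outer = (u >= 1) & (u < 2)
    out[inner] = 2.0 / 3.0 - u[inner] ** 2 + 0.5 * u[inner] ** 3
    out[outer] = ((2.0 - u[outer]) ** 3) / 6.0
    return out

def gaussian_kernel(u, sigma=1.0):
    """Normalised Gaussian Parzen window; width set by sigma."""
    u = np.asarray(u, dtype=float)
    return np.exp(-0.5 * (u / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
```

Matching the Gaussian's sigma so its spread approximates the B-spline's (as step 3 proposes) is what makes the two variants comparable in a benchmark.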
