
Discussion of weighting of generalized Dice loss

See original GitHub issue

Hi,

I would like to discuss this snippet in dice.py.

https://github.com/Project-MONAI/MONAI/blob/ad06dff7f85711048690b2e85c99d51001612708/monai/losses/dice.py#L189-L193

I can understand that you don’t want any weight to be infinite. However, is setting them to the maximum non-infinite value a good idea?

My intuition is that the Dice loss is designed to emphasize the scarce foreground. The weights become infinite when no foreground is present, which means the associated gradients are going to teach the network how to predict background, and you assign a large weight to that? I think it is strange.
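For concreteness, here is a minimal NumPy sketch (my own illustration, not MONAI's actual torch implementation) of the behavior in question: the per-class weight w_l = 1 / (sum of class l ground truth)^2 becomes infinite for an absent class, and the linked snippet replaces those infinite entries with the largest finite weight:

```python
import numpy as np

def generalized_dice_weights(class_voxel_counts):
    """Per-class weights w_l = 1 / (sum of class l ground truth)^2,
    with infinite entries (absent classes) clipped to the largest
    finite weight, mirroring the linked dice.py snippet."""
    counts = np.asarray(class_voxel_counts, dtype=float)
    with np.errstate(divide="ignore"):
        w = 1.0 / counts ** 2              # inf where a class is absent
    finite_max = w[np.isfinite(w)].max()   # largest non-infinite weight
    return np.where(np.isinf(w), finite_max, w)
```

So for a patch with class sums [100, 10, 0], the absent third class inherits the weight of the scarcest present class (0.01), which is exactly the large-weight-on-an-absent-class effect being questioned here.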

Any idea? Thanks a lot.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

1 reaction
cckao commented, May 12, 2020

They are useful features. I will definitely try them. Thanks for the suggestion.

0 reactions
wyli commented, May 12, 2020

Maybe it’s also helpful to set include_background=False, or to consider other components from MONAI such as CropForeground (https://github.com/Project-MONAI/MONAI/blob/master/monai/transforms/croppad/array.py#L155) and balanced sampling (https://github.com/Project-MONAI/MONAI/blob/master/monai/transforms/croppad/dictionary.py#L186).
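The balanced-sampling suggestion can be sketched without MONAI: draw crop centers from foreground voxels with some probability, so scarce classes appear in most training patches and the all-background (infinite-weight) case becomes rare. The helper below is a hypothetical illustration of that idea, not MONAI's API:

```python
import numpy as np

def sample_crop_center(label, pos_ratio=0.5, rng=None):
    """Pick a crop center: with probability pos_ratio, draw it from
    foreground voxels so the scarce class lands inside the patch.
    (Hypothetical helper for illustration, not MONAI's API.)"""
    rng = np.random.default_rng() if rng is None else rng
    fg = np.argwhere(label > 0)
    if len(fg) > 0 and rng.random() < pos_ratio:
        return tuple(fg[rng.integers(len(fg))])  # center on a foreground voxel
    bg = np.argwhere(label == 0)
    return tuple(bg[rng.integers(len(bg))])      # center on a background voxel
```

With pos_ratio around 0.5 or higher, most sampled patches contain foreground, so the generalized Dice weights rarely hit the infinite/clipped branch in the first place.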


