Improve binary focal loss to trade off recall and precision by adding weights to positive examples, like torch.nn.BCEWithLogitsLoss
🚀 Feature
Improve binary focal loss to trade off recall and precision by adding weights to positive examples, just like the pos_weight parameter in torch.nn.BCEWithLogitsLoss. pos_weight must be a vector with length equal to the number of classes.
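Below is a minimal sketch of how such a weight could be wired into a binary focal loss, assuming a standalone functional implementation; the function name binary_focal_loss_with_pos_weight and its signature are hypothetical illustrations, not kornia's actual API:

```python
from typing import Optional

import torch
import torch.nn.functional as F


def binary_focal_loss_with_pos_weight(
    logits: torch.Tensor,
    targets: torch.Tensor,
    gamma: float = 2.0,
    pos_weight: Optional[torch.Tensor] = None,
    reduction: str = "mean",
) -> torch.Tensor:
    """Binary focal loss with an optional ``pos_weight``.

    ``pos_weight`` rescales the loss of positive examples, mirroring the
    ``pos_weight`` argument of ``torch.nn.BCEWithLogitsLoss``; it should be
    a vector broadcastable to the class dimension of ``targets``.
    """
    # Unreduced per-element BCE, so the focal term can modulate it.
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    probs = torch.sigmoid(logits)
    # p_t is the model's probability for the true class of each element.
    p_t = probs * targets + (1.0 - probs) * (1.0 - targets)
    loss = (1.0 - p_t).pow(gamma) * ce
    if pos_weight is not None:
        # Up-weight positive examples only, as BCEWithLogitsLoss does.
        loss = (pos_weight * targets + (1.0 - targets)) * loss
    if reduction == "mean":
        return loss.mean()
    if reduction == "sum":
        return loss.sum()
    return loss


# Multi-label batch with 3 classes; positives in class 0 are rare, so weight them 5x.
logits = torch.randn(8, 3)
targets = torch.randint(0, 2, (8, 3)).float()
loss = binary_focal_loss_with_pos_weight(
    logits, targets, pos_weight=torch.tensor([5.0, 1.0, 1.0])
)
```

With gamma = 0 the focal term is constant, and the sketch reduces to BCEWithLogitsLoss with pos_weight, which is the behavior the request asks to match.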
Top GitHub Comments
You can also check our less graphical contributor guides: https://github.com/kornia/kornia/blob/master/CONTRIBUTING.rst#developing-kornia
It doesn’t matter. Just go ahead.