
Precision Metric: must have at least one example before it can be computed

See original GitHub issue

🐛 Bug description

When using the precision metric, if the model output does not predict any positives (neither true nor false positives), Ignite throws an error:

ignite.exceptions.NotComputableError: Precision must have at least one example before it can be computed.

I guess this is expected because you cannot compute the precision without any positives. But you would usually just apply an epsilon to the denominator, which makes it computable, and the code even does that. So I don’t understand why an error is thrown when there are no positives in the first place.

Maybe I am wrong here, but my intuition is that this check can just be removed.
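
For context, here is a minimal sketch of the epsilon idea described above, written in plain PyTorch (this is not Ignite’s actual implementation; the variable names are illustrative only):

import torch

eps = 1e-20
y_pred = torch.zeros(4)                    # model predicts no positives at all
y_true = torch.randint(0, 2, (4,))

tp = ((y_pred == 1) & (y_true == 1)).sum().float()   # true positives
fp = ((y_pred == 1) & (y_true == 0)).sum().float()   # false positives
precision = tp / (tp + fp + eps)           # 0.0 instead of a division by zero
print(precision)                           # tensor(0.)

With the epsilon in place, the degenerate “no predicted positives” case simply yields a precision of 0 rather than being uncomputable.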

Environment

  • PyTorch Version: 1.8.1
  • Ignite Version: 0.4.4
  • OS: Linux
  • How you installed Ignite: pip
  • Python version: 3.9

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 15 (2 by maintainers)

Top GitHub Comments

1 reaction
vfdev-5 commented, May 20, 2021

Thanks a lot for sharing! Yes, this looks like a bug…

1 reaction
liebkne commented, May 20, 2021

@vfdev-5

Does this make sense?


In [486]: p = Precision()

In [487]: p.update((torch.zeros(4), torch.randint(0, 2, (4,))))

In [488]: p.compute()
---------------------------------------------------------------------------
NotComputableError                        Traceback (most recent call last)
<ipython-input-488-4dea7f410252> in <module>
----> 1 p.compute()

~/.direnv/python-3.8.7/lib/python3.8/site-packages/ignite/metrics/precision.py in compute(self)
     42         is_scalar = not isinstance(self._positives, torch.Tensor) or self._positives.ndim == 0
     43         if is_scalar and self._positives == 0:
---> 44             raise NotComputableError(
     45                 f"{self.__class__.__name__} must have at least one example before it can be computed."
     46             )

NotComputableError: Precision must have at least one example before it can be computed.

If I update the metric with y_pred = torch.ones(4) instead, then no error is raised.
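
For reference, a self-contained version of the reproduction above (assuming torch and ignite 0.4.4 are installed; exact output may differ):

import torch
from ignite.metrics import Precision

p = Precision()

# All-zero predictions: no positives at all, so compute() raises NotComputableError
p.update((torch.zeros(4), torch.randint(0, 2, (4,))))
# p.compute()  # raises NotComputableError in ignite 0.4.4

# All-one predictions: positives are present, so compute() returns a value
p.reset()
p.update((torch.ones(4), torch.randint(0, 2, (4,))))
print(p.compute())  # a value between 0 and 1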

Read more comments on GitHub >

Top Results From Across the Web

  • Source code for ignite.metrics.precision - PyTorch
    __name__} must have at least one example before it can be computed." ) if not self._is_reduced: self._numerator = idist.all_reduce(self.
  • A Look at Precision, Recall, and F1-Score | by Teemu Kanstrén
    To see what is the F1-score if precision equals recall, we can calculate F1-scores for each point 0.01 to 1.0, with precision =...
  • How to Calculate Precision, Recall, and F-Measure for ...
    In an imbalanced classification problem with two classes, precision is calculated as the number of true positives divided by the total number of...
  • Classification: Precision and Recall | Machine Learning
    A number line from 0 to 1.0 on which 30 examples have been placed. Figure 1. Classifying email messages as spam or not...
  • Precision — PyTorch-Metrics 0.11.0 documentation
    Should be one of the following: global : Additional dimensions are flatted along the batch dimension. samplewise : Statistic will be calculated ...
