
Make optional argmax for y_pred in Confusion Matrix, Precision, Recall, Accuracy


🚀 Feature

Today, the conditions on the input of the Confusion Matrix (and Precision, Recall, Accuracy in the multiclass case) are the following; a usage sketch follows the list:

- `y_pred` must contain logits and have the shape (batch_size, num_categories, ...)
- `y` must have the shape (batch_size, ...) and contain ground-truth class indices, with or without the background class. During the computation, the argmax of `y_pred` is taken to determine the predicted classes.
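For reference, a minimal sketch of how the metric is fed today, with logits of shape (N, C) and integer targets of shape (N,); the tensor values are made up for illustration:

```python
import torch
from ignite.metrics import ConfusionMatrix

# Current behaviour: y_pred carries a class dimension (N, C, ...),
# y carries ground-truth class indices (N, ...); the metric takes
# the argmax over dim=1 internally to get the predicted classes.
cm = ConfusionMatrix(num_classes=3)

y_pred = torch.tensor([[2.0, 0.1, 0.3],   # logits for 2 samples, 3 classes
                       [0.2, 0.5, 1.7]])
y = torch.tensor([0, 2])                  # ground-truth class indices

cm.update((y_pred, y))
print(cm.compute())                       # (3, 3) confusion matrix tensor
```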

Taking the argmax of y_pred should be optional, so that the winning class can be determined by some other rule. Let's keep argmax as the default behaviour when y_pred is (N, C, ...) and not apply it when y_pred.shape == y.shape, i.e. both are (N, ...).
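A rough sketch of the proposed dispatch rule (the helper name is hypothetical, not ignite code): take the argmax only when y_pred carries an extra class dimension, and pass y_pred through untouched when it already matches y.

```python
import torch

def select_predicted_classes(y_pred: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Hypothetical helper illustrating the proposed rule."""
    if y_pred.shape == y.shape:
        # y_pred already holds class indices (N, ...): the user applied their own rule.
        return y_pred
    if y_pred.ndim == y.ndim + 1:
        # y_pred is (N, C, ...): keep argmax over the class dimension as the default.
        return torch.argmax(y_pred, dim=1)
    raise ValueError("y_pred must be (N, C, ...) or have the same shape as y")
```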

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Comments: 9 (7 by maintainers)

Top GitHub Comments

1 reaction
sdesrozis commented, Mar 21, 2020

Ok, I'll do it soon.

1 reaction
vfdev-5 commented, Mar 21, 2020

argmax should be optional and the user should be able to give their own rule.

If y_pred has a C dimension, i.e. is shaped (N, C, ...), there is no way to compute a metric against y_true of shape (N, ...) without taking the argmax. In this case we should take the argmax without an option, IMO.

if y_pred.shape == y.shape and (N, …), do not apply

Yes. In this case, the user can perform the winning-class selection in output_transform or anywhere else before the metric's update.
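Assuming the proposed behaviour were in place (argmax skipped when y_pred.shape == y.shape), a user could apply their own winning-class rule in output_transform; the confidence-threshold rule below is made up purely for illustration:

```python
import torch
from ignite.metrics import ConfusionMatrix

def pick_class(output):
    # output is (y_pred, y), with y_pred of shape (N, C) and y of shape (N,).
    y_pred, y = output
    probs = torch.softmax(y_pred, dim=1)
    winners = probs.argmax(dim=1)
    # Illustrative custom rule: fall back to class 0 when the winner is not
    # confident enough, instead of a plain argmax.
    winners[probs.max(dim=1).values < 0.5] = 0
    # winners now has the same shape as y, so no argmax would be applied.
    return winners, y

cm = ConfusionMatrix(num_classes=3, output_transform=pick_class)
```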

