
Should the Detector IoU Threshold Be the Same as the IoU Threshold for mAP?

See original GitHub issue

Hi, I have a question regarding the detector IoU threshold and the mAP IoU threshold; I think it will be a simple question or just a clarification. I am using a YOLOv3 detector with various confidence thresholds and IoU thresholds and trying to get the mAP for each detector configuration. My goal is to do a grid search over confidence threshold and IoU threshold values, maximizing mAP. My question is: should I set the evaluation IoU threshold (default MINOVERLAP = 0.5) to the same value as the detector threshold?

For example, if I run the detector with IoU thresholds 0.1, 0.3, 0.5, and 0.7, should the IoU threshold in the variable MINOVERLAP be set to 0.1, 0.3, 0.5, and 0.7 accordingly, or does it have to stay at 0.5 to evaluate the detector's performance across the various IoU thresholds?

My assumption is that I should stick with 0.5, since I am measuring the detector's performance across various IoU thresholds; decreasing MINOVERLAP means the mAP will be higher, which would negate the point of assessing performance at the various detector IoU thresholds.
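To make the setup concrete, here is a minimal sketch of that grid search with MINOVERLAP held fixed. The detect and evaluate_map helpers (and the validation data names) are hypothetical placeholders, not functions from YOLOv3 or any particular mAP tool:

```python
import itertools

# Hypothetical helpers, for illustration only:
#   detect(images, conf_thresh, nms_iou_thresh) -> predicted boxes with scores
#   evaluate_map(preds, ground_truth, min_overlap) -> mAP as a float

CONF_THRESHOLDS = [0.1, 0.3, 0.5, 0.7]
DET_IOU_THRESHOLDS = [0.1, 0.3, 0.5, 0.7]
MINOVERLAP = 0.5  # evaluation threshold stays fixed across all runs

best = None
for conf, det_iou in itertools.product(CONF_THRESHOLDS, DET_IOU_THRESHOLDS):
    preds = detect(val_images, conf_thresh=conf, nms_iou_thresh=det_iou)
    score = evaluate_map(preds, val_ground_truth, min_overlap=MINOVERLAP)
    if best is None or score > best[0]:
        best = (score, conf, det_iou)

print(f"best mAP@{MINOVERLAP}: {best[0]:.4f} "
      f"(conf={best[1]}, detector iou={best[2]})")
```

Because MINOVERLAP is held constant, the resulting mAP values are comparable across detector configurations.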

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
janedoesrepo commented, Mar 28, 2021

You’re right. The detector IoU threshold is independent of MINOVERLAP. According to the Pascal VOC papers, it is your responsibility to filter the boxes before the evaluation.
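To illustrate the independence: the detector's IoU threshold (typically the NMS threshold) has already been applied by the time predictions reach the evaluator, and MINOVERLAP is only used afterwards, to decide whether each surviving prediction counts as a true positive. A minimal sketch of the Pascal-VOC-style matching step, assuming an (x1, y1, x2, y2) box format:

```python
def iou(box_a, box_b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def match_detections(preds, gts, min_overlap=0.5):
    """Greedy TP/FP assignment: preds as (box, confidence) pairs,
    visited in descending confidence; each ground-truth box used once."""
    matched = set()
    tp_fp = []  # True for TP, False for FP, in confidence order
    for box, _conf in sorted(preds, key=lambda p: -p[1]):
        ious = [(iou(box, gt), i) for i, gt in enumerate(gts) if i not in matched]
        best_iou, best_i = max(ious, default=(0.0, None))
        if best_iou >= min_overlap:  # MINOVERLAP decides TP vs. FP
            matched.add(best_i)
            tp_fp.append(True)
        else:
            tp_fp.append(False)
    return tp_fp

preds = [((10, 10, 50, 50), 0.9), ((12, 12, 48, 48), 0.6)]
gts = [(11, 11, 49, 49)]
print(match_detections(preds, gts, min_overlap=0.5))  # [True, False]
```

Note that nothing in this step depends on how the boxes were filtered upstream; NMS and MINOVERLAP operate at different stages.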

0 reactions
takehiro-code commented, Jun 7, 2021

Thank you for the insight!


Top Results From Across the Web

IoU a better detection evaluation metric - Towards Data Science
Intersection over Union (IoU) is used when calculating mAP. ... this is repeated with 10 IoU thresholds from 0.5 to 0.95 and averaged....
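As a quick illustration of the COCO-style convention that snippet describes, the final metric is simply the mean of the per-threshold APs; the AP values below are invented for illustration:

```python
import numpy as np

thresholds = np.linspace(0.50, 0.95, 10)  # 0.50, 0.55, ..., 0.95
aps = np.array([0.72, 0.70, 0.67, 0.63, 0.58, 0.51, 0.42, 0.31, 0.18, 0.06])

coco_map = aps.mean()  # AP@[.5:.95] is the plain average over the 10 thresholds
print(f"mAP@0.5 = {aps[0]:.2f}, mAP@[.5:.95] = {coco_map:.3f}")
```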
Mean Average Precision (mAP) Explained: Everything You ...
How to correctly calculate mAP? Average Precision is calculated as the weighted mean of precisions at each threshold; the weight is the increase...
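A small worked example of that definition, with invented precision/recall values (5 ground-truth boxes, 6 predictions in confidence order: TP, FP, TP, TP, FP, TP); each precision is weighted by the increase in recall from the previous point:

```python
# AP as the weighted mean of precisions, weighted by the increase in recall:
#   AP = sum_n (R_n - R_{n-1}) * P_n
# This is the uninterpolated form; Pascal VOC additionally interpolates precision.
recalls    = [0.0, 0.2, 0.2, 0.4, 0.6, 0.6, 0.8]
precisions = [1.0, 1.0, 0.5, 0.67, 0.75, 0.6, 0.67]

ap = sum((r1 - r0) * p
         for r0, r1, p in zip(recalls, recalls[1:], precisions[1:]))
print(f"AP = {ap:.3f}")  # 0.618
```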
mAP (mean Average Precision) for Object Detection
In some datasets, we predefine an IoU threshold (say 0.5) in classifying whether the prediction is a true positive or a false positive....
Mean Average Precision (mAP) Explained - Paperspace Blog
Usually, the object detection models are evaluated with different IoU thresholds where each threshold may give different predictions from the other thresholds.
What is the difference between these error metrics in object ...
The problem with a single IoU threshold is that it means that two predictions of IoU 0.6 and 0.9 would have equal weightage....
