Analysis of YOLOv5-related performance indicators

1. Precision (precision rate)

The proportion of correct predictions among all results that are predicted to be positive samples.

Precision = TP / (TP + FP)

2. Recall (recall rate)

The proportion of all positive samples that are correctly predicted.

Recall = TP / (TP + FN)
                         Actual positive        Actual negative
Predicted positive       True Positive (TP)     False Positive (FP)
Predicted negative       False Negative (FN)    True Negative (TN)
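As a minimal sketch (the function name and counts below are illustrative, not taken from the YOLOv5 code base), both ratios follow directly from the confusion-matrix counts:

```python
def precision_recall(tp: int, fp: int, fn: int):
    """Precision and recall from confusion-matrix counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Illustrative counts: 80 correct detections, 20 false alarms, 10 missed objects
p, r = precision_recall(tp=80, fp=20, fn=10)
print(f"Precision = {p:.3f}, Recall = {r:.3f}")  # 0.800, 0.889
```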

3. PR curve (Precision-Recall)

That is, a curve with Recall as the abscissa (x-axis) and Precision as the ordinate (y-axis).
[Figure: example PR curve]
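A rough sketch of how such a curve is obtained in practice: detections are sorted by confidence, and each prefix of that ranking yields one (Recall, Precision) point. The helper below is illustrative, not the evaluation code used by YOLOv5.

```python
import numpy as np

def pr_curve(confidences, is_tp, n_gt):
    """PR-curve points from a ranked list of detections.

    confidences: confidence score of each detection
    is_tp:       1 if the detection matches a ground-truth box, else 0
    n_gt:        total number of ground-truth boxes
    """
    order = np.argsort(-np.asarray(confidences))                  # highest confidence first
    tp_cum = np.cumsum(np.asarray(is_tp, dtype=float)[order])
    fp_cum = np.cumsum(1.0 - np.asarray(is_tp, dtype=float)[order])
    recall = tp_cum / n_gt
    precision = tp_cum / (tp_cum + fp_cum)
    return recall, precision

recall, precision = pr_curve([0.9, 0.8, 0.7, 0.6], [1, 0, 1, 1], n_gt=4)
print(recall)     # recall    ≈ [0.25, 0.25, 0.5, 0.75]
print(precision)  # precision ≈ [1.0, 0.5, 0.667, 0.75]
```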

4. AP (Average Precision: Area under the PR curve)

AP: average precision

(1) Before VOC2010 (11-point interpolation)

AP = (1/11) ∑ Max(p(r)),    r ∈ {0, 0.1, ..., 1}

r: a recall level
Max(p(r)): the maximum precision at recall level r
(for all points with Recall >= r, take the maximum of their precision values as the precision at point r)
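The 11-point rule can be written out directly. A small sketch with illustrative names, assuming `recall` and `precision` arrays sorted by increasing recall (e.g. from the PR-curve helper above):

```python
import numpy as np

def ap_11point(recall, precision):
    """11-point interpolated AP (pre-VOC2010 rule)."""
    recall, precision = np.asarray(recall), np.asarray(precision)
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 11):        # r in {0, 0.1, ..., 1}
        mask = recall >= r
        p_max = precision[mask].max() if mask.any() else 0.0
        ap += p_max / 11.0                     # average of the 11 interpolated precisions
    return ap
```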

(2) After VOC2010 (all-point interpolation)

AP = ∑ Max(p(r(k))) × (r(k) - r(k-1)),    r ∈ {0, r(1), r(2), ..., r(n), 1}

r(k): the k-th recall value, with recall values sorted in increasing order
Max(p(r(k))): the maximum precision at recall levels >= r(k)
(for all points with Recall >= r(k), take the maximum of their precision values as the precision at point r(k))
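In sketch form (illustrative helper, following the usual VOC-style implementation): the precision curve is first made monotonically non-increasing from right to left, then the AP is the summed area of the rectangles where recall changes.

```python
import numpy as np

def ap_all_points(recall, precision):
    """Area under the interpolated PR curve (post-VOC2010 rule)."""
    # Sentinel values at both ends of the curve
    r = np.concatenate(([0.0], np.asarray(recall), [1.0]))
    p = np.concatenate(([0.0], np.asarray(precision), [0.0]))
    # Interpolation: make precision monotonically non-increasing
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum rectangle areas where the recall value changes
    idx = np.where(r[1:] != r[:-1])[0]
    return float(np.sum((r[idx + 1] - r[idx]) * p[idx + 1]))
```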

5. mAP (mean Average Precision)

mAP: the average of the APs over all categories

mAP = (1/m) ∑ AP(i),    i ∈ {1, 2, ..., m}

m: number of categories
AP(i): the average precision of the i-th category
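For example (the per-class AP values below are made up purely for illustration):

```python
# AP of each category, e.g. from ap_all_points(); the numbers are illustrative
ap_per_class = [0.91, 0.85, 0.78, 0.88]

mAP = sum(ap_per_class) / len(ap_per_class)
print(f"mAP = {mAP:.3f}")   # 0.855
```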

5.1 IoU (Intersection over Union)

IoU, also called the intersection-over-union ratio, is a metric for evaluating how accurate a bounding box is. It is the ratio of the intersection to the union of the detection box and the ground-truth box.
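A minimal IoU computation for axis-aligned boxes in (x1, y1, x2, y2) format (illustrative helper, not the YOLOv5 implementation):

```python
def box_iou(box_a, box_b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    # Intersection rectangle
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    # Union = area(A) + area(B) - intersection
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(box_iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.143
```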

5.1,[email protected](IoU=0.5)

TP: the number of detection boxes with IoU > 0.5 (the same GT is counted only once)
FP: the number of detection boxes with IoU <= 0.5, plus redundant detection boxes that match an already-detected GT.
Therefore, Precision and Recall can be expressed as:

Precision = TP / all detection boxes
Recall = TP / all ground truths
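A sketch of the greedy matching this implies (IoUs are assumed precomputed and the data is illustrative; each ground-truth box can be claimed by only one detection):

```python
def count_tp_fp(detections, iou_thr=0.5):
    """Count TP/FP at a fixed IoU threshold.

    detections: list of (confidence, gt_index, iou) per detection, where gt_index
                is the best-overlapping ground-truth box and iou its overlap.
    """
    detections = sorted(detections, key=lambda d: -d[0])  # highest confidence first
    claimed = set()
    tp = fp = 0
    for conf, gt_idx, iou in detections:
        if iou > iou_thr and gt_idx not in claimed:
            tp += 1
            claimed.add(gt_idx)     # the same GT is counted only once
        else:
            fp += 1                 # IoU too low, or GT already matched (redundant box)
    return tp, fp

# 3 detections over 2 ground-truth boxes (numbers are illustrative)
tp, fp = count_tp_fp([(0.9, 0, 0.8), (0.8, 0, 0.6), (0.7, 1, 0.3)])
print(tp, fp, tp / 2)   # TP=1, FP=2, Recall=0.5
```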

5.2,[email protected]:0.95

Represents the average mAP at different IoU thresholds (from 0.5 to 0.95, step size 0.05).
(0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85, 0.9, 0.95)
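Concretely, the evaluation is repeated at each threshold and the resulting mAPs are averaged; the per-threshold values below are made up for illustration.

```python
import numpy as np

iou_thresholds = np.linspace(0.5, 0.95, 10)    # 0.5, 0.55, ..., 0.95

# mAP obtained at each IoU threshold (illustrative numbers, one per threshold)
map_per_threshold = [0.62, 0.60, 0.58, 0.55, 0.51, 0.46, 0.39, 0.30, 0.19, 0.07]

map_50_95 = float(np.mean(map_per_threshold))
print(f"mAP@0.5:0.95 = {map_50_95:.3f}")
```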

6. F1-score

F1-score = 2 × (Precision × Recall) / (Precision + Recall)
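A one-line sketch (the precision/recall values are reused from the illustrative example above):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

print(f"F1 = {f1_score(0.8, 0.889):.3f}")   # ≈ 0.842
```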

7. GIoU loss / BCEWithLogits loss

7.1 GIoU loss

GIoU loss is used as the bounding-box regression loss: it compares the predicted bounding box A with the ground-truth bounding box B.

GIoU = IoU - |C \ (A∪B)| / |C|,    GIoU loss = 1 - GIoU

C is the smallest box that encloses (A∪B), and C \ (A∪B) is the area of C minus the area of (A∪B).
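Putting the formula above into a small sketch (illustrative helper for axis-aligned (x1, y1, x2, y2) boxes, not the YOLOv5 implementation):

```python
def giou_loss(box_a, box_b):
    """GIoU loss = 1 - GIoU for boxes given as (x1, y1, x2, y2)."""
    # Intersection and union of A and B
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    iou = inter / union
    # C: smallest box enclosing both A and B
    cx1, cy1 = min(box_a[0], box_b[0]), min(box_a[1], box_b[1])
    cx2, cy2 = max(box_a[2], box_b[2]), max(box_a[3], box_b[3])
    area_c = (cx2 - cx1) * (cy2 - cy1)
    giou = iou - (area_c - union) / area_c
    return 1.0 - giou

print(giou_loss((0, 0, 10, 10), (5, 5, 15, 15)))   # ≈ 1.079
```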

7.2 BCEWithLogits loss

Used to compute the objectness-score loss and the class-probability loss; it combines the Sigmoid and BCELoss functions.

BCEWithLogitsLoss = Sigmoid + BCELoss

The per-element calculation formula is:

Loss = -[ y · log(σ(x)) + (1 - y) · log(1 - σ(x)) ]

where x is the raw logit, σ is the Sigmoid function, and y ∈ {0, 1} is the target.
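A minimal PyTorch check of that equivalence (the tensor shapes are illustrative; both criteria are standard classes in torch.nn):

```python
import torch
import torch.nn as nn

logits = torch.randn(4, 80)                      # raw scores, before the sigmoid
targets = torch.randint(0, 2, (4, 80)).float()   # 0/1 targets

# BCEWithLogitsLoss applies the sigmoid internally (numerically more stable)
loss_a = nn.BCEWithLogitsLoss()(logits, targets)
# Equivalent two-step version: Sigmoid followed by BCELoss
loss_b = nn.BCELoss()(torch.sigmoid(logits), targets)

print(torch.allclose(loss_a, loss_b, atol=1e-6))   # True (up to floating-point error)
```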
