Object Detection Evaluation Metrics

Training results are evaluated with precision, recall, accuracy, and Intersection over Union (IoU).

TP (true positive): positive samples correctly predicted as positive.

FP (false positive): negative samples incorrectly predicted as positive.

FN (false negative): positive samples incorrectly predicted as negative.

TN (true negative): negative samples correctly predicted as negative.

In detection, precision is the proportion of true positives among all samples predicted as positive (i.e., among all the images the detector identified):

precision=TP/(TP+FP)

Recall is the ratio of the number of correctly identified positive samples to the number of all positive samples in the test set:

recall=TP/(TP+FN)

accuracy=(TP+TN)/(TP+FP+FN+TN)
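The three formulas above can be sketched as a small helper; the function name and the example counts below are illustrative, not from the original post:

```python
def precision_recall_accuracy(tp, fp, fn, tn):
    """Compute precision, recall, and accuracy from confusion-matrix counts."""
    precision = tp / (tp + fp)          # TP / (TP + FP)
    recall = tp / (tp + fn)             # TP / (TP + FN)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy

# Example with made-up counts: 8 TP, 2 FP, 4 FN, 6 TN.
print(precision_recall_accuracy(8, 2, 4, 6))  # (0.8, 0.666..., 0.7)
```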

IoU measures the overlap between the predicted bounding box and the original labeled box: it is the intersection of the Detection Result and the Ground Truth divided by their union, and reflects detection accuracy:

IoU = (DetectionResult⋂GroundTruth) / (DetectionResult⋃GroundTruth)
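A minimal sketch of this formula for axis-aligned boxes, assuming boxes are given as `(x1, y1, x2, y2)` corner coordinates (a convention chosen here for illustration):

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.142857
```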

mAP is used when detecting multiple classes, and is obtained by averaging the AP of each class. AP (average precision) is the area under the precision-recall curve, i.e., precision averaged over recall. mAP (mean average precision) is the mean of the per-class AP values over all categories.
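One common way to compute AP, sketched here under the "average of precision at each true positive" interpretation (the function name and inputs are assumptions for illustration):

```python
def average_precision(hits, num_gt):
    """AP as the mean of precision values at each correctly retrieved positive.

    hits   -- confidence-sorted list of 1 (TP) / 0 (FP) flags, one per detection
    num_gt -- number of ground-truth positives for this class
    """
    tp = 0
    precision_sum = 0.0
    for rank, hit in enumerate(hits, start=1):
        if hit:
            tp += 1
            precision_sum += tp / rank  # precision at this recall point
    return precision_sum / num_gt

# Example: 5 detections sorted by confidence, 4 ground-truth objects.
ap = average_precision([1, 0, 1, 1, 0], 4)
print(ap)  # (1 + 2/3 + 3/4) / 4 ≈ 0.6042
```

mAP would then simply be the mean of `average_precision` over all object classes.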

Origin www.cnblogs.com/yumoye/p/11022804.html