Confusion matrix and common evaluation metrics in deep learning

Confusion matrix

TP: True Positive — the sample is predicted as positive (Positive) and is actually positive, so the judgment is correct (True).

FP: False Positive — the sample is predicted as positive (Positive) but is actually negative, so the judgment is wrong (False).

FN: False Negative — the sample is predicted as negative (Negative) but is actually positive, so the judgment is wrong (False).

TN: True Negative — the sample is predicted as negative (Negative) and is actually negative, so the judgment is correct (True).

In each name, Positive/Negative refers to the predicted class, while True/False indicates whether that prediction was correct.
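As an illustration (a minimal sketch, not code from the original post; the label lists below are hypothetical), the four counts can be tallied directly from paired ground-truth and predicted labels, with 1 marking the positive class:

```python
# Hypothetical binary labels: 1 = positive, 0 = negative.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # ground truth
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # model predictions

# Tally each cell of the 2x2 confusion matrix.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

print(f"TP={tp} FP={fp} FN={fn} TN={tn}")  # TP=3 FP=1 FN=1 TN=3
```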

Evaluation metrics

Precision — also called the precision rate, is the probability that a sample predicted as positive is actually positive.

Recall — also called the recall rate, is the probability that an actual positive sample is correctly predicted as positive.

Precision:                 Precision = \frac{TP}{TP+FP}                        (correctly predicted positive samples / all samples predicted as positive)

Recall:                    Recall = \frac{TP}{TP+FN}                           (correctly predicted positive samples / all actual positive samples)

F1-Score:                  F1\text{-}Score = \frac{2 \times Precision \times Recall}{Precision + Recall}
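Continuing the hypothetical sketch above, all three metrics follow directly from the four counts (again an illustration, not code from the original post):

```python
# Metrics computed from the confusion-matrix counts tallied earlier.
precision = tp / (tp + fp)  # TP / (TP + FP)
recall = tp / (tp + fn)     # TP / (TP + FN)
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean

print(f"Precision={precision:.3f} Recall={recall:.3f} F1={f1:.3f}")
# Precision=0.750 Recall=0.750 F1=0.750
```

Note that F1 is the harmonic mean of precision and recall, so it is only high when both are high; it penalizes a model that trades one metric heavily for the other.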


Original post: blog.csdn.net/weixin_43852823/article/details/128531937