[Classification evaluation metrics] Confusion matrix and ROC / AUC curve

Confusion matrix basics:

  1. Precision: $Precision = \frac{TP}{TP + FP}$
  2. Recall: $Recall = \frac{TP}{TP + FN}$
  3. F1-Score (the harmonic mean of P and R): $F1 = \frac{2}{\frac{1}{Precision} + \frac{1}{Recall}}$ (a code sketch of these three metrics follows this list)
  4. A confusion matrix diagram to aid understanding (the example in this figure is the recognition of the handwritten digit 5): [figure: confusion matrix diagram]
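A minimal Python sketch of the three metrics above, assuming scikit-learn and NumPy are available; the label arrays here are made-up example data (1 = "is the digit 5", 0 = "not 5") and not from the original article. The manual formulas are checked against scikit-learn's built-in functions.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, precision_score, recall_score, f1_score

# Hypothetical binary labels: 1 = "is the digit 5", 0 = "not 5"
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])

# scikit-learn's confusion matrix layout: rows = true class, columns = predicted class
# [[TN, FP],
#  [FN, TP]]
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

precision = tp / (tp + fp)             # Precision = TP / (TP + FP)
recall = tp / (tp + fn)                # Recall    = TP / (TP + FN)
f1 = 2 / (1 / precision + 1 / recall)  # harmonic mean of P and R

# The manual values agree with scikit-learn's built-in metrics
assert np.isclose(precision, precision_score(y_true, y_pred))
assert np.isclose(recall, recall_score(y_true, y_pred))
assert np.isclose(f1, f1_score(y_true, y_pred))

print(f"Precision = {precision:.3f}, Recall = {recall:.3f}, F1 = {f1:.3f}")
```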

ROC / AUC curve:

  1. The ROC curve is a common evaluation criterion for binary classification models, and is often more informative than the metrics derived from the confusion matrix alone.
  2. The horizontal axis of the plot is the FPR (False Positive Rate) and the vertical axis is the TPR (True Positive Rate).
  3. The plot shows the ROC (Receiver Operating Characteristic) curve; the AUC is the area under that curve.
  4. In general, the closer the AUC is to 1 (the closer the ROC curve is to the upper-left corner), the better; an AUC close to 0.5 (an ROC curve close to the diagonal) means the classifier is barely better than random guessing. A code sketch for computing the curve and the AUC follows this list.
  5. ROC / AUC curve schematic: [figure: ROC/AUC curve schematic]
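A minimal sketch of how the ROC curve and AUC can be computed, assuming scikit-learn; `y_true` and `y_score` are made-up example data, not from the original article. Each threshold on the predicted score gives one (FPR, TPR) point, and sweeping the threshold traces out the ROC curve.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical true labels and predicted positive-class scores (e.g. from predict_proba)
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.5, 0.7, 0.6, 0.3])

# One (FPR, TPR) point per score threshold
fpr, tpr, thresholds = roc_curve(y_true, y_score)

# AUC is the area under that curve: 1.0 is a perfect ranking, 0.5 is random guessing
auc = roc_auc_score(y_true, y_score)
print("FPR:", np.round(fpr, 2))
print("TPR:", np.round(tpr, 2))
print("AUC =", auc)

# Optional: plot the curve with matplotlib
# import matplotlib.pyplot as plt
# plt.plot(fpr, tpr, label=f"AUC = {auc:.2f}")
# plt.plot([0, 1], [0, 1], "--", label="random guess")
# plt.xlabel("False Positive Rate"); plt.ylabel("True Positive Rate"); plt.legend(); plt.show()
```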

For the related Python implementation, see the dedicated code-template article.

Origin blog.csdn.net/weixin_44680262/article/details/104684059