Classification evaluation metrics in machine learning (accuracy, recall, precision, F1, confusion matrix)

Precision and recall

Logistic regression (model): outputs a probability; a sample with probability < 0.5 is classified as a negative example, and one with probability ≥ 0.5 as a positive example.
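As a quick illustration (with hypothetical probabilities, not values from this post), thresholding a logistic model's outputs at 0.5 looks like:

```python
# Hypothetical predicted probabilities from a logistic regression model.
probs = [0.1, 0.4, 0.6, 0.9]

# Probability < 0.5 -> negative example (0); otherwise positive example (1).
labels = [1 if p >= 0.5 else 0 for p in probs]
print(labels)  # [0, 0, 1, 1]
```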

(1) Accuracy formula:

The proportion of all samples (positive and negative) that are predicted correctly.

accuracy = \frac{TP+TN}{TP+TN+FP+FN}
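A minimal sketch of computing accuracy from confusion-matrix counts (the counts here are made up for illustration):

```python
# Hypothetical confusion-matrix counts: true/false positives and negatives.
tp, tn, fp, fn = 40, 45, 5, 10

# Accuracy: correct predictions over all predictions.
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.85
```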

(2) Precision formula:

Of all the samples you predicted as positive, how many are truly positive.

Precision: the number of samples correctly predicted as positive / the number of samples predicted as positive.

precision = \frac{TP}{TP+FP}
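With the same hypothetical counts as above, precision is computed directly from the formula:

```python
# Hypothetical counts: 40 true positives, 5 false positives.
tp, fp = 40, 5

# Precision: correctly predicted positives / all predicted positives.
precision = tp / (tp + fp)
print(round(precision, 3))  # 0.889
```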

 

(3) Recall formula:

Of all the truly positive samples, how many you correctly predicted as positive.

recall = \frac{TP}{TP+FN}
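Again with hypothetical counts, recall follows the same pattern:

```python
# Hypothetical counts: 40 true positives, 10 false negatives.
tp, fn = 40, 10

# Recall: correctly predicted positives / all truly positive samples.
recall = tp / (tp + fn)
print(recall)  # 0.8
```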

There is a tradeoff between precision and recall: raising the classification threshold tends to increase precision but decrease recall, and lowering it does the opposite.
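The tradeoff can be seen by evaluating the same (made-up) labels and probabilities at two thresholds; the helper function below is my own sketch, not from the post:

```python
def precision_recall(y_true, probs, threshold):
    """Compute precision and recall after thresholding probabilities."""
    preds = [1 if p >= threshold else 0 for p in probs]
    tp = sum(1 for t, p in zip(y_true, preds) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, preds) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, preds) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Hypothetical labels and model probabilities.
y = [0, 1, 0, 1]
probs = [0.2, 0.4, 0.6, 0.8]

print(precision_recall(y, probs, 0.3))  # low threshold: higher recall
print(precision_recall(y, probs, 0.7))  # high threshold: higher precision
```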

(4) F1_score (the harmonic mean of precision and recall)

F1_score takes both precision and recall into account; a higher F1_score indicates that precision and recall have reached a good balance.

F1 = \frac{2 \cdot precision \cdot recall}{precision + recall}
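Plugging hypothetical precision and recall values into the harmonic-mean formula:

```python
# Hypothetical precision and recall values.
precision, recall = 0.75, 0.6

# F1: harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # 0.667
```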


Origin blog.csdn.net/sinat_37574187/article/details/132290708