Confusion matrix
The confusion matrix, also known as the error matrix, is a standard format for reporting accuracy evaluation, expressed as a matrix with n rows and n columns. Each column represents a predicted class, and the column total is the number of samples predicted as that class; each row represents the true class, and the row total is the number of samples that actually belong to that class. The cell in row i, column j counts the samples of true class i that were predicted as class j, as in the following example.
| reality \ predict | class 1 | class 2 | class 3 | sum |
|---|---|---|---|---|
| class 1 | 43 | 5 | 2 | 50 |
| class 2 | 4 | 45 | 1 | 50 |
| class 3 | 0 | 1 | 49 | 50 |
| sum | 47 | 51 | 52 | 150 |
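To make the arithmetic below easy to reproduce, here is a minimal NumPy sketch that encodes the table; the array name `cm` and the helper variables are my own choices, not from the original. Later snippets in this section reuse these variables.

```python
import numpy as np

# Confusion matrix from the table above:
# rows = true class, columns = predicted class.
cm = np.array([
    [43,  5,  2],   # true class 1
    [ 4, 45,  1],   # true class 2
    [ 0,  1, 49],   # true class 3
])

tp = np.diag(cm)           # correct predictions per class: [43, 45, 49]
row_sums = cm.sum(axis=1)  # ground-truth count per class:  [50, 50, 50]
col_sums = cm.sum(axis=0)  # predicted count per class:     [47, 51, 52]
total = cm.sum()           # 150
```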
precision:
Class 1: 43/47 = 0.9149
Class 2: 45/51 = 0.8824
Class 3: 49/52 = 0.9423
mean-precision = (0.9149 + 0.8824 + 0.9423)/3 = 0.9132
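A sketch of the same per-class precision computation, reusing the variables from the setup after the table (the naming is mine):

```python
precision = tp / col_sums          # [0.9149, 0.8824, 0.9423]
mean_precision = precision.mean()  # 0.9132
```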
recall:
Class 1: 43/50 = 0.86
Class 2: 45/50 = 0.90
Class 3: 49/50 = 0.98
mean-recall = (0.86 + 0.90 + 0.98)/3 = 0.9133
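And the recall computation in the same sketch, dividing by the ground-truth counts instead of the predicted counts:

```python
recall = tp / row_sums       # [0.86, 0.90, 0.98]
mean_recall = recall.mean()  # 0.9133
```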
iou:
For each class, IoU = TP/(TP + FN + FP).
Class 1: 43/(43 + 7 + 4) = 43/54 = 0.7963
Class 2: 45/(45 + 5 + 6) = 45/56 = 0.8036
Class 3: 49/(49 + 1 + 3) = 49/53 = 0.9245
mean-iou (mIoU) = (0.7963 + 0.8036 + 0.9245)/3 = 0.8415
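The IoU arithmetic in the running sketch; the false positives and false negatives are derived from the column and row totals:

```python
fp = col_sums - tp         # false positives per class: [4, 6, 3]
fn = row_sums - tp         # false negatives per class: [7, 5, 1]
iou = tp / (tp + fn + fp)  # [0.7963, 0.8036, 0.9245]
miou = iou.mean()          # 0.8415
```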
FWiou (Frequency Weighted Intersection over Union):
FWIoU = w1*iou(class 1) + w2*iou(class 2) + w3*iou(class 3), where each weight wi is the ground-truth frequency of class i (row total / overall total)
= (50/150)*0.7963 + (50/150)*0.8036 + (50/150)*0.9245
= 0.8415
(All three classes have 50 ground-truth samples here, so FWIoU coincides with mIoU; with unbalanced classes the two differ.)
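The same weighting in the running sketch, assuming the standard definition that weights each class IoU by its ground-truth frequency:

```python
freq = row_sums / total     # ground-truth frequency per class: [1/3, 1/3, 1/3]
fwiou = (freq * iou).sum()  # 0.8415 (equals miou here because the classes are balanced)
```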
pixel acc(PA):
PA=(43+45+49)/150=0.9133
MPA:
MPA=(43/50+45/50+49/50)/3=0.9133
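Both accuracies in the running sketch; note that MPA is just the mean of the per-class recalls:

```python
pa = tp.sum() / total         # pixel accuracy: 137/150 = 0.9133
mpa = (tp / row_sums).mean()  # mean per-class accuracy = mean recall: 0.9133
```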
F1-score:
F1-score = 2*precision*recall/(precision + recall) = 2*0.9132*0.9133/(0.9132 + 0.9133) = 0.9133
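As a sketch, computed here from the macro-averaged precision and recall to match the numbers above:

```python
f1 = 2 * mean_precision * mean_recall / (mean_precision + mean_recall)  # 0.9133
```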
Kappa:
po = PA = (43 + 45 + 49)/150 = 0.9133
pe = (47*50 + 51*50 + 52*50)/(150*150) = 0.3333
kappa = (po - pe)/(1 - pe) = (0.9133 - 0.3333)/(1 - 0.3333) = 0.8700
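The kappa computation in the running sketch; the chance agreement pe follows from the row and column marginals:

```python
po = tp.sum() / total                        # observed agreement: 0.9133
pe = (row_sums * col_sums).sum() / total**2  # chance agreement: 0.3333
kappa = (po - pe) / (1 - pe)                 # 0.8700
```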
producer acc (PA):
PA = recall = 0.9133 (of the samples that truly belong to a class, the fraction the classifier found)
Overall Acc (OA):
OA = po = 0.9133
user acc (UA):
UA = precision = 0.9132 (of the samples assigned to a class, the fraction that truly belong to it)
Hellden acc (HA):
HA = 2*PA*UA/(PA + UA) = F1-score = 0.9133
Short acc (SA):
SA = PA*UA/(PA + UA - PA*UA) = 0.8404
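The four remote-sensing indices in the running sketch; the variable names ua, pra, ha, and sa are my own:

```python
ua = mean_precision                    # user's accuracy: 0.9132
pra = mean_recall                      # producer's accuracy: 0.9133
ha = 2 * pra * ua / (pra + ua)         # Hellden's accuracy (harmonic mean): 0.9133
sa = pra * ua / (pra + ua - pra * ua)  # Short's accuracy: 0.8404
```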
Misclassification (commission) error:
commission error = 1 - user acc = 1 - 0.9132 = 0.0868
Omission error:
omission error = 1 - producer acc = 1 - 0.9133 = 0.0867
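Finally, the two error rates as complements of the accuracies above, in the running sketch:

```python
commission_error = 1 - ua  # misclassification (commission) error: 0.0868
omission_error = 1 - pra   # omission error: 0.0867
```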