Commonly used evaluation metrics for knowledge graphs: MRR, MR, HITS@K, Recall@K, Precision@K

1. MRR

The full name of MRR is Mean Reciprocal Rank, where "reciprocal" means the multiplicative inverse. The larger this metric, the better (i.e., the higher the correct answer is ranked, the larger its reciprocal, and the larger the resulting sum). It is computed as follows:

$$MRR=\frac{1}{|S|} \sum_{i=1}^{|S|} \frac{1}{rank_i}=\frac{1}{|S|}\left(\frac{1}{rank_1}+\frac{1}{rank_2}+\cdots+\frac{1}{rank_{|S|}}\right)$$

where $S$ is the set of test triples, $|S|$ is the number of triples in the set, and $rank_i$ is the link-prediction rank of the $i$-th triple. For example, for the triple (Jack, born_in, Italy), the link-prediction results might be:

| h    | r       | t       | score | rank |
|------|---------|---------|-------|------|
| Jack | born_in | Ireland | 0.789 | 1    |
| Jack | born_in | Italy   | 0.753 | 2    |
| Jack | born_in | Germany | 0.695 | 3    |
| Jack | born_in | China   | 0.456 | 4    |
| Jack | born_in | Thomas  | 0.234 | 5    |

Then, the link-prediction rank of the triple (Jack, born_in, Italy) is 2.
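As a concrete sketch (the function name and the sample ranks are illustrative, not from the original post), MRR can be computed from a list of link-prediction ranks like this:

```python
def mean_reciprocal_rank(ranks):
    """Average of 1/rank over all test triples."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Hypothetical example: the correct entity was ranked 2nd, 1st, and 4th
# for three test triples.
ranks = [2, 1, 4]
print(mean_reciprocal_rank(ranks))  # (1/2 + 1/1 + 1/4) / 3 ≈ 0.583
```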

2. MR

The full name of MR is Mean Rank. The smaller this metric, the better (the higher the correct answer is ranked, the smaller its rank, and the smaller the resulting sum). It is computed as follows:
$$MR=\frac{1}{|S|} \sum_{i=1}^{|S|} rank_i=\frac{1}{|S|}\left(rank_1+rank_2+\cdots+rank_{|S|}\right)$$
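A minimal sketch of MR over the same kind of rank list (function name and sample ranks are illustrative):

```python
def mean_rank(ranks):
    """Average link-prediction rank over all test triples; lower is better."""
    return sum(ranks) / len(ranks)

# Hypothetical example: ranks 2, 1, and 4 for three test triples.
print(mean_rank([2, 1, 4]))  # (2 + 1 + 4) / 3 ≈ 2.33
```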

3. HITS@K

Top-k recommendation returns the top k results of the final recommendation list sorted by score. HITS@K is the average fraction of triples whose link-prediction rank is at most k. It is computed as follows:

$$HITS@K=\frac{1}{|S|} \sum_{i=1}^{|S|} \mathbb{I}(rank_i\le k)$$

The symbols here are the same as in the MRR formula; in addition, $\mathbb{I}(\cdot)$ is the indicator function (its value is 1 if the condition holds, 0 otherwise). Typically k is 1, 3, or 10. The larger this metric, the better.
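A short sketch of HITS@K under the same assumptions as above (illustrative names and ranks):

```python
def hits_at_k(ranks, k):
    """Fraction of test triples whose rank is at most k."""
    return sum(1 for r in ranks if r <= k) / len(ranks)

# Hypothetical example: ranks 2, 1, and 4 for three test triples.
ranks = [2, 1, 4]
print(hits_at_k(ranks, 1))   # only rank 1 qualifies: 1/3
print(hits_at_k(ranks, 3))   # ranks 2 and 1 qualify: 2/3
print(hits_at_k(ranks, 10))  # all qualify: 1.0
```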

4. Recall@K, Precision@K

First, you need to understand the meaning of Recall and Precision; for details, see a reference on the confusion matrix for classification. These two metrics are analogous to HITS@K.

Precision@K is the ratio of the number of relevant results among the top K retrieved results to the number of retrieved results (K); it measures the precision of the retrieval system. Recall@K is the ratio of the number of relevant results among the top K retrieved results to the total number of relevant results in the database; it measures the recall of the retrieval system.
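These two ratios can be sketched as follows (the document IDs and relevance set are hypothetical, made up for illustration):

```python
def precision_at_k(retrieved, relevant, k):
    """Relevant items in the top k, divided by k."""
    hits = sum(1 for item in retrieved[:k] if item in relevant)
    return hits / k

def recall_at_k(retrieved, relevant, k):
    """Relevant items in the top k, divided by all relevant items."""
    hits = sum(1 for item in retrieved[:k] if item in relevant)
    return hits / len(relevant)

# Hypothetical ranked retrieval list and ground-truth relevant set.
retrieved = ["d1", "d2", "d3", "d4", "d5"]
relevant = {"d2", "d4", "d7"}
print(precision_at_k(retrieved, relevant, 5))  # d2, d4 hit: 2/5
print(recall_at_k(retrieved, relevant, 5))     # 2 of 3 relevant found: 2/3
```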

Reference links

  1. KGE performance metrics: MRR, MR, HITS@1, HITS@3, HITS@10
  2. Understanding the evaluation metric Recall@K, with worked examples
  3. Knowledge graphs: common evaluation metrics MRR, MR, and HITS@n
  4. Evaluation methods such as MRR and MAP (commonly used for IR and QA tasks)

Source: blog.csdn.net/l8947943/article/details/127842338