KL divergence (Kullback-Leibler divergence, also known as relative entropy) Definition: A measure of how much one probability distribution Q(x) differs from another P(x), denoted D(Q||P).
(1) For discrete random variables, the KL divergence is defined as
D(Q||P) = \sum_x Q(x) \log \frac{Q(x)}{P(x)}
(2) For continuous random variables, the KL divergence is defined as
D(Q||P) = \int Q(x) \log \frac{Q(x)}{P(x)} \, dx
It is easy to prove that the KL divergence has the property D(Q||P) \ge 0, with D(Q||P) = 0 if and only if Q = P.
In fact, using Jensen's inequality (since \log is concave, E[\log X] \le \log E[X]):
-D(Q||P) = \sum_x Q(x) \log \frac{P(x)}{Q(x)} \le \log \sum_x Q(x) \frac{P(x)}{Q(x)} = \log \sum_x P(x) = \log 1 = 0,
hence D(Q||P) \ge 0.
KL divergence is asymmetric (in general D(Q||P) \ne D(P||Q)) and does not satisfy the triangle inequality, so it is not a distance metric in the strict sense.
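The discrete definition and the properties above can be sketched in a few lines of Python. This is a minimal illustration, not from the original text; the function name and the example distributions are chosen for demonstration:

```python
import math

def kl_divergence(q, p):
    """D(Q||P) = sum_x Q(x) * log(Q(x) / P(x)) for discrete distributions.

    By convention, terms with Q(x) = 0 contribute 0; if P(x) = 0 while
    Q(x) > 0, the divergence is +infinity.
    """
    total = 0.0
    for qx, px in zip(q, p):
        if qx == 0.0:
            continue  # 0 * log(0 / p) is taken as 0
        if px == 0.0:
            return math.inf  # Q puts mass where P has none
        total += qx * math.log(qx / px)
    return total

q = [0.5, 0.5]
p = [0.9, 0.1]

print(kl_divergence(q, p))                         # positive: Q and P differ
print(kl_divergence(q, q))                         # 0: equality iff Q = P
print(kl_divergence(q, p) == kl_divergence(p, q))  # False: asymmetric
```

Note how the last line illustrates the asymmetry: swapping the arguments changes the value, which is why D(Q||P) is not a distance in the strict sense.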