Understanding KL Divergence
Relative entropy, also known as Kullback-Leibler (KL) divergence or information divergence, is an asymmetric measure of the difference between two probability distributions. For discrete distributions P and Q it is defined as D_KL(P‖Q) = Σ p(x) log(p(x)/q(x)). In information theory, relative entropy equals the difference between the cross-entropy of P and Q and the Shannon entropy of P: D_KL(P‖Q) = H(P, Q) − H(P). Because the roles of P and Q are not interchangeable, D_KL(P‖Q) ≠ D_KL(Q‖P) in general.
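The definition above can be sketched in a few lines of NumPy; this is a minimal illustration (the function name `kl_divergence` and the example distributions are chosen here for illustration), not a reference implementation:

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for discrete distributions given as arrays of probabilities."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # By convention 0 * log(0) = 0, so skip entries where p(x) = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.5]
q = [0.9, 0.1]
# Asymmetry: D_KL(P || Q) and D_KL(Q || P) generally differ.
print(kl_divergence(p, q))
print(kl_divergence(q, p))
```

Running this shows the two directions give different values, which is exactly the asymmetry noted above; D_KL(P‖P) is always zero, and the divergence is nonnegative.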
Reference: The meaning and properties of KL divergence