[Machine Learning] Understanding KL Divergence (KL-divergence)


Relative entropy, also known as Kullback-Leibler divergence (KL divergence) or information divergence, is an asymmetric measure of the difference between two probability distributions P and Q. In information theory, the relative entropy D_KL(P‖Q) equals the cross-entropy of P with respect to Q minus the Shannon entropy of P; intuitively, it is the extra information needed, on average, to encode samples drawn from P using a code that is optimal for Q.
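As a concrete illustration of the definition D_KL(P‖Q) = Σᵢ P(i) log(P(i)/Q(i)), here is a minimal NumPy sketch. The two distributions p and q are made-up examples (not taken from the referenced articles); the script also checks the asymmetry D_KL(P‖Q) ≠ D_KL(Q‖P) and the cross-entropy identity mentioned above.

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats (natural log)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only terms with p_i > 0 contribute; requires q_i > 0 wherever p_i > 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Two example discrete distributions over the same three outcomes
# (the specific numbers are illustrative only).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))   # D_KL(P || Q)
print(kl_divergence(q, p))   # D_KL(Q || P): generally a different value (asymmetry)

# Check the identity D_KL(P || Q) = H(P, Q) - H(P),
# i.e. cross-entropy of P w.r.t. Q minus the Shannon entropy of P.
cross_entropy = -np.sum(np.asarray(p) * np.log(q))
entropy_p = -np.sum(np.asarray(p) * np.log(p))
print(cross_entropy - entropy_p)  # matches kl_divergence(p, q)
```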

Reference: The meaning and properties of KL divergence

Applications of KL Divergence

Reference: Advanced detailed explanation of KL divergence
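The referenced article covers applications in depth. As one common use (a hedged example, not drawn from that article), KL divergence is often used to quantify how far an observed empirical distribution has drifted from a reference distribution, for instance when monitoring model inputs. A minimal sketch using scipy.stats.entropy, which returns the relative entropy when a second distribution is passed; the category probabilities below are hypothetical:

```python
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(0)

# Hypothetical example: compare an observed histogram against a uniform
# reference distribution over 6 categories, e.g. to detect distribution drift.
reference = np.full(6, 1 / 6)                                   # reference distribution Q
samples = rng.choice(6, size=1000, p=[0.25, 0.2, 0.2, 0.15, 0.1, 0.1])
observed = np.bincount(samples, minlength=6) / samples.size     # empirical P

# scipy.stats.entropy(p, q) returns D_KL(P || Q) in nats.
drift = entropy(observed, reference)
print(f"D_KL(observed || reference) = {drift:.4f} nats")
```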


Origin blog.csdn.net/weixin_43693967/article/details/127431032