What is information entropy?

Entropy is originally a concept from physics: it is a quantitative measure of how disordered the state of a system is. The entropy of a system is tied to its degree of disorder: the more chaotic the system, the greater its entropy, and the more orderly the system, the smaller its entropy.

Information entropy

Information entropy is a measure of the uncertainty of information: the greater the uncertainty of the information, the greater the entropy.

How is entropy calculated?

This is related to information theory, specifically to the problem of encoding information: to represent all of the information, how many codes are needed in total, or in other words, how many binary digits?
Shannon's theoretical answer is:
\[ L(x) = \log_2 \frac{1}{p(x)} \]
where \(L(x)\) is the number of binary bits needed to represent the outcome \(x\), and \(p(x)\) is the probability that \(x\) occurs.
Using this formula, the number of binary bits required for an outcome of a given probability can be calculated.
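As a small illustration (not code from the original post), the Python sketch below simply evaluates this formula for a few probabilities; the function name `code_length` is a placeholder of my own choosing.

```python
import math

def code_length(p):
    """Bits needed to encode an outcome of probability p: L(x) = log2(1/p(x))."""
    return math.log2(1 / p)

# An outcome with probability 1/2 needs 1 bit, 1/4 needs 2 bits, 1/8 needs 3 bits.
for p in (0.5, 0.25, 0.125):
    print(f"p = {p}: L = {code_length(p):.2f} bits")
```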
Once the number of binary bits for each outcome is known, the average code length of the information is
\[ H(X) = \sum_{x} p(x) \, L(x) \]
This \(H(X)\) is the optimal average code length, which is exactly the information entropy.
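To make the averaging step concrete, here is a minimal Python sketch (my own illustration, assuming the probabilities of all outcomes are given as a list; `entropy` is a hypothetical helper name):

```python
import math

def entropy(probabilities):
    """Average optimal code length in bits: H = sum over x of p(x) * log2(1/p(x))."""
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

# A fair coin (two equally likely outcomes) has entropy 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is less uncertain, so its entropy is lower.
print(entropy([0.9, 0.1]))   # about 0.47
```

The fair coin gives exactly 1 bit per toss, while the biased coin gives less, matching the idea that lower uncertainty means lower entropy.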

