Information theory learning notes: N-th extension sources, average symbol entropy, and limit entropy

Entropy of the N-th extension source

The entropy of the N-th extension source X^N of a discrete memoryless source X is equal to N times the entropy of the source X, that is, H(X^N) = N·H(X).
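As a quick numerical check (a minimal sketch; the three-symbol alphabet and its probabilities below are made up for illustration and are not from the original post), the joint entropy of length-N blocks from a memoryless source comes out to exactly N times the single-symbol entropy:

```python
# Minimal sketch: for a discrete memoryless source, the entropy of the
# N-th extension source equals N times the single-symbol entropy.
from itertools import product
from math import log2

p = {"a": 0.5, "b": 0.3, "c": 0.2}   # hypothetical source alphabet and probabilities
N = 3

def entropy(dist):
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# The N-th extension treats each block of N symbols as one super-symbol;
# memorylessness means each block probability is a product of symbol probabilities.
ext = {block: 1.0 for block in product(p, repeat=N)}
for block in ext:
    for sym in block:
        ext[block] *= p[sym]

print(entropy(p))        # H(X)     ≈ 1.485 bits/symbol
print(entropy(ext))      # H(X^N)   ≈ 4.456 bits/block
print(N * entropy(p))    # N * H(X) ≈ 4.456, matching H(X^N)
```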

Average Symbol Entropy

Denotes the average amount of information carried by each symbol in an output sequence of length N, i.e. the average information provided per symbol when the source emits N symbols. It is used to evaluate the average amount of information provided per transmitted symbol by a discrete stationary source with memory.
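In symbols (the standard definition, stated here for completeness), the average symbol entropy of an N-long output sequence is the joint entropy of the sequence divided by N:

```latex
H_N(X) = \frac{1}{N}\, H(X_1 X_2 \cdots X_N)
```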

1. The conditional entropy H(X_N | X_1 X_2 ... X_{N-1}) is non-increasing as N increases.

2. For a given N, the average symbol entropy is greater than or equal to the conditional entropy: H_N(X) ≥ H(X_N | X_1 X_2 ... X_{N-1}).

3. The average symbol entropy H_N(X) is non-increasing as N increases (all three properties are illustrated numerically in the sketch below).
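The following is a minimal sketch of these three properties on a hypothetical binary source with memory: a first-order Markov chain started from its stationary distribution (the transition matrix is made up for illustration). For each N it prints H_N(X) and H(X_N | X_1 ... X_{N-1}); both are non-increasing, and the average symbol entropy never falls below the conditional entropy.

```python
# Sketch: average symbol entropy vs. conditional entropy for a binary
# first-order Markov chain started from its stationary distribution.
from itertools import product
from math import log2

P = [[0.9, 0.1],   # hypothetical transition matrix, P[i][j] = Pr(next = j | current = i)
     [0.4, 0.6]]
pi = [0.8, 0.2]    # stationary distribution of P (check: 0.8 * 0.1 == 0.2 * 0.4)

def joint_entropy(N):
    """H(X_1 X_2 ... X_N), computed by enumerating all 2^N sequences."""
    h = 0.0
    for seq in product(range(2), repeat=N):
        prob = pi[seq[0]]
        for a, b in zip(seq, seq[1:]):
            prob *= P[a][b]
        h -= prob * log2(prob)
    return h

prev = 0.0
for N in range(1, 9):
    H_joint = joint_entropy(N)
    cond = H_joint - prev        # chain rule: H(X_N|X_1...X_{N-1}) = H(X_1..X_N) - H(X_1..X_{N-1})
    avg = H_joint / N            # average symbol entropy H_N(X)
    print(N, round(avg, 4), round(cond, 4))
    prev = H_joint
```

For this memory-1 chain the conditional entropy already settles at its limiting value (≈ 0.5694 bits) from N = 2 onward, while H_N(X) keeps decreasing toward that same value; this previews the limit entropy discussed next.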

Limit Entropy

The source output is a symbol sequence of length N. As N → ∞, the limit entropy H∞ is

H∞ = lim(N→∞) H_N(X) = lim(N→∞) H(X_N | X_1 X_2 ... X_{N-1})

Limit information (entropy rate)

That is, it measures the ability of a discrete stationary source with memory, X = X_1 X_2 ... X_N, to provide information.

Meaning: for a discrete stationary source, as N → ∞ (that is, as the dependency length grows without bound), both the average symbol entropy and the conditional entropy decrease monotonically (non-increasingly) to the same limit, the information entropy of the stationary source (the limit entropy).

In practice, the conditional entropy for a finite N is often taken as an approximation of H∞, because a value very close to H∞ is already obtained when N is not very large, typically N = 7, 8, or 9.

When a stationary source has a finite memory length, as with a Markov source, let the memory length be m (that is, the symbol emitted at any time depends only on the preceding m symbols). Then the limit entropy of the discrete stationary source equals the conditional entropy with memory length m: H∞ = H(X_{m+1} | X_1 X_2 ... X_m).
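A minimal sketch under that statement, reusing the hypothetical binary chain from the earlier example: with memory length m = 1, the limit entropy is just the one-step conditional entropy H(X_2 | X_1), computed directly from the stationary distribution and the transition matrix.

```python
# For a first-order (m = 1) Markov source, the limit entropy equals the
# conditional entropy of memory length 1:
#   H_inf = sum_i pi_i * sum_j ( -P[i][j] * log2 P[i][j] )
from math import log2

P = [[0.9, 0.1],   # same hypothetical transition matrix as above
     [0.4, 0.6]]
pi = [0.8, 0.2]    # its stationary distribution

H_inf = sum(pi[i] * sum(-p * log2(p) for p in P[i] if p > 0) for i in range(2))
print(round(H_inf, 4))   # ≈ 0.5694, the value the conditional entropies above converge to
```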


Source: blog.csdn.net/yyfloveqcw/article/details/124273069