A Summary of Discrete Sources in Information Theory and Coding

Contents

Foreword:

1. Mathematical model and classification of information sources

1. Mathematical model

2. Classification

2. N-fold expansion of discrete memoryless sources

1. Mathematical model

3. Discrete stationary source (emphasis: definition, joint entropy, conditional entropy, average symbol entropy, limiting entropy)

1. Discrete stationary source

4. Entropy of the two-dimensional discrete stationary source (emphasis: definition, calculation of the various entropies)

1. Mathematical model of the two-dimensional stationary source with memory X=X1X2:

2. The joint entropy of X=X1X2

3. The principle that conditioning does not increase entropy

4. Average symbol entropy

5. Calculation of H(X1X2) and H(X2|X1) for the two-dimensional stationary source

5. Limiting entropy of discrete stationary sources

1. The source space of N-dimensional discrete stationary sources

2. Joint entropy of N-dimensional discrete stationary source with memory (emphasis: the expression and meaning of the chain rule of entropy)

3. Properties of the entropy of discrete stationary sources (emphasis: mastering formulas, meanings, proofs) 

6. Limiting entropy of discrete stationary sources (emphasis: concepts, calculations)

7. Redundancy of the source (concept, calculation)


Foreword:

Four basic concepts:

One diagram: the Venn diagram of entropy relations

One basic principle: obtaining information is the process of reducing uncertainty

1. Mathematical model and classification of information sources

1. Mathematical model

Single-symbol source:

Multi-symbol source:
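The two models named above can be written out explicitly; the original formulas did not survive extraction, so the following is a reconstruction in standard textbook notation:

```latex
% Single-symbol discrete source: a complete probability space over the alphabet
\begin{bmatrix} X \\ P \end{bmatrix}
= \begin{bmatrix} a_1 & a_2 & \cdots & a_q \\
                  P(a_1) & P(a_2) & \cdots & P(a_q) \end{bmatrix},
\qquad \sum_{i=1}^{q} P(a_i) = 1, \quad 0 \le P(a_i) \le 1

% Multi-symbol source: a random sequence (vector) of such symbols
\mathbf{X} = X_1 X_2 \cdots X_N, \qquad X_i \in \{a_1, a_2, \dots, a_q\}
```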

2. Classification

 

2. N-fold expansion of discrete memoryless sources

1. Mathematical model

2. The entropy of the N-fold extended source:
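For a memoryless source, the entropy of the N-fold extension equals N times the single-symbol entropy, H(X^N) = N·H(X). A minimal numerical check (the alphabet and probabilities below are illustrative assumptions):

```python
from itertools import product
from math import log2, prod

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A hypothetical memoryless source with alphabet {a, b, c}.
p = {"a": 0.5, "b": 0.25, "c": 0.25}
H = entropy(p.values())  # single-symbol entropy: 1.5 bits

# N-fold extension: blocks of N independent symbols; each block's
# probability is the product of its per-symbol probabilities.
N = 3
block_probs = [prod(block) for block in product(p.values(), repeat=N)]
H_N = entropy(block_probs)

print(H, H_N)  # H_N should equal N * H
```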

 

3. Discrete stationary source (emphasis: definition, joint entropy, conditional entropy, average symbol entropy, limiting entropy)

1. Discrete stationary source

One-dimensional stationary: the single-symbol distribution is the same at any two moments t=i and t=j (i, j arbitrary integers).

Two-dimensional stationary: the joint distribution of two adjacent symbols is the same at any two moments t=i and t=j (i, j arbitrary integers).

Discrete stationary source: the joint distribution of any N adjacent symbols is time-invariant, for every N.
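The stationarity conditions can be stated explicitly; since the original formulas were images, the following is a reconstruction in standard notation:

```latex
% One-dimensional stationarity: the single-symbol distribution is time-invariant
P(X_i = a_k) = P(X_j = a_k) \quad \text{for arbitrary integers } i, j

% Two-dimensional stationarity: joint distributions of adjacent pairs are time-invariant
P(X_i X_{i+1} = a_k a_l) = P(X_j X_{j+1} = a_k a_l)

% Discrete (completely) stationary source: every N-dimensional joint
% distribution is time-invariant
P(X_i X_{i+1} \cdots X_{i+N-1}) = P(X_j X_{j+1} \cdots X_{j+N-1})
\quad \text{for every } N
```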

4. Entropy of the two-dimensional discrete stationary source (emphasis: definition, calculation of the various entropies)

Two-dimensional stationary source: dependencies exist only between two adjacent symbols in the random sequence output by the source.

1. Mathematical model of the two-dimensional stationary source with memory X=X1X2:

2. The joint entropy of X=X1X2

Meaning: The average amount of information provided by the two-dimensional discrete stationary source with memory X=X1X2 per emitted message (two symbols) equals the average information H(X1) provided per symbol at the first (initial) moment, determined by the initial probability space, plus the average conditional information H(X2|X1) provided per symbol at the second moment, given the symbol emitted at the first moment: H(X1X2) = H(X1) + H(X2|X1).

3. The principle that conditioning does not increase entropy

Meaning: The average amount of information provided by the two-dimensional discrete stationary source with memory X=X1X2 per emitted message (two symbols) is never greater than the average amount provided per message by the memoryless extension X²=X1X2: H(X1X2) ≤ H(X²) = 2H(X), with equality only when the two symbols are independent.

4. Average symbol entropy

Meaning: indicates the average amount of information provided per symbol by a source sequence of length N.

Meaning: The average amount of information provided per symbol by the discrete stationary source with memory X=X1X2 must be less than the average amount provided per symbol by the discrete memoryless source X²=X1X2.
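The average symbol entropy referred to above can be written out (a reconstruction in standard notation, since the original formula did not survive):

```latex
% Average symbol entropy of a length-N source sequence
H_N(\mathbf{X}) = \frac{1}{N}\, H(X_1 X_2 \cdots X_N)

% For the two-dimensional case (N = 2) the statement above reads
H_2(\mathbf{X}) = \tfrac{1}{2}\, H(X_1 X_2) \le H(X)
```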

5. Calculation of H(X1X2) and H(X2|X1) for the two-dimensional stationary source
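A worked sketch of this calculation, starting from a joint distribution P(x1, x2); the distribution below is an illustrative assumption, not one from this summary:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical joint distribution P(x1, x2) over the alphabet {0, 1}.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginal distributions of X1 and X2.
p1 = {0: joint[(0, 0)] + joint[(0, 1)], 1: joint[(1, 0)] + joint[(1, 1)]}
p2 = {0: joint[(0, 0)] + joint[(1, 0)], 1: joint[(0, 1)] + joint[(1, 1)]}

H_X1X2 = entropy(joint.values())       # joint entropy H(X1X2)
H_X1 = entropy(p1.values())
H_X2 = entropy(p2.values())
H_X2_given_X1 = H_X1X2 - H_X1          # chain rule: H(X2|X1) = H(X1X2) - H(X1)

# Conditioning does not increase entropy: H(X2|X1) <= H(X2),
# hence H(X1X2) <= H(X1) + H(X2), the memoryless case.
print(H_X1X2, H_X2_given_X1)
```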

5. Limiting entropy of discrete stationary sources

1. The source space of N-dimensional discrete stationary sources

2. Joint entropy of N-dimensional discrete stationary source with memory (emphasis: the expression and meaning of the chain rule of entropy)

Meaning: The average amount of information provided by the N-dimensional discrete stationary source with memory X=X1X2...XN per emitted message (x1x2...xN) equals the average information H(X1) provided per symbol at the initial moment, plus the average conditional information H(X2|X1) provided per symbol at the second moment given the symbol emitted at the initial moment, plus the conditional entropy H(X3|X1X2) provided per symbol at the third moment given the symbols emitted at the first and second moments, and so on, finally adding the conditional entropy H(XN|X1X2...XN-1) given the symbols of the first N-1 moments. This is the chain rule of entropy: H(X1X2...XN) = H(X1) + H(X2|X1) + H(X3|X1X2) + ... + H(XN|X1X2...XN-1).

3. Properties of the entropy of discrete stationary sources (emphasis: mastering formulas, meanings, proofs) 

The conditional entropy of each dimension, H(XN|X1X2...XN-1), is non-increasing as N increases

For a given N, the average symbol entropy is greater than or equal to the conditional entropy: HN(X) ≥ H(XN|X1X2...XN-1)

The average symbol entropy is non-increasing as N increases, i.e. HN(X) ≤ HN-1(X)

6. Limiting entropy of discrete stationary sources (emphasis: concepts, calculations)

Explanation: For discrete stationary sources, as N→∞ (that is, when the dependency length is unbounded), both the average symbol entropy and the conditional entropy tend, non-increasingly, to the information entropy of the stationary source: the limiting entropy.

Significance: For a discrete stationary source with memory X whose memory length N is long enough (N→∞), the average amount of information provided per emitted symbol, the limiting entropy H∞, equals the limit of the conditional entropy: H∞ = lim N→∞ H(XN|X1X2...XN-1).

Note: In practice, the conditional entropy at finite N (e.g. N = 7, 8, 9) is often taken as an approximation of H∞.

When the memory length of a stationary source is finite, as for a Markov source with memory length m, the limiting entropy can be calculated directly: H∞ = H(Xm+1|X1X2...Xm).
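For a first-order Markov source (memory length m = 1), the limiting entropy is the conditional entropy of the next symbol given the current state, averaged over the stationary distribution: H∞ = Σi πi H(row i of the transition matrix). A small sketch with an assumed binary transition matrix:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical first-order binary Markov source.
# P[i][j] = P(next symbol = j | current symbol = i); values are assumptions.
P = [[0.9, 0.1],
     [0.2, 0.8]]

# Stationary distribution pi solves pi = pi P; for a 2-state chain,
# pi_0 = P[1][0] / (P[0][1] + P[1][0]).
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi = [pi0, 1 - pi0]

# Limiting entropy: average conditional entropy of the next symbol
# given the current state, weighted by the stationary distribution.
H_inf = sum(pi[i] * entropy(P[i]) for i in range(2))
print(H_inf)
```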

Limiting entropy of the two-dimensional stationary source with memory X=X1X2

7. Redundancy of the source (concept, calculation)

Relative entropy rate:

Source redundancy:
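The two quantities above are defined as η = H∞/H0 (relative entropy rate) and γ = 1 - η (redundancy), where H0 = log2 q is the maximum entropy of a q-symbol alphabet. A worked sketch; the numbers are illustrative assumptions (English text with q = 26 and a commonly cited estimate H∞ ≈ 1.4 bits/symbol), not values from this summary:

```python
from math import log2

q = 26                 # assumed alphabet size (English letters)
H_0 = log2(q)          # maximum entropy, about 4.70 bits/symbol
H_inf = 1.4            # assumed limiting entropy, bits/symbol

eta = H_inf / H_0      # relative entropy rate
gamma = 1 - eta        # source redundancy
print(eta, gamma)      # a high redundancy means the text is compressible
```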


Origin blog.csdn.net/yyfloveqcw/article/details/124391342