Summary of Information Theory (1)

Some degree of distortion can be allowed during transmission.

The error probability depends not only on the channel but also on the encoding and decoding rules.

The decoding rule is to choose, for each received symbol, the input with the maximum posterior probability (MAP decoding); the probability of correct decoding then follows from this choice.

Then Pe = 1 − Σⱼ P[F(yⱼ)|yⱼ]·p(yⱼ) gives the error probability, where F(yⱼ) denotes the decoded input for the received symbol yⱼ.

Equivalently, Pe can be computed directly by summing the joint probabilities p(x, y) over all pairs (x, y) for which the decoding is incorrect.
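As a small sketch of the two equivalent Pe computations above (the input distribution and channel matrix below are made-up numbers for illustration, not values from the text):

```python
# Minimal sketch of MAP decoding and error probability for a discrete channel.
# p_x and p_y_given_x are illustrative assumptions, not values from the text.
p_x = [0.6, 0.4]                      # input distribution p(x) for x0, x1
p_y_given_x = [[0.9, 0.1],            # transition probabilities p(y|x0)
               [0.2, 0.8]]            # transition probabilities p(y|x1)

# Joint probabilities p(x, y) = p(x) * p(y|x)
joint = [[p_x[i] * p_y_given_x[i][j] for j in range(2)] for i in range(2)]

# MAP decoding: F(y) = argmax_x p(x|y), equivalently argmax_x p(x, y)
F = [max(range(2), key=lambda i: joint[i][j]) for j in range(2)]

# Pe = 1 - sum_j p(F(y_j), y_j): one minus the probability of correct decisions
pe_from_correct = 1 - sum(joint[F[j]][j] for j in range(2))

# Equivalently: sum the joint probabilities of all incorrectly decoded pairs
pe_direct = sum(joint[i][j] for i in range(2) for j in range(2) if i != F[j])

print(F, pe_from_correct, pe_direct)  # both computations give the same Pe
```

Both routes give the same Pe because the correct and incorrect (x, y) pairs partition the whole joint distribution.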

Choosing the coding and decoding rules alone has a limited effect on accuracy; the error probability ultimately depends on the channel's transition characteristics.

Explanation: since the channel is described by the transition probabilities p(y|x), determining the posterior probabilities p(x|y) is troublesome (it requires Bayes' rule and the input distribution), so the maximum likelihood decoding rule is introduced.

The maximum likelihood decoding rule chooses the input x that maximizes p(y|x); note that it coincides with MAP decoding (and is therefore optimal) only when the input symbols are equiprobable.
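A toy comparison of the two rules (all numbers are illustrative assumptions): ML decoding maximizes p(y|x) alone, so it agrees with MAP when the inputs are equiprobable but can differ when they are skewed.

```python
# ML vs. MAP decoding on a toy binary channel (illustrative numbers only).
p_y_given_x = [[0.7, 0.3],   # p(y|x0)
               [0.4, 0.6]]   # p(y|x1)

def map_rule(p_x):
    """MAP decoding: F(y) = argmax_x p(x) * p(y|x)."""
    return [max(range(2), key=lambda i: p_x[i] * p_y_given_x[i][j])
            for j in range(2)]

# ML decoding: F(y) = argmax_x p(y|x) -- ignores the input distribution
ml = [max(range(2), key=lambda i: p_y_given_x[i][j]) for j in range(2)]

print(ml == map_rule([0.5, 0.5]))   # equiprobable inputs: ML agrees with MAP
print(ml == map_rule([0.9, 0.1]))   # skewed inputs: ML can differ from MAP
```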

Fano's inequality
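For reference, Fano's inequality bounds the equivocation H(X|Y) by the error probability Pe, where M is the number of input symbols:

```latex
% Fano's inequality: the equivocation is bounded by the error probability.
H(X \mid Y) \le H(P_e) + P_e \log (M - 1),
\quad \text{where } H(P_e) = -P_e \log P_e - (1 - P_e) \log (1 - P_e).
```

Intuitively: knowing whether an error occurred costs at most H(Pe) bits, and if an error occurred, identifying the true input among the remaining M − 1 candidates costs at most log(M − 1) bits.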

A suitable coding method can be chosen to reduce the error probability.

Simple repetition coding: each source symbol is repeated several times, so a single channel symbol becomes a channel codeword for transmission.

The decoding rule of "majority decoding" is adopted: the decoded symbol is the one that appears most often in the received codeword.

With simple repetition coding, increasing the number of repetitions n further decreases the average error probability Pe; however, the information transmission rate R also decreases.
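For a binary symmetric channel with crossover probability p (the value below is an assumed example), majority decoding of an n-fold repetition fails only when more than half of the repetitions are flipped, so Pe = Σ_{k > n/2} C(n, k) pᵏ (1 − p)ⁿ⁻ᵏ:

```python
from math import comb

def repetition_pe(n, p):
    """Average error probability of n-fold repetition over a BSC(p)
    with majority decoding (n odd, so there are no ties)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1  # assumed crossover probability, for illustration
for n in (1, 3, 5, 7):
    print(n, repetition_pe(n, p))  # Pe shrinks as n grows
```

For p = 0.1 this gives Pe = 0.1, 0.028, … for n = 1, 3, …, confirming the n↑ ⇒ Pe↓ trend.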

 

M: the number of new source symbols (messages) after simple repetition coding; n: the code length (i.e., the number of repetitions).

R: with M new source symbols after simple repetition coding, each symbol carries at most log M of information; since n code symbols are used for transmission, the average amount of information carried per code symbol is R = (log M)/n.

ps: simple repetition coding reduces the average error probability at the expense of a lower information transmission rate R: n↑, Pe↓, R↓.
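The trade-off n↑, Pe↓, R↓ can be made concrete for binary repetition coding (M = 2, so log₂M = 1 bit per codeword) over a BSC; p = 0.1 is an assumed illustrative value:

```python
from math import comb, log2

p = 0.1   # assumed BSC crossover probability, for illustration
M = 2     # binary source: each codeword carries log2(M) = 1 bit

for n in (1, 3, 5, 7):
    # Majority-decoding error probability for n-fold repetition (n odd)
    pe = sum(comb(n, k) * p**k * (1 - p)**(n - k)
             for k in range(n // 2 + 1, n + 1))
    r = log2(M) / n  # information carried per code symbol
    print(f"n={n}  Pe={pe:.5f}  R={r:.3f}")  # n up -> Pe down, R down
```

Each extra repetition buys reliability but dilutes the same log M of information over more channel symbols, which is exactly why R = (log M)/n falls as n grows.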

Shannon's Second Theorem (the noisy-channel coding theorem): as long as the rate R is below the channel capacity C, there exists a code that makes the average error probability arbitrarily small while R remains at that level.
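Stated more formally, with C the channel capacity:

```latex
% Noisy-channel coding theorem: any rate below capacity is achievable
% with vanishing error probability.
R < C \;\Longrightarrow\;
\forall \varepsilon > 0,\ \exists\ \text{a code of rate } R
\text{ with } P_e < \varepsilon .
```

This is what distinguishes it from simple repetition coding: good codes drive Pe down without driving R toward zero.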


Origin blog.csdn.net/yyfloveqcw/article/details/124336530