Information Theory Learning Notes

Coding Rate and Coding Efficiency of Fixed-Length Coding

Coding rate: R' = (l·log r)/N (bits/source symbol), where l is the length of the code sequence, r the size of the code alphabet, and N the length of the source sequence.

l·log r > N·H(S)

The left-hand side is the maximum amount of information that a code symbol sequence of length l can carry.

The right-hand side is the average amount of information carried by a source symbol sequence of length N.

R' ≥ H(S) + ε

To achieve almost distortion-free coding, the coding rate must exceed the source entropy (by an arbitrarily small ε > 0).

Coding efficiency: η = H(S)/R' = H(S) / ((l/N)·log r)

η_max = H(S) / (H(S) + ε)

ε = ((1 − η)/η)·H(S)
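As a numerical check of these formulas, here is a minimal Python sketch. The four-symbol source and its probabilities are hypothetical, chosen only for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy H(S) in bits per source symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol source (probabilities are an assumption):
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)                    # 1.75 bits/source symbol

# Fixed-length binary coding (r = 2) of single symbols (N = 1):
# the smallest integer l with l*log2(r) > N*H(S).
r, N = 2, 1
l = math.ceil(N * H / math.log2(r))   # 2 code symbols per source symbol
R_prime = l * math.log2(r) / N        # coding rate R' = 2.0 bits/source symbol
eta = H / R_prime                     # coding efficiency = 0.875
print(H, l, R_prime, eta)
```

Even for this simple source, rounding l up to an integer leaves a 12.5% efficiency loss, which motivates the discussion below.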

Obviously, when the decoding error probability is required to be no more than δ, the source sequence length N must satisfy: N ≥ σ²(S)/(ε²·δ), where σ²(S) is the variance of the self-information of the source symbols.

In practice, however, the N required for distortion-free fixed-length coding is extremely large.
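A minimal sketch evaluating the standard bound N ≥ σ²(S)/(ε²·δ), where σ²(S) is the variance of the self-information; the source probabilities, ε, and δ here are illustrative assumptions:

```python
import math

def self_info_variance(probs):
    """Variance sigma^2(S) of the self-information -log2 p(s), in bits^2."""
    H = -sum(p * math.log2(p) for p in probs if p > 0)
    return sum(p * (-math.log2(p) - H) ** 2 for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical source (an assumption)
eps, delta = 0.01, 1e-6             # allowed rate excess and decoding error prob.
N_min = self_info_variance(probs) / (eps ** 2 * delta)
print(f"N must be at least {N_min:.2e}")   # on the order of 1e+09 source symbols
```

Billions of source symbols per block are clearly impractical, which is exactly why variable-length codes are introduced next.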

For the above reasons, a different kind of coding, the variable-length code, is introduced.

Variable-Length Codes

Variable-length codes can achieve highly efficient, distortion-free source coding even when N is not very large.

Requirement for unique decodability: the code must be non-singular, and every N-fold extension of it must also be non-singular.

Requirement for an instantaneous code: each codeword can be decoded as soon as it is received, without examining any subsequent code symbols.

Construction of instantaneous codes: the code-tree method.

Example: suppose the source S has four symbols, S: {s1, s2, s3, s4}, and the code symbol set (the channel input alphabet) is X: {0, 1}. The codewords w1, w2, w3, w4 corresponding to s1, s2, s3, s4 are required to have lengths n1 = 1, n2 = 2, n3 = 3, n4 = 4 respectively. Construct an instantaneous code.

Note: the paths from the root to the terminal nodes are all distinct, and intermediate nodes are never assigned as codewords, so no codeword is a prefix (i.e., the start of an extension) of another. Such a code satisfies the prefix condition (it is a non-extendable code).
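The tree construction in this example yields, for instance, the codewords 0, 10, 110, 1110. A small Python sketch can verify the prefix condition and demonstrate instantaneous decoding (symbol names s1…s4 as in the example):

```python
# One instantaneous code with lengths n1=1, n2=2, n3=3, n4=4 read off the
# binary code tree: terminate one branch at each depth, extend the other.
code = {"s1": "0", "s2": "10", "s3": "110", "s4": "1110"}

def is_prefix_free(words):
    """True if no codeword is a prefix of another (the prefix condition)."""
    ws = sorted(words)
    return all(not ws[i + 1].startswith(ws[i]) for i in range(len(ws) - 1))

def decode(bits, code):
    """Instantaneous decoding: emit a symbol the moment a codeword completes."""
    inv = {w: s for s, w in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inv:
            out.append(inv[buf])
            buf = ""
    return out

print(is_prefix_free(code.values()))   # True
print(decode("0101101110", code))      # ['s1', 's2', 's3', 's4']
```

Because the code is prefix-free, the decoder never needs to look ahead: each codeword is recognized at its last symbol.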

Kraft's Inequality

A necessary and sufficient condition for the existence of an instantaneous code with codeword lengths n1, n2, …, nq over a code alphabet of size r:

Σ_{i=1}^{q} r^(−n_i) ≤ 1

Note: if the codeword lengths do not satisfy Kraft's inequality, the code cannot be uniquely decodable; if they do satisfy it, some instantaneous code with those lengths exists, but a particular code with those lengths is not necessarily uniquely decodable.
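A sketch of the Kraft check for the lengths used above, plus a length set that fails it; the code {0, 00} in the final comment is an illustrative example of the note's caveat:

```python
def kraft_sum(lengths, r=2):
    """Kraft sum: sum over codewords of r^(-n_i)."""
    return sum(r ** -n for n in lengths)

# The example lengths over a binary alphabet satisfy the inequality,
# so an instantaneous code with these lengths exists:
print(kraft_sum([1, 2, 3, 4]))   # 0.9375 <= 1

# Lengths 1, 1, 2 violate it: no uniquely decodable binary code exists.
print(kraft_sum([1, 1, 2]))      # 1.25 > 1

# Caveat from the note: lengths 1 and 2 satisfy Kraft (0.75 <= 1), yet the
# particular code {0, 00} with those lengths is not uniquely decodable,
# since "00" parses as either s2 or s1 s1.
```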

Variable-Length Coding Theorem (Shannon's First Theorem)

Average code length: L̄ = Σ p(s_i)·n_i (code symbols/source symbol)

Code rate: R = H(S)/L̄ (bits/code symbol)

R reflects how effectively the code transmits information: the larger R is, the higher the effectiveness.
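A minimal sketch computing L̄ and R for a hypothetical source matched to the prefix code {0, 10, 110, 111}; the probabilities are assumptions chosen so that the code turns out optimal:

```python
import math

def entropy(probs):
    """Shannon entropy H(S) in bits per source symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source matched to the prefix code 0, 10, 110, 111
# (probabilities are an assumption):
probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

L = sum(p * n for p, n in zip(probs, lengths))   # average length, 1.75
R = entropy(probs) / L                           # code rate, 1.0 bit/code symbol
print(L, R)   # R reaches log2(2) = 1, the maximum for a binary code alphabet
```

Because each probability here is exactly 2^(−n_i), the code rate attains its maximum and the code is compact for this source.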

Information transmitted per second over the channel after coding: R_t = H(S)/(t·L̄) (bits/second), where t is the duration of one code symbol.

Obviously, the shorter L̄ is, the larger R_t is and the higher the information transmission efficiency.

Compact code: the uniquely decodable code with the shortest average length; finding one is the goal of variable-length source coding.

Average code length bound theorem: H(S)/log r ≤ L̄ < H(S)/log r + 1

The left equality holds if and only if p(s_i) = r^(−n_i) for every i.

After N-fold extension: H(S)/log r ≤ L̄_N/N < H(S)/log r + 1/N, so the average length per source symbol approaches H(S)/log r as N → ∞.
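The effect of the N-fold extension can be sketched numerically. Shannon code lengths n_i = ceil(−log2 p_i) always satisfy Kraft's inequality, so an instantaneous code with those lengths exists; coding blocks of N symbols of a hypothetical binary source keeps the per-symbol length within 1/N of the entropy:

```python
import math
from itertools import product

def shannon_lengths_avg(probs, N):
    """Average code length per source symbol when the N-fold extension
    is assigned Shannon code lengths ceil(-log2 p); these lengths always
    satisfy Kraft's inequality, so an instantaneous code exists."""
    ext = [math.prod(block) for block in product(probs, repeat=N)]
    L_N = sum(p * math.ceil(-math.log2(p)) for p in ext)
    return L_N / N

probs = [0.8, 0.2]                            # hypothetical binary source
H = -sum(p * math.log2(p) for p in probs)     # ~0.722 bits/source symbol
for N in (1, 2, 3, 4):
    avg = shannon_lengths_avg(probs, N)
    # Each value lies in [H, H + 1/N): the guaranteed gap shrinks as N grows.
    print(N, round(avg, 4), "<", round(H + 1 / N, 4))
```

The per-symbol average is not monotone in N, but the theorem's upper bound H + 1/N squeezes it toward the entropy, which is the content of Shannon's first theorem.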

This can be generalized to stationary ergodic sources with memory (such as Markov sources),

in which case H(S) is replaced by H_∞, the limit entropy of the source with memory.



Origin blog.csdn.net/yyfloveqcw/article/details/124290976