Information Theory and Coding: A Summary of Distortion-Free (Lossless) Source Coding

Contents

Foreword:

One. The concept of source coding (basic concept, classification, requirements, purpose)

1. Requirements for source coding:

2. Classification

3. The purpose of source coding: reduce redundancy and improve efficiency

Two. Fixed-length codes

1. The condition for a source to have a uniquely decodable fixed-length code

2. Fixed-length coding theorem (emphasis: statement, meaning, formula, calculation)

Four. Variable-length codes (emphasis: characteristics, constructing instant codes with the tree-diagram method, proof of the Kraft inequality, statement and meaning of the variable-length coding theorem)

1. Characteristics

2. Requirements

3. Kraft inequality

4. Variable-length coding theorem

5. Variable-length distortion-free source coding theorem (Shannon's first theorem) (emphasis: statement, meaning)

Five. Huffman coding (emphasis: coding of simple sources and N-fold extended sources, calculation)

1. Binary Huffman coding

2. r-ary Huffman coding

Six. Shannon coding, Fano coding (emphasis: coding procedure, characteristics)


Foreword:

Four basic concepts:

One diagram: the Venn diagram (the diagram relating the entropies and the mutual information)

One basic principle: the process of obtaining information is the process of reducing uncertainty

One. The concept of source coding (basic concept, classification, requirements, purpose)

1. Requirements for source coding:

① Non-singularity: a block code being non-singular is a necessary condition for correct decoding, but not a sufficient one.

② Unique decodability:

Fixed-length code: it is enough that the code is non-singular.

Variable-length code: the code must be non-singular, and its N-fold extension must also be non-singular.

③ Instant code (prefix code): no codeword is the prefix of any other codeword, so each codeword can be decoded as soon as its last symbol arrives (a quick prefix-condition check is sketched below).
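
As a quick illustration (my addition, not part of the original notes), a minimal Python sketch that checks whether a candidate code satisfies the prefix condition; the example codewords are made up for demonstration.

```python
def is_instant_code(codewords):
    """Return True if no codeword is a prefix of another codeword (prefix condition)."""
    for i, a in enumerate(codewords):
        for j, b in enumerate(codewords):
            if i != j and b.startswith(a):
                return False
    return True

# Hypothetical codes over the binary code alphabet {0, 1}.
print(is_instant_code(["0", "10", "110", "111"]))  # True  -> instant code
print(is_instant_code(["0", "01", "011"]))         # False -> "0" is a prefix of "01"
```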

2. Classification

3. The purpose of source coding: reduce redundancy and improve efficiency

① Remove correlation: make the code symbols in the encoded sequence as statistically independent of one another as possible.

② Make the probabilities of occurrence of the code symbols as equal as possible after encoding.

Two. Fixed-length codes

1. The condition for a source to have a uniquely decodable fixed-length code

Simple source: $q \le r^{l}$, i.e. $l \log r \ge \log q$ (q source symbols, codewords of length $l$ over $r$ code symbols).

N-fold extended source: $q^{N} \le r^{l}$, i.e. $\frac{l}{N}\log r \ge \log q$.
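
A small worked sketch (my addition; the numbers are illustrative): the smallest fixed code length per block follows directly from the condition above.

```python
def min_fixed_length(q, r, N=1):
    """Smallest codeword length l with r**l >= q**N, so every block of N source
    symbols gets its own codeword (the unique-decodability condition above)."""
    l, needed = 0, q ** N
    while r ** l < needed:
        l += 1
    return l

# Hypothetical source with q = 4 symbols and a binary code alphabet (r = 2).
print(min_fixed_length(4, 2))        # 2 code symbols per source symbol
print(min_fixed_length(4, 2, N=3))   # 6 code symbols per block of 3 source symbols
```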

2. Fixed-length coding theorem (emphasis: statement, meaning, formula, calculation)

For the N-fold extension of a discrete memoryless source with entropy H(S), encode each block of N source symbols into a codeword of length $l$ over a code alphabet of $r$ symbols.

If $\frac{l}{N}\log r \ge H(S) + \varepsilon$ (for any fixed $\varepsilon > 0$),

then when N is large enough, the probability of decoding error can be made smaller than any given $\delta > 0$.

If $\frac{l}{N}\log r \le H(S) - 2\varepsilon$,

then when N is large enough, the probability of decoding error tends to 1.

When the source has memory, H(S) is replaced by the limit entropy $H_{\infty}$.

$P_{e}$: the probability that a source sequence outside the set of sequences assigned codewords (the high-probability set) appears, i.e. the probability of decoding error.

Coding rate: $K' = \frac{l}{N}\log r$.

It indicates the maximum amount of information that each source symbol can carry on average after encoding.

Coding efficiency: $\eta = \frac{H(S)}{K'}$.
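
A short numeric sketch (my addition; the source probabilities are made up) showing how the coding rate and efficiency are computed for a fixed-length binary code:

```python
import math

# Hypothetical discrete memoryless source (probabilities chosen for illustration).
p = [0.5, 0.25, 0.125, 0.125]
H = -sum(pi * math.log2(pi) for pi in p)   # source entropy H(S) in bits/symbol

# Fixed-length binary coding of single symbols (N = 1, r = 2): l = 2 code symbols suffice.
l, N, r = 2, 1, 2
K = (l / N) * math.log2(r)                 # coding rate, bits per source symbol
eta = H / K                                # coding efficiency

print(f"H(S) = {H:.3f} bits/symbol, K' = {K:.1f}, efficiency = {eta:.3f}")
# H(S) = 1.750 bits/symbol, K' = 2.0, efficiency = 0.875
```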

Four. Variable-length codes (emphasis: characteristics, constructing instant codes with the tree-diagram method, proof of the Kraft inequality, statement and meaning of the variable-length coding theorem)

1. Characteristics

Even when N is not large, a highly efficient distortion-free source code can be constructed.

2. Requirements

Uniquely decodable: the code must be non-singular, and its N-fold extension must also be non-singular.

Instant code: method of constructing an instant code (the tree-diagram method).

3. Kraft inequality: $\sum_{i=1}^{q} r^{-l_{i}} \le 1$, where q is the number of source symbols, r the number of code symbols, and $l_{i}$ the length of the i-th codeword.

Note: when constructing an instant code with required lengths (using the tree-diagram method), first check whether the Kraft inequality is satisfied (a small check-and-construct sketch is given after the proof item below).

Meaning: it states the condition that the number of source symbols, the number of code symbols, and the codeword lengths must satisfy for an instant code to exist.

Proof: ① sufficiency ② necessity (the proof of the Kraft inequality must be mastered).
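
A minimal sketch (my addition) that evaluates the Kraft sum and, when the inequality holds, builds one possible binary instant code by assigning codewords in order of increasing length, mimicking the tree-diagram construction:

```python
def kraft_sum(lengths, r=2):
    """Left-hand side of the Kraft inequality: sum over i of r**(-l_i)."""
    return sum(r ** (-l) for l in lengths)

def build_binary_instant_code(lengths):
    """Assign binary codewords of the given lengths so that none is a prefix of
    another (the tree-diagram idea: walk down the code tree, left to right).
    Possible exactly when the Kraft inequality holds for r = 2."""
    assert kraft_sum(lengths, 2) <= 1, "Kraft inequality violated: no instant code exists"
    codewords, node, prev_len = [], 0, 0
    for l in sorted(lengths):
        node <<= (l - prev_len)                # descend to depth l in the binary tree
        codewords.append(format(node, f"0{l}b"))
        node += 1                              # step to the next unused node at depth l
        prev_len = l
    return codewords

print(kraft_sum([1, 2, 3, 3]))                 # 1.0 -> the inequality holds (with equality)
print(build_binary_instant_code([1, 2, 3, 3])) # ['0', '10', '110', '111']
```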

4. Variable-length coding theorem

Average code length: $\bar{L} = \sum_{i=1}^{q} p(s_{i})\, l_{i}$

Unit: code symbols/source symbol. It indicates the average number of code symbols needed for each source symbol.

Code rate: $R = \frac{H(S)}{\bar{L}}$, the average amount of information carried by each code symbol.

Information transmission rate of the channel after encoding (the amount of information transmitted per second): $R_{t} = \frac{H(S)}{t\,\bar{L}}$, where t is the duration of one code symbol.

Theorem on the average code length: if the entropy of a discrete memoryless source is H(S), then there must exist a coding method giving a uniquely decodable code whose average code length satisfies $\frac{H(S)}{\log r} \le \bar{L} < \frac{H(S)}{\log r} + 1$.
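
A short numeric sketch (my addition; the probabilities and codeword lengths are illustrative) computing the average code length and code rate and checking the bound above for a binary code:

```python
import math

# Hypothetical source and a binary instant code for it.
p       = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]                 # codeword lengths of '0', '10', '110', '111'

H    = -sum(pi * math.log2(pi) for pi in p)          # H(S) in bits/source symbol
Lbar = sum(pi * li for pi, li in zip(p, lengths))    # average code length
R    = H / Lbar                                      # bits carried per code symbol

print(f"H(S) = {H:.3f}, Lbar = {Lbar:.3f}, R = {R:.3f}")
# H(S) = 1.750, Lbar = 1.750, R = 1.000  -> the bound H(S) <= Lbar < H(S) + 1 holds (r = 2)
```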

5. Variable-length distortion-free source coding theorem (Shannon's first theorem) (emphasis: statement, meaning)

Statement: for a discrete memoryless source with entropy H(S), encode its N-fold extension with a uniquely decodable code over r code symbols; then for any $\varepsilon > 0$, as long as N is large enough, the average code length per source symbol satisfies $\frac{H(S)}{\log r} \le \frac{\bar{L}_{N}}{N} < \frac{H(S)}{\log r} + \varepsilon$.

Coding rate: $K' = \frac{\bar{L}_{N}}{N}\log r$. It indicates the maximum amount of information that each source symbol can carry on average after coding.

Coding efficiency: $\eta = \frac{H(S)}{K'}$

Code redundancy: $\gamma = 1 - \eta$
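
A numeric sketch (my addition; the source and codes are made up) showing how the efficiency rises and the redundancy falls when longer source blocks are encoded, which is the practical content of Shannon's first theorem:

```python
import math

# Hypothetical binary source: p(a) = 0.8, p(b) = 0.2.
H = -(0.8 * math.log2(0.8) + 0.2 * math.log2(0.2))            # ≈ 0.722 bits/symbol

# N = 1: code a -> '0', b -> '1', so the average length is 1 code symbol per source symbol.
eta1 = H / 1.0

# N = 2: one binary Huffman code for {aa, ab, ba, bb} has lengths 1, 2, 3, 3,
# giving 1.56 code symbols per pair on average, i.e. 0.78 per source symbol.
Lbar2_per_symbol = (0.64 * 1 + 0.16 * 2 + 0.16 * 3 + 0.04 * 3) / 2
eta2 = H / Lbar2_per_symbol

print(f"eta(N=1) = {eta1:.3f}, redundancy = {1 - eta1:.3f}")  # ≈ 0.722 / 0.278
print(f"eta(N=2) = {eta2:.3f}, redundancy = {1 - eta2:.3f}")  # ≈ 0.926 / 0.074
```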

Five. Huffman coding (emphasis: coding of simple sources and N-fold extended sources, calculation)

1. Binary Huffman coding

The average code length $\bar{L}$ is the minimum among all uniquely decodable codes for the source (a Huffman code is a compact, i.e. optimal, code). A sketch of the construction follows.
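
A compact binary Huffman coder sketch (my addition, using Python's standard heapq; the symbol names and probabilities are illustrative):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Binary Huffman code: probs is a dict {symbol: probability}.
    Returns {symbol: codeword}. Ties are broken arbitrarily, so several
    equally optimal codes are possible."""
    tiebreak = count()                      # avoids comparing dicts when probabilities tie
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)     # merge the two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (one optimal code)
```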

2. r-ary Huffman coding:

Source adjustment: the number of source symbols must satisfy $q = (r-1)\theta + r$, where θ, the number of source reductions, is a positive integer; if the actual q does not satisfy this, dummy symbols with probability 0 are added until it does.
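
A small helper sketch (my addition; the function name is mine) computing how many zero-probability dummy symbols must be added before an r-ary Huffman reduction can be carried out:

```python
def dummy_symbols_needed(q, r):
    """Zero-probability symbols to add so that the padded symbol count q' satisfies
    q' = (r - 1) * theta + r, i.e. q' - r is a multiple of r - 1."""
    return (r - q) % (r - 1)

print(dummy_symbols_needed(8, 3))  # 1 -> pad 8 symbols to 9 for a ternary Huffman code
print(dummy_symbols_needed(6, 4))  # 1 -> pad 6 symbols to 7 for a quaternary code
print(dummy_symbols_needed(5, 2))  # 0 -> a binary Huffman code never needs padding
```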

Six. Shannon coding, Fano coding (emphasis: coding procedure, characteristics)
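
A sketch of the Shannon coding procedure (my addition; the probabilities are illustrative): sort the symbols by decreasing probability, give symbol i the length $l_{i} = \lceil -\log_2 p_{i} \rceil$, and take the first $l_{i}$ bits of the binary expansion of the cumulative probability.

```python
import math

def shannon_code(probs):
    """Shannon coding: probs is {symbol: probability}; returns {symbol: codeword}."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])   # decreasing probability
    code, cum = {}, 0.0
    for sym, p in items:
        l = math.ceil(-math.log2(p))        # codeword length l_i
        bits, frac = "", cum
        for _ in range(l):                  # first l bits of the cumulative probability
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        code[sym] = bits
        cum += p
    return code

print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```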


Source: blog.csdn.net/yyfloveqcw/article/details/124393041