EXIT Charts for LDPC Codes

Introduction to LDPC Codes

  • An LDPC code is a linear block code whose decoder can be implemented efficiently in hardware, and which approaches channel capacity in many data-transmission and data-storage applications.

  • An LDPC code can be represented either by its parity-check matrix H or by its Tanner graph.

    The Tanner graph of an LDPC code plays a role similar to the trellis diagram of a convolutional code: it provides another complete representation of the code, and this representation is convenient for describing decoding algorithms. A Tanner graph is a bipartite graph, that is, its nodes fall into two classes and every edge connects nodes of different classes. The two classes of nodes in the Tanner graph are called variable nodes (VN) and check nodes (CN). The Tanner graph of a code is obtained as follows: whenever the element h_ij of H equals 1, the i-th check node (CN i) is connected to the j-th variable node (VN j). Under this rule the Tanner graph has m check nodes and n variable nodes; each check node corresponds to one parity-check equation, and each variable node corresponds to one coded bit. In other words, the m rows of H specify the connections of the m check nodes, and the n columns of H specify the connections of the n variable nodes. Accordingly, the n variable nodes correspond exactly to the n code bits of a codeword. (A small code sketch after this list shows how to build the graph adjacency from H.)

    • The Tanner graph serves as a blueprint for the iterative decoder: each node in the graph corresponds to a processor that performs a local computation, and each edge corresponds to a bus that carries messages from a given node to the nodes connected to it.
    • The messages passed on the Tanner graph are generally probabilistic quantities, such as the log-likelihood ratios (LLRs) associated with the bit values assigned to the variable nodes. The decoder is initialized by delivering the n channel log-likelihood ratios to the n VN processors. In the iterative decoding algorithm, during the first half of each iteration every VN processor takes the channel message and the messages from its adjacent check nodes as inputs, computes results from these inputs, and passes the results as inputs to all of its adjacent CN processors; during the second half of each iteration, every CN processor takes the messages from its neighboring VNs as inputs, computes results from these inputs, and passes the results as inputs back to all of its neighboring VN processors.
    • The iterations between variable nodes and check nodes continue until a codeword is successfully decoded or a preset maximum number of iterations is reached.
  • A cycle in the Tanner graph is a closed path, and the length of a cycle equals its number of edges; a cycle of length l is commonly called an l-cycle. For a given graph, the minimum cycle length is called the girth of the graph.
    [Figure: Tanner graph of an example LDPC code]
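As a toy illustration of the H-to-Tanner-graph correspondence just described, the sketch below (not from the original article; the small matrix H and the variable names are assumptions) builds the check-node and variable-node adjacency lists from a parity-check matrix:

```python
# Minimal sketch: Tanner-graph adjacency lists derived from a parity-check matrix H.
# The small H below is an arbitrary toy example, not a real LDPC parity-check matrix.
import numpy as np

H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

m, n = H.shape  # m check nodes (rows of H), n variable nodes (columns of H)

# CN i is connected to VN j exactly when H[i, j] == 1
cn_neighbors = [np.flatnonzero(H[i, :]).tolist() for i in range(m)]  # VNs attached to each CN
vn_neighbors = [np.flatnonzero(H[:, j]).tolist() for j in range(n)]  # CNs attached to each VN

print("CN adjacency:", cn_neighbors)  # row i of H    -> parity-check equation i
print("VN adjacency:", vn_neighbors)  # column j of H -> coded bit j
```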

Introducing extrinsic information: the soldier-counting example

[Figure: soldier-counting example]
This message-passing rule leads to the concept of extrinsic information. The main idea is that a soldier never passes back to a neighboring soldier information that already came from that neighbor; only extrinsic information is passed. For precisely this reason, the total information that soldier X passes to soldier Y excludes what soldier X received from soldier Y itself. The information that soldier X passes to soldier Y is called extrinsic information and is computed as follows:
$$I_{X \to Y} = I_X + \sum_{Z \in N(X) \setminus \{Y\}} I_{Z \to X}$$
where N(X) is the set of all soldiers adjacent to soldier X, and I_{X→Y} denotes the extrinsic information passed from soldier X to soldier Y (I_{Z→X} and I_{Y→X} have analogous meanings). I_X is sometimes called the intrinsic information; it is soldier X's count of himself, so in this example I_X = 1.
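A small sketch of this rule is given below; the tree of soldiers, the fixed number of rounds, and the variable names are illustrative assumptions, but it shows that passing only extrinsic information lets every soldier recover the total head count:

```python
# Minimal sketch: extrinsic-information message passing for soldier counting on a tree.
# Each soldier contributes intrinsic information I_x = 1 (himself) and forwards to a
# neighbor y only information that did NOT come from y.
from collections import defaultdict

edges = [(0, 1), (1, 2), (1, 3), (3, 4)]   # assumed toy formation of 5 soldiers
neighbors = defaultdict(list)
for a, b in edges:
    neighbors[a].append(b)
    neighbors[b].append(a)

n_soldiers = 5
msg = {}  # msg[(x, y)] = information soldier x currently sends to soldier y

# On a cycle-free graph, n_soldiers rounds are enough for all messages to settle.
for _ in range(n_soldiers):
    new_msg = {}
    for x in range(n_soldiers):
        for y in neighbors[x]:
            # I_{x->y} = I_x + sum of messages from all neighbors of x except y
            new_msg[(x, y)] = 1 + sum(msg.get((z, x), 0)
                                      for z in neighbors[x] if z != y)
    msg = new_msg

# Every soldier can now recover the total head count locally.
for x in range(n_soldiers):
    total = 1 + sum(msg[(z, x)] for z in neighbors[x])
    print(f"soldier {x} computes total = {total}")  # prints 5 for every soldier
```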

Message passing algorithm

  • An LDPC code can be viewed as a set of single-parity-check (SPC) codes concatenated, through an interleaver, with a set of repetition (REP) codes. The SPC codes act as the outer codes, i.e., they are not connected to the channel; only the variable-node (REP) side has hanging edges connected to the channel.
    [Figure: an LDPC code viewed as REP (VN) codes and SPC (CN) codes concatenated through an interleaver]
    [Figure: left, the REP (VN) decoder update; right, the SPC (CN) decoder update]

The left side of the figure depicts the REP (VN) decoder. Note that a VN decoder receives LLRs both from its adjacent check nodes and from the channel. However, when computing the extrinsic message L_{j→i}, VN j does not use the message received from CN i, since that contribution would be subtracted out in any case.
The right side of the figure depicts the SPC (CN) decoder. Similar to the VN case, when computing L_{i→j}, CN i does not use the message L_{j→i} received from VN j, because it would be subtracted out in any case. The CN and VN decoders cooperate in an iterative fashion to produce the bit estimates; a sketch of the resulting decoder is given below.
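The following sketch shows one possible LLR-domain sum-product decoder with a flooding schedule, matching the VN/CN update rules just described. The function name, the clipping constant, and the sign convention (a positive LLR favors bit 0) are assumptions for illustration, not the article's own code:

```python
# Minimal sketch: LLR-domain sum-product (belief-propagation) decoding with a
# flooding schedule. Loops are kept explicit for clarity rather than speed.
import numpy as np

def spa_decode(H, llr_ch, max_iter=50):
    m, n = H.shape
    rows = [np.flatnonzero(H[i, :]) for i in range(m)]   # VNs attached to CN i
    cols = [np.flatnonzero(H[:, j]) for j in range(n)]   # CNs attached to VN j
    L_v2c = np.zeros((m, n))   # message from VN j to CN i
    L_c2v = np.zeros((m, n))   # message from CN i to VN j
    for i in range(m):
        L_v2c[i, rows[i]] = llr_ch[rows[i]]              # initialize with channel LLRs

    for _ in range(max_iter):
        # Check-node update: tanh rule, leaving out the target edge.
        for i in range(m):
            t = np.tanh(L_v2c[i, rows[i]] / 2.0)
            for k, j in enumerate(rows[i]):
                prod = np.prod(np.delete(t, k))
                L_c2v[i, j] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        # Variable-node update: channel LLR plus all incoming CN messages except the target.
        for j in range(n):
            total_j = llr_ch[j] + L_c2v[cols[j], j].sum()
            for i in cols[j]:
                L_v2c[i, j] = total_j - L_c2v[i, j]      # extrinsic: subtract L_{i->j}
        # Tentative hard decision and syndrome check (stopping criterion).
        L_total = llr_ch + np.array([L_c2v[cols[j], j].sum() for j in range(n)])
        x_hat = (L_total < 0).astype(int)                # positive LLR -> bit 0
        if not np.any((H @ x_hat) % 2):
            break
    return x_hat, L_total
```

For a BPSK-modulated AWGN observation y with noise variance sigma^2, the channel LLRs would be llr_ch = 2*y/sigma**2, and the decoder can then be run as spa_decode(H, llr_ch) on, for example, the toy H from the earlier sketch.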

  • After a preset maximum number of VN/CN iteration rounds has been reached, or some other stopping criterion is satisfied, the decoder makes the bit decisions (estimates) from the accumulated LLR values. When the girth of the graph is large, these estimates are accurate and the decoder can approach optimal (MAP) performance.

  • The derivation of the SPA rests on the independence assumption: the LLRs that a node receives from its neighbors are mutually independent. Clearly, once the number of iterations exceeds half the girth of the Tanner graph, this independence assumption breaks down (a quick check for the shortest, length-4 cycles is sketched below).
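Since the girth determines how many iterations respect the independence assumption, it is common to check that a parity-check matrix contains no length-4 cycles (girth of at least 6). A minimal sketch of such a check, using the fact that a 4-cycle exists exactly when two check nodes (two rows of H) share more than one variable node; the function name and the reuse of the toy H are assumptions:

```python
# Minimal sketch: detect 4-cycles in the Tanner graph of H. Two check nodes that
# share two (or more) variable nodes close a cycle of length 4.
import numpy as np

def has_4_cycle(H):
    overlap = H @ H.T                     # overlap[i, k] = number of VNs shared by CN i and CN k
    np.fill_diagonal(overlap, 0)          # ignore a check node's overlap with itself
    return bool(np.any(overlap > 1))      # any pair sharing >= 2 VNs closes a 4-cycle

H = np.array([[1, 1, 0, 1, 1, 0, 0],      # same toy matrix as in the earlier sketch
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
print("4-cycle present:", has_4_cycle(H))
```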

The decoding threshold of LDPC codes

  • Decoding of long codes exhibits a threshold effect: the threshold divides the channel parameter space (for example, the signal-to-noise ratio) into a region of reliable communication and a region of unreliable communication. The decoding threshold of an LDPC code ensemble can be determined with the density evolution algorithm. Although the discussion here focuses on the binary-input AWGN channel, the method applies to a variety of binary-input, symmetric-output channels.
  • The symmetric-output property of a binary-input AWGN channel means that the channel transition probability density satisfies p(y | x = +1) = p(-y | x = -1); the binary symmetric channel and other such channels have analogous relations. An iterative decoder that uses the sum-product algorithm requires the channel output to satisfy this symmetry condition, which is not described in detail here.
  • Suppose the all-zero codeword c = [0 0 ... 0] is sent; with the mapping x = (-1)^c, the transmitted sequence is the all-ones sequence x = [+1 +1 ... +1]. Density evolution is used as the tool to determine the decoding threshold of a (d_v, d_c)-regular LDPC code. Since the transmitted sequence is x = [+1 +1 ... +1], a decoding error occurs if, after the maximum number of iterations, the cumulative log-likelihood ratio L_total at any variable node is negative.
  • For a channel and decoder satisfying the symmetry conditions, the outgoing messages of all variable nodes share the same probability density function. Then no decoding error occurs if, after an infinite number of iterations, the probability that L_total at any variable node is negative equals zero, i.e., if the following holds:
    $$\lim_{\ell \to \infty} \Pr\{ L_{\text{total}}^{(\ell)} < 0 \} = 0$$
    This probability depends on the channel parameter a. For example, for the BSC, a is the crossover probability ε; for the AWGN channel, a is the channel noise standard deviation σ. In the limit of code length n → ∞, the decoding threshold a* is defined as
    $$a^{*} = \sup\Big\{ a : \lim_{\ell \to \infty} \Pr\{ L_{\text{total}}^{(\ell)} < 0 \} = 0 \Big\}$$
    For the AWGN channel, the decoder threshold corresponds to a threshold signal-to-noise ratio; a Monte Carlo sketch of this procedure follows.
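The threshold can be illustrated with a Monte Carlo (population-dynamics) version of density evolution for the (3, 6)-regular ensemble on the binary-input AWGN channel. This is a sketch under the all-zero-codeword assumption stated above; the sample size, iteration count, and the sigma values tried are illustrative choices, not taken from the article:

```python
# Minimal sketch: Monte Carlo density evolution for a (dv, dc) = (3, 6) regular
# ensemble on the BI-AWGN channel, assuming the all-zero codeword (all +1 sent).
# The fraction of negative messages approximates the residual error probability;
# the threshold is roughly the largest noise std sigma for which it is driven to
# (near) zero.
import numpy as np

def mc_density_evolution(sigma, dv=3, dc=6, n_samples=200_000, n_iter=200):
    rng = np.random.default_rng(0)
    mu_ch = 2.0 / sigma**2                           # channel LLR ~ N(2/sigma^2, 4/sigma^2)
    llr_ch = rng.normal(mu_ch, 2.0 / sigma, n_samples)
    v2c = llr_ch.copy()                              # iteration-0 VN-to-CN messages
    for _ in range(n_iter):
        # CN update: combine dc-1 independently drawn VN messages (tanh rule).
        t = np.tanh(rng.choice(v2c, (n_samples, dc - 1)) / 2.0)
        c2v = 2.0 * np.arctanh(np.clip(t.prod(axis=1), -0.999999, 0.999999))
        # VN update: channel LLR plus dv-1 independently drawn CN messages.
        v2c = llr_ch + rng.choice(c2v, (n_samples, dv - 1)).sum(axis=1)
    return np.mean(v2c < 0)                          # estimated error probability

# The (3, 6) threshold is known to lie near sigma ~ 0.88 (about 1.1 dB in Eb/N0).
for sigma in (0.82, 0.90):
    print(sigma, mc_density_evolution(sigma))
```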

Gaussian approximation

For the binary-input AWGN channel, density evolution based on a Gaussian approximation is numerically simpler and more stable than quantized density evolution [71]. The idea of the algorithm is to approximate the message probability density functions by Gaussian densities (or Gaussian mixture densities). Since a Gaussian density is completely determined by two parameters, its mean and variance, the (approximate) density evolution only needs to track these two parameters. Under the consistency assumption, the decoding threshold can be determined approximately by tracking only the message means, which simplifies things further. The consistency condition is satisfied if the probability density function p_m of a message m satisfies the following:

$$p_m(m) = p_m(-m)\, e^{m}$$

The channel messages satisfy the consistency condition, and the other messages are assumed to satisfy it approximately as well. For a Gaussian probability density function that satisfies the consistency condition, the mean and variance obey the following relationship:
$$\sigma^2 = 2\mu$$
We call the normal density N(μ, 2μ) a consistent normal density. This means that when the messages satisfy the consistency condition, the Gaussian-approximation density evolution algorithm only needs to compute the means of the messages. Thus, for messages satisfying the consistency condition, Gaussian-approximation density evolution reduces to propagating only the message means, as in the sketch below.
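The sketch below illustrates this mean-only evolution for a (d_v, d_c)-regular ensemble, using the standard formulation φ(x) = 1 - E[tanh(u/2)] with u ~ N(x, 2x); the numerical grid, bisection bounds, and divergence test are illustrative assumptions rather than the article's procedure:

```python
# Minimal sketch: Gaussian-approximation density evolution for a (dv, dc)-regular
# ensemble on the BI-AWGN channel, tracking only the means of consistent Gaussian
# messages N(mu, 2*mu).
import numpy as np

def phi(x):
    """phi(x) = 1 - E[tanh(u/2)] for u ~ N(x, 2x); phi(0) = 1."""
    if x < 1e-12:
        return 1.0
    u = x + np.sqrt(2.0 * x) * np.linspace(-10.0, 10.0, 2001)   # +/- 10 std devs
    pdf = np.exp(-(u - x) ** 2 / (4.0 * x)) / np.sqrt(4.0 * np.pi * x)
    f = np.tanh(u / 2.0) * pdf
    return 1.0 - np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(u))    # trapezoidal rule

def phi_inv(y, lo=1e-10, hi=500.0):
    # phi is monotonically decreasing, so invert it by bisection.
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if phi(mid) > y else (lo, mid)
    return 0.5 * (lo + hi)

def ga_de_converges(sigma, dv=3, dc=6, max_iter=500):
    """True if the mean-evolution recursion diverges, i.e. the error probability -> 0."""
    m_ch = 2.0 / sigma ** 2      # mean of the consistent channel-LLR density N(m, 2m)
    m_c = 0.0                    # CN-to-VN message mean at iteration 0
    for _ in range(max_iter):
        m_v = m_ch + (dv - 1) * m_c                           # VN-to-CN mean
        m_c = phi_inv(1.0 - (1.0 - phi(m_v)) ** (dc - 1))     # CN update under the GA
        if m_c > 50.0:           # mean blowing up: decoding succeeds at this sigma
            return True
    return False

# The exact density-evolution threshold of the (3, 6) ensemble lies near sigma ~ 0.88;
# the Gaussian approximation yields a slightly smaller (more pessimistic) estimate.
for sigma in (0.80, 0.90):
    print(sigma, ga_de_converges(sigma))
```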

EXIT charts for regular LDPC codes

  • The EXIT chart technique is a graphical tool for estimating the decoding thresholds of LDPC code ensembles and turbo code ensembles. It is based on the Gaussian approximation and provides a visual picture of the convergence dynamics of iterative decoding.

  • Idea: the variable-node processing unit (VNP) and the check-node processing unit (CNP) work jointly and iteratively to produce the bit decisions.

    Since a measure of one processing unit's output is at the same time a measure of the neighboring processing unit's input, the two transfer curves can be drawn in the same coordinate system, with the abscissa and ordinate exchanged for one of the processing units.

  • Purpose: to help predict the decoding threshold of a code ensemble with given variable-node and check-node degree distributions.

  • Assumptions: a cycle-free graph, infinite code length, and an unlimited number of decoding iterations.

    In the example of Figure 9.7, the metric used for the transfer curves is the extrinsic mutual information, which is the origin of the name extrinsic information transfer (EXIT) chart.
    [Figure 9.7: EXIT chart of a (3, 6)-regular LDPC code, showing the VNP curve (solid), the CNP curve (dashed), and the decoding trajectory]

  • The top (solid) curve, I_{E,V} versus I_{A,V}, is the extrinsic-information transfer curve of the VNP. It depicts the relationship between the mutual information I_{A,V} corresponding to the a-priori input of the VNP and the mutual information I_{E,V} corresponding to the extrinsic output of the VNP.

  • The bottom (dashed) curve, I_{A,C} versus I_{E,C}, is the transfer curve of the CNP. It depicts the relationship between the mutual information I_{A,C} corresponding to the a-priori input of the CNP and the mutual information I_{E,C} corresponding to the extrinsic output of the CNP (this curve is obtained by first computing I_{E,C} as a function of I_{A,C}, and then drawing it on the EXIT chart with the coordinate axes swapped).

  • Between these two curves lies the decoding trajectory of the iterative sum-product decoder. Note that, since the extrinsic output of the VNP (CNP) is the a-priori input of the CNP (VNP), the decoding trajectory "bounces" between the two curves. The trajectory starts at the point (0, 0) (zero information) and finally converges to the point (1, 1) (one bit of information, error-free). This lets us track the amount of information (in bits) exchanged back and forth between the CNP and the VNP; when the exchanged information approaches 1, the error rate approaches zero.

  • When the channel SNR increases, the top (VNP) curve shifts upward, so the gap between the two curves widens and the decoder converges faster. The decoding-threshold SNR of the (3, 6) ensemble in this figure is approximately (Eb/N0)_EXIT = 1.1 dB. If the SNR falls below this value, the tunnel between the two curves closes, which blocks the trajectory from reaching the zero-error-rate point (1, 1). A sketch of how such EXIT curves can be computed is given below.
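The following sketch computes VNP and CNP transfer curves for a (d_v, d_c)-regular ensemble on the BI-AWGN channel using the standard J-function for consistent Gaussian LLRs and the usual Gaussian-approximation expressions (with the familiar duality approximation for the check-node curve). The numerical grid, the bisection inverse, and the chosen Eb/N0 value are assumptions for illustration:

```python
# Minimal sketch: EXIT curves of a (dv, dc)-regular LDPC ensemble on BI-AWGN.
# J(sig) is the mutual information between a bit and a consistent Gaussian LLR
# with std sig, i.e. LLR ~ N(sig^2/2, sig^2).
import numpy as np

def J(sig):
    if sig < 1e-6:
        return 0.0
    l = sig ** 2 / 2.0 + sig * np.linspace(-10.0, 10.0, 2001)   # +/- 10 std devs
    pdf = np.exp(-(l - sig ** 2 / 2.0) ** 2 / (2.0 * sig ** 2)) / np.sqrt(2.0 * np.pi) / sig
    f = pdf * np.log2(1.0 + np.exp(-l))
    return 1.0 - np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(l))    # trapezoidal rule

def J_inv(I, lo=1e-6, hi=60.0):
    # J is increasing in sig, so invert it by bisection.
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if J(mid) < I else (lo, mid)
    return 0.5 * (lo + hi)

def exit_curves(EbN0_dB, dv=3, dc=6, npts=41):
    R = 1.0 - dv / dc                               # design rate of the regular ensemble
    sigma_n2 = 1.0 / (2.0 * R * 10 ** (EbN0_dB / 10.0))
    sigma_ch = np.sqrt(4.0 / sigma_n2)              # std parameter of the channel LLR
    IA = np.linspace(0.0, 0.999, npts)
    # VNP curve: I_EV = J( sqrt( (dv-1) * J_inv(I_AV)^2 + sigma_ch^2 ) )
    IE_vnp = [J(np.sqrt((dv - 1) * J_inv(a) ** 2 + sigma_ch ** 2)) for a in IA]
    # CNP curve (duality approximation): I_EC = 1 - J( sqrt(dc-1) * J_inv(1 - I_AC) )
    IE_cnp = [1.0 - J(np.sqrt(dc - 1) * J_inv(1.0 - a)) for a in IA]
    return IA, np.array(IE_vnp), np.array(IE_cnp)

# Slightly above the ~1.1 dB threshold quoted above; plotting IE_vnp vs IA and
# IA vs IE_cnp (axes swapped) in one figure reproduces a chart like Figure 9.7.
IA, IE_v, IE_c = exit_curves(1.2)
for a, ev, ec in zip(IA[::10], IE_v[::10], IE_c[::10]):
    print(f"IA={a:.2f}  IE_VNP={ev:.3f}  IE_CNP={ec:.3f}")
```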
