Deep Communication Networks Column (3) | Autoencoders: An Introduction to Deep Learning for the Physical Layer

Paper: https://arxiv.org/pdf/1702.00832.pdf
GitHub: https://github.com/musicbeer/Deep-Learning-for-the-Physical-Layer

Foreword

Deep Communication Networks Column | Autoencoders: This column was originally planned to cover papers from 2018-2019. Although this is a 2017 paper, it was the first to propose the concept of the communication autoencoder. Looking back now, its content is relatively simple to understand, but it is highly cited, and both the autoencoder concept and the RTN network structure it proposes (which combines expert domain knowledge with a NN) are referenced by many later articles, so I am also writing it up as "reading notes" for easy review later. Only the autoencoder-related parts of the paper are recorded here.

Main contributions of the paper

The paper interprets a communication system as an autoencoder, jointly optimizing the transmitter and receiver in a single end-to-end process. It then extends this idea to systems with multiple transmitter-receiver pairs and introduces the concept of RTNs (radio transformer networks), which combine the ML model with expert domain knowledge.

Overview of the paper

SISO

[Figure: end-to-end communication system represented as an autoencoder]
Process:
(1) Transmitter: encodes a message s (k bits, i.e., one of M = 2^k messages) into a representation x (an n-dimensional vector) that is more robust for transmission, and transmits it.
(2) Channel: AWGN is added to x, producing the output y.
(3) Receiver: from y, the receiver computes a probability for each of the M possible messages and reconstructs the message with the highest probability.
[Figure: autoencoder network architecture]
In the neural network structure described above, the input s is one-hot encoded; the channel is a Gaussian channel with fixed noise variance; training uses a cross-entropy loss at an SNR of 7 dB; a normalization layer in the transmitter NN ensures that x satisfies the power constraint; and the receiver NN outputs the message probabilities. The network parameters are as follows:
[Table: autoencoder layer layout and output dimensions]
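The setup above maps almost line-for-line onto code. Below is a minimal PyTorch sketch of the (8,8) autoencoder; the framework choice, hidden-layer widths, optimizer, and learning rate are my assumptions for illustration, not taken from the paper:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

k, n = 8, 8              # (8,8): k = 8 bits sent over n = 8 channel uses
M = 2 ** k               # number of possible messages
R = k / n                # rate in bits per channel use

class Normalize(nn.Module):
    # Normalization layer: rescale each codeword so that ||x||^2 = n,
    # enforcing the transmit energy constraint.
    def forward(self, x):
        return x * (n ** 0.5) / x.norm(dim=1, keepdim=True)

def make_transmitter():
    return nn.Sequential(nn.Linear(M, M), nn.ReLU(),
                         nn.Linear(M, n), Normalize())

def make_receiver():
    return nn.Sequential(nn.Linear(n, M), nn.ReLU(),
                         nn.Linear(M, M))      # logits over the M messages

def awgn(x, ebno_db):
    # Per-component noise std for a given Eb/N0: sigma = (2*R*Eb/N0)^(-1/2)
    sigma = (2 * R * 10 ** (ebno_db / 10)) ** -0.5
    return x + sigma * torch.randn_like(x)

transmitter, receiver = make_transmitter(), make_receiver()
opt = torch.optim.Adam(list(transmitter.parameters())
                       + list(receiver.parameters()), lr=1e-3)

for step in range(10_000):
    s = torch.randint(0, M, (256,))            # random training messages
    x = transmitter(F.one_hot(s, M).float())   # one-hot input -> codeword
    y = awgn(x, ebno_db=7.0)                   # train at SNR = 7 dB
    loss = F.cross_entropy(receiver(y), s)     # cross-entropy loss
    opt.zero_grad(); loss.backward(); opt.step()
```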
The SNR-BER comparison is shown below. The (8,8) autoencoder outperforms uncoded BPSK (8,8), which shows that the jointly trained NN has learned a coded modulation scheme and achieves a coding gain.
[Figure: error rate vs. Eb/N0, autoencoder (8,8) vs. uncoded BPSK (8,8)]
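A curve like this can be reproduced by sweeping Eb/N0 and counting block errors with the trained networks from the sketch above; the trial count and SNR grid below are arbitrary choices:

```python
def block_error_rate(ebno_db, trials=100_000, batch=1_000):
    errors = 0
    with torch.no_grad():
        for _ in range(trials // batch):
            s = torch.randint(0, M, (batch,))
            y = awgn(transmitter(F.one_hot(s, M).float()), ebno_db)
            errors += (receiver(y).argmax(dim=1) != s).sum().item()
    return errors / trials

for ebno in range(-2, 11, 2):
    print(f"Eb/N0 = {ebno:3d} dB   BLER = {block_error_rate(float(ebno)):.5f}")
```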
The learned encodings are as follows:
[Figure: learned codewords/constellations]

Two-user interference channel

Consider the case of two transmitter-receiver pairs sending simultaneously:
[Figure: two-user interference channel with two autoencoders]
Compared with the SISO case, this introduces interference between the two channels.

The neural network processing is:
(1) transmitter NN1 encodes s1 into x1; transmitter NN2 encodes s2 into x2
(2) x1 + x2 + n1 is fed into receiver NN1, which produces an estimate of s1
(3) x1 + x2 + n2 is fed into receiver NN2, which produces an estimate of s2

NN1 and NN2 use the same number of layers and neurons as in the SISO case. The two autoencoders are trained jointly,
with the loss function set to $\tilde{L}=\alpha \tilde{L}_{1}+(1-\alpha) \tilde{L}_{2}$, where the weight $\alpha$ is updated dynamically as $\alpha_{t+1}=\frac{\tilde{L}_{1}\left(\boldsymbol{\theta}_{t}\right)}{\tilde{L}_{1}\left(\boldsymbol{\theta}_{t}\right)+\tilde{L}_{2}\left(\boldsymbol{\theta}_{t}\right)}, \quad t>0$.
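A sketch of one possible joint training step, reusing make_transmitter, make_receiver, and awgn from the SISO sketch above; the batch size, training SNR, and exact update interleaving are assumptions, not taken from the paper:

```python
tx1, rx1 = make_transmitter(), make_receiver()
tx2, rx2 = make_transmitter(), make_receiver()
params = (list(tx1.parameters()) + list(rx1.parameters())
          + list(tx2.parameters()) + list(rx2.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
alpha = 0.5                                    # initial loss weight

for step in range(10_000):
    s1 = torch.randint(0, M, (256,))
    s2 = torch.randint(0, M, (256,))
    x1 = tx1(F.one_hot(s1, M).float())
    x2 = tx2(F.one_hot(s2, M).float())
    y1 = awgn(x1 + x2, ebno_db=7.0)            # receiver 1 sees x1 + x2 + n1
    y2 = awgn(x1 + x2, ebno_db=7.0)            # receiver 2 sees x1 + x2 + n2
    l1 = F.cross_entropy(rx1(y1), s1)
    l2 = F.cross_entropy(rx2(y2), s2)
    loss = alpha * l1 + (1 - alpha) * l2       # L = a*L1 + (1-a)*L2
    opt.zero_grad(); loss.backward(); opt.step()
    # alpha_{t+1} = L1 / (L1 + L2): the user with the larger current
    # loss receives more weight in the next minibatch.
    alpha = (l1 / (l1 + l2)).item()
```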

The BER-SNR curves are as follows:
[Figure: BER vs. Eb/N0 for the two-user autoencoders and the baselines]

The reference curves for comparison are $2^{2 k / n}$-QAM + time-sharing.

The constellation diagrams are as follows:
[Figure: learned constellations for the two-user case, panels (a)-(d)]

(a) performance equal to BPSK
(b) performance equal to 4-QAM
(c) performance 0.7 dB better than 4-QAM
(d) performance 1 dB better than 16-QAM

Given how the comparison curves were chosen, these results are to be expected: with (4,4) or (4,8), the comparison curve effectively uses repetition coding, which cannot fully exploit the available degrees of freedom, so the neural network can achieve better performance.

Introducing the RTN

[Figure: receiver structure with a radio transformer network]
The structure above adds a stage in front of the previous receiver: the input y first passes through a parameter-estimation network that estimates useful parameters (e.g., frequency offset, symbol timing, impulse response); these estimated parameters are then used to correct the received signal through a deterministic transform; finally, the corrected signal enters the discriminative network. This structure combines the NN with well-known traditional algorithms, integrating the neural network with expert domain knowledge to achieve a lower BER and faster training.
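A minimal sketch of what such a receiver could look like, assuming a single complex phase offset as the expert-knowledge parameter (an illustrative choice; the paper's examples also cover symbol timing and impulse response). Here a block of n complex samples is stored as 2n interleaved reals (I, Q):

```python
class RTNReceiver(nn.Module):
    # Parameter-estimation network -> deterministic expert transform
    # -> discriminative network, all trained end-to-end with the same
    # cross-entropy loss (no separate supervision for the phase estimate).
    def __init__(self, n, M):
        super().__init__()
        self.estimator = nn.Sequential(nn.Linear(2 * n, 2 * n), nn.ReLU(),
                                       nn.Linear(2 * n, 1))   # phase offset
        self.decoder = nn.Sequential(nn.Linear(2 * n, M), nn.ReLU(),
                                     nn.Linear(M, M))         # message logits

    def forward(self, y):
        phi = self.estimator(y)                  # (batch, 1) estimated offset
        i, q = y[:, 0::2], y[:, 1::2]            # de-interleave I/Q samples
        cos, sin = torch.cos(phi), torch.sin(phi)
        # Expert transform: derotate each sample, (i + j*q) * exp(-j*phi)
        y_corr = torch.stack((i * cos + q * sin,
                              q * cos - i * sin), dim=2).flatten(1)
        return self.decoder(y_corr)              # discriminative network
```

Because the derotation is differentiable, gradients flow through it into the estimator, which is what lets the network learn the correction without ever being told the true offset.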
Simulation results:
[Figure: error-rate curves with and without the RTN]


Source: blog.csdn.net/weixin_39274659/article/details/90648921