OFDM+QPSK signal detection algorithm matlab simulation based on DNN deep learning network

Table of contents

1. Algorithm operation rendering preview

2. Algorithm running software version

3. Some core programs

4. Overview of algorithm theory

5. Algorithm complete program engineering


1. Algorithm operation rendering preview

2. Algorithm running software version

matlab2022a

3. Some core programs

.............................................................................
Transmitted_signal                 = OFDM_Transmitter(data_in_IFFT, NFFT, NCP);
 
 
        % Channel
        Ray_h_ofdm             = (1 / sqrt(2)) * randn(len_symbol, 1) + (1 / sqrt(2)) * 1j * randn(len_symbol, 1); % Rayleigh channel coefficients
        Rayleigh_h_channel     = repmat(Ray_h_ofdm, Frame_size, 1);
        Rayleigh_Fading_Signal = awgn(Rayleigh_h_channel .* Transmitted_signal, SNR, 'measured');
        signal_ideal           = Rayleigh_Fading_Signal ./ Rayleigh_h_channel;
 
        Multitap_h = [(randn + 1j * randn);...
                      (randn + 1j * randn) / 24];
        
        % Convolve the transmitted signal with the multipath channel
        Multipath_Signal        = conv(Transmitted_signal, Multitap_h);
 
        Multipath_Signal        = awgn(Multipath_Signal(1 : length(Transmitted_signal)), SNR, 'measured');
        % OFDM reception
        [Rsignals0, Rsignalsh0] = OFDM_Receiver(Multipath_Signal, NFFT, NCP, len_symbol, signal_ideal);
 
        % Deep learning stage: demodulate using the pre-trained neural network
        [DNN_feature_signal, ~, ~] = Extract_Feature_OFDM(Rsignals0, dataSym(1:2), M, QPSK_signal(1:8));
        Received_data_DNN          = predict(DNN_Trained, DNN_feature_signal);
        Received_data_DNN          = transpose(Received_data_DNN);
        DNN_Received_data          = Received_data_DNN(1:2:end, :) + 1j * Received_data_DNN(2:2:end, :);
 
        DNN_dataSym_Rx             = QPSK_Demodulator(DNN_Received_data);
        
        DNN_dataSym_Received       = de2bi(DNN_dataSym_Rx, 2);
        DNN_Data_Received          = reshape(DNN_dataSym_Received, [], 1);
 
        DNN_sym_err(ij, 1)         = sum(sum(round(dataSym(1:8)) ~= round(DNN_dataSym_Rx)));
        DNN_bit_err(ij, 1)         = sum(sum(round(reshape(de2bi(dataSym(1:8), 2), [], 1)) ~= round(DNN_Data_Received)));
    end
    Bers(idx, 1) = sum(DNN_bit_err, 1) / N_bits_DNN; % average bit error rate
    Sers(idx, 1) = sum(DNN_sym_err, 1) / N_QPSK_DNN; % average symbol error rate
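The post-network steps in the listing (hard QPSK decision, bit expansion with `de2bi`, and error counting) can be sketched in pure Python. The constellation, the natural-binary bit mapping, and the made-up network outputs below are illustrative assumptions, not values from the original program:

```python
CONSTELLATION = [1+1j, -1+1j, -1-1j, 1-1j]  # assumed QPSK points, indices 0..3

def qpsk_demod(z):
    """Nearest-neighbour hard decision -> symbol index (role of QPSK_Demodulator)."""
    return min(range(4), key=lambda k: abs(z - CONSTELLATION[k]))

def to_bits(sym):
    """2-bit natural-binary expansion, LSB first (like MATLAB's de2bi(sym, 2))."""
    return [sym & 1, (sym >> 1) & 1]

tx_syms = [0, 1, 2, 3, 0]                                         # transmitted indices
rx_vals = [1.1+0.9j, -0.8+1.2j, -1.1-1.0j, 0.2-1.1j, -0.9+0.8j]   # made-up soft outputs

rx_syms = [qpsk_demod(z) for z in rx_vals]
sym_err = sum(t != r for t, r in zip(tx_syms, rx_syms))
bit_err = sum(tb != rb
              for t, r in zip(tx_syms, rx_syms)
              for tb, rb in zip(to_bits(t), to_bits(r)))
ber = bit_err / (2 * len(tx_syms))  # 2 bits per QPSK symbol
```

Here the last received value lands closest to the wrong constellation point, producing one symbol error and one bit error, mirroring how `DNN_sym_err` and `DNN_bit_err` accumulate in the loop above.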

4. Overview of algorithm theory

         Orthogonal Frequency Division Multiplexing (OFDM) is a multi-carrier modulation technique widely used in digital communications. Signal detection is one of the key problems at the OFDM receiver: its purpose is to recover the original data from the received signal. Because OFDM offers high bandwidth efficiency and resistance to multipath fading, it enables high-speed data transmission even in high-mobility environments. However, OFDM signal detection faces several difficulties, such as carrier frequency offset, channel estimation error, and multipath interference. To address these problems, deep learning techniques have been increasingly applied to OFDM signal detection in recent years.

1. OFDM signal model

       An OFDM signal is a multi-carrier waveform built by frequency-domain decomposition. It can be expressed as:

$$x(t)=\sum_{n=0}^{N-1}\sum_{k=0}^{K-1}s_{n,k}g(t-nT)e^{j2\pi k\Delta f(t-nT)}$$

      where $s_{n,k}$ is the data symbol, $g(t)$ is the rectangular pulse, $T$ is the symbol interval, $K$ is the number of subcarriers, and $\Delta f$ is the subcarrier spacing. OFDM transmits data by mapping data symbols onto the individual subcarriers, and each subcarrier carries its own modulated symbol with its own modulation parameters.
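As a rough illustration of this formula, the sketch below evaluates $x(t)$ for a single OFDM symbol ($n = 0$) with a unit rectangular pulse, $T = 1$, and $\Delta f = 1/T$. The parameters and QPSK symbols are made-up; sampling at $t = iT/K$ reduces the sum to an inverse DFT, which is why practical transmitters implement it with an IFFT:

```python
import cmath

# Hypothetical parameters: K subcarriers, one OFDM symbol (n = 0),
# unit rectangular pulse g(t), T = 1, delta_f = 1/T.
K = 4
symbols = [1+1j, 1-1j, -1+1j, -1-1j]  # s_{0,k}: one QPSK symbol per subcarrier

def ofdm_sample(t, s, T=1.0):
    """Evaluate x(t) = sum_k s_k * exp(j*2*pi*k*df*t) for 0 <= t < T."""
    df = 1.0 / T  # spacing chosen so the subcarriers are orthogonal over T
    return sum(s[k] * cmath.exp(2j * cmath.pi * k * df * t) for k in range(len(s)))

# Sampling at t = i*T/K gives the inverse-DFT samples of the symbol vector.
samples = [ofdm_sample(i / K, symbols) for i in range(K)]

# Orthogonality lets the receiver recover each s_k by correlating with its carrier.
recovered = [sum(samples[i] * cmath.exp(-2j * cmath.pi * k * i / K)
                 for i in range(K)) / K
             for k in range(K)]
```

Correlating the samples against each subcarrier (the `recovered` list) returns the original symbols exactly, which is the orthogonality property the modulation relies on.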

2. DNN deep learning network

      A deep neural network (DNN) is a machine learning model built from multiple layers of neurons. Through its hidden layers, a DNN can learn high-level features of the data and thereby perform tasks such as classification and regression. Its mathematical model can be expressed as:

$$y=f(W^{(L)}f(W^{(L-1)}...f(W^{(1)}x+b^{(1)})...)+b^{(L)})$$

where $x$ is the input, $y$ is the output, $W^{(i)}$ and $b^{(i)}$ are the weights and biases of the $i$-th layer, and $f$ is the activation function.
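This composition of affine maps and nonlinearities can be sketched directly. The weights below are illustrative, not trained values; ReLU is assumed as the activation $f$, with a linear output layer:

```python
def relu(v):
    """Elementwise activation f."""
    return [max(0.0, x) for x in v]

def dense(W, b, x):
    """One affine layer: W x + b, with W given as a list of rows."""
    return [sum(wij * xj for wij, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]

# Toy 2-layer network (weights are illustrative, not trained values)
W1, b1 = [[1.0, -1.0], [0.5, 0.5]], [0.0, 0.1]
W2, b2 = [[1.0, 1.0]], [0.0]

def forward(x):
    h = relu(dense(W1, b1, x))  # f(W^(1) x + b^(1))
    return dense(W2, b2, h)     # W^(2) h + b^(2), linear output

y = forward([2.0, 1.0])
```

Each `dense` call is one $W^{(i)}x + b^{(i)}$ term of the formula; stacking more such layers deepens the network without changing the structure of `forward`.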

3. DNN-based OFDM signal detection model

The DNN-based OFDM signal detection model can be expressed as:

$$\hat{s}_{n,k}=\arg\max_{s_{n,k}}P(s_{n,k}\mid r_{n,k},\theta)$$

        where $\hat{s}_{n,k}$ is the predicted data symbol, $r_{n,k}$ is the received OFDM signal, and $\theta$ denotes the model parameters. Through the DNN, the model learns the mapping from received samples to transmitted symbols, thereby realizing OFDM signal detection.
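For comparison, when the symbols are equiprobable and the noise is AWGN, the maximization over $s_{n,k}$ reduces to picking the constellation point nearest the received sample, which is what a conventional (non-DNN) hard-decision detector does. The unnormalized QPSK constellation below is an assumption for illustration:

```python
QPSK = [1+1j, -1+1j, -1-1j, 1-1j]  # assumed constellation (unnormalized)

def detect(r):
    """Hard decision: under AWGN with equiprobable symbols,
    argmax_s P(s|r) reduces to the nearest constellation point."""
    return min(QPSK, key=lambda s: abs(r - s))

r = 0.9 + 1.2j   # noisy received sample
s_hat = detect(r)
```

The DNN-based detector replaces this fixed distance rule with a learned decision function, which is what lets it absorb channel distortion that a plain nearest-neighbour rule cannot.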

       In practical applications, OFDM signal detection must run in real time, which is achieved by deploying the trained model in a live system. During real-time detection, the received OFDM signal is preprocessed and fed into the trained model, and the deployment must account for factors such as latency and resource constraints.

      OFDM signal detection based on DNNs has been widely applied in digital communications. It can address difficult problems in OFDM detection such as frequency offset, channel estimation error, and multipath interference, and it is also useful in radio spectrum sensing, radio interference detection, and related fields.

5. Algorithm complete program engineering


Origin blog.csdn.net/aycd1234/article/details/131924081