Simple implementation of a feedforward neural network (BPNN)

The process is divided into 4 steps:

Initialization

Use a random function to assign the initial weights and biases. Every layer except the input layer needs a bias term.
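As a minimal sketch of this step in MATLAB, using the 3-2-1 layer sizes and variable names from the full code below:

% Layer sizes taken from the full code below: 3 input, 2 hidden, 1 output neuron
W12 = rand([3,2]);   % weights from the input layer to the hidden layer
W23 = rand([2,1]);   % weights from the hidden layer to the output layer
B12 = rand([1,2]);   % biases of the hidden layer
B23 = rand([1,1]);   % bias of the output layer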

Calculate output

Calculate the output of each neuron
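In the standard formulation, and consistent with the sigmoid activation used in the code below, each neuron j first computes a weighted sum of the previous layer's outputs plus its bias, then applies the activation function:

$$S_j = \sum_i w_{ij} O_i + b_j, \qquad O_j = \frac{1}{1 + e^{-S_j}}$$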

Calculate the error

Error calculation for the output layer
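The standard error term for an output neuron, matching the form used in the code below, is:

$$E_j = O_j (1 - O_j)(T_j - O_j)$$

where $T_j$ is the target output and $O_j$ is the actual output.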
Error calculation for the intermediate (hidden) layer
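For a hidden neuron, the error is propagated back from the next layer through the connecting weights:

$$E_j = O_j (1 - O_j) \sum_k w_{jk} E_k$$

where the sum runs over the neurons $k$ of the following layer.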

Adjust weights and biases

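The standard update rules, which the code below follows, add a fraction of the error to each weight and bias:

$$w_{ij} \leftarrow w_{ij} + L \cdot E_j \cdot O_i, \qquad b_j \leftarrow b_j + L \cdot E_j$$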
Here, L is the learning rate, which can be set to 0.9.

Implementation

The example network has 3 input neurons, 2 hidden neurons, and 1 output neuron. The MATLAB implementation is as follows:

%% Simple feedforward neural network

% Initialize weights and biases with random values
W12=rand([3,2]);    % input layer  -> hidden layer
W23=rand([2,1]);    % hidden layer -> output layer

B12=rand([1,2]);    % hidden-layer biases
B23=rand([1,1]);    % output-layer bias

% Training sample; the original error term implies a target output of 1
input=[1,0,1];
target=1;
L=0.9;              % learning rate

for i=1:1000
    % Forward pass: weighted sum plus bias, then sigmoid activation
    S2=input*W12+B12;
    O2=(1+exp(-S2)).^-1;
    S3=O2*W23+B23;
    O3=(1+exp(-S3)).^-1;

    % Error terms
    E3=O3.*(1-O3).*(target-O3);   % output layer
    E2=O2.*(1-O2).*(W23*E3).';    % hidden layer, backpropagated through W23

    % Adjust weights: previous layer's output times the error (outer product)
    W23=W23+L*O2.'*E3;
    W12=W12+L*input.'*E2;

    % Adjust biases
    B12=B12+L*E2;
    B23=B23+L*E3;
end
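With this single training sample and a target of 1, the output O3 should move toward 1 as the loop runs; adding disp(O3) after the loop is a quick way to check that the error is shrinking.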

Original post: blog.csdn.net/qq_40092672/article/details/111617727