MATLAB mathematical modeling, neural networks: testing different numbers of hidden-layer neurons and changing the training function

Table of contents

By comparing the error against the number of training steps, the number of hidden-layer neurons is determined, and the impact of the hidden-layer size on performance is tested.

1) trainlm algorithm

2) traingdm algorithm

 3) trainrp algorithm

4) traingdx algorithm

 5) traincgf algorithm


By comparing the error against the number of training steps, the number of hidden-layer neurons is determined, and the impact of the hidden-layer size on performance is tested.

Based on an empirical design formula and the actual situation of this example, the hidden-layer size range 9:16 is selected.
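The post does not state which empirical formula it uses; a common rule of thumb (an assumption here) is s = sqrt(m + n) + a, where m and n are the input and output dimensions and a is a constant between 1 and 10. A quick sketch of the candidate sizes it produces:

```matlab
% Rule-of-thumb hidden-layer sizes: s = sqrt(m + n) + a, a in [1, 10]
% (assumed formula; the original post does not name it explicitly)
m = 1;                            % number of network inputs
n = 1;                            % number of network outputs
a = 1:10;                         % free constant
s_candidates = ceil(sqrt(m + n) + a)  % candidate hidden-layer sizes
```

The range actually tested below, 9:16, falls near the upper end of such estimates.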

%range of variable x
x=-4:0.01:4;
%target function
y1=sin((1/2)*pi*x)+sin(pi*x);
%numbers of hidden-layer neurons to test
s=9:16;
%preallocate the Euclidean error norms
res=1:8;
%test the network with each candidate hidden-layer size
for i=1:8
%create a feed-forward BP network; first two layers use tansig, output layer purelin
%training function is trainlm (the default)
net=newff(minmax(x),[1,s(i),1],{'tansig','tansig','purelin'},'trainlm');
%maximum of 2000 training epochs
net.trainParam.epochs=2000;
%target error 0.00001
net.trainParam.goal=0.00001;
%train
net=train(net,x,y1);
%simulate
y2=sim(net,x);
%Euclidean norm of the error, used to judge performance
    err=y2-y1;
    res(i)=norm(err);
end
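Once the loop finishes, the hidden-layer size with the smallest error norm can be read off `res`, for example (variable names are my own):

```matlab
% Pick the hidden-layer size with the smallest error norm
% (assumes s and res from the loop above are in the workspace)
[minErr, k] = min(res);
bestS = s(k);
fprintf('best hidden-layer size: %d (error norm %.4f)\n', bestS, minErr);
```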

Run result:

The various BP learning algorithms differ only in the training function, so only that argument needs to be changed in the code below.
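Since only the training-function argument changes between the five scripts that follow, they can also be collapsed into one loop; a minimal sketch, assuming a fixed hidden-layer size of 16 (my choice from the 9:16 range tested above):

```matlab
%range of variable x and target function, as above
x=-4:0.01:4;
y1=sin((1/2)*pi*x)+sin(pi*x);
%the five training functions compared in this post
trainFcns={'trainlm','traingdm','trainrp','traingdx','traincgf'};
errs=zeros(1,numel(trainFcns));
for k=1:numel(trainFcns)
    %same architecture each time; only the training function changes
    net=newff(minmax(x),[1,16,1],{'tansig','tansig','purelin'},trainFcns{k});
    net.trainParam.epochs=2000;
    net.trainParam.goal=0.00001;
    net=train(net,x,y1);
    y2=sim(net,x);
    errs(k)=norm(y2-y1);   %Euclidean error norm per algorithm
end
```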

1) trainlm algorithm

%range of variable x
x=-4:0.01:4;
%target function
y1=sin((1/2)*pi*x)+sin(pi*x);
%hidden-layer size (example value from the 9:16 range tested above)
s=16;
%create a feed-forward BP network; first two layers use tansig, output layer purelin
%training function is trainlm (the default)
net=newff(minmax(x),[1,s,1],{'tansig','tansig','purelin'},'trainlm');
%maximum of 2000 training epochs
net.trainParam.epochs=2000;
%target error 0.00001
net.trainParam.goal=0.00001;
%train
net=train(net,x,y1);
%simulate
y2=sim(net,x);
%Euclidean norm of the error, used to judge performance
err=y2-y1;
res=norm(err);
plot(x,y1);
hold on
plot(x,y2,'r+');

The training error curve (left) and the network simulation curve (right).

In the simulation plot, 'r+' marks the network output and the solid line is the target curve.

2) traingdm algorithm

%range of variable x
x=-4:0.01:4;
%target function
y1=sin((1/2)*pi*x)+sin(pi*x);
%hidden-layer size (example value from the 9:16 range tested above)
s=16;
%create a feed-forward BP network; first two layers use tansig, output layer purelin
%training function is traingdm
net=newff(minmax(x),[1,s,1],{'tansig','tansig','purelin'},'traingdm');
%maximum of 2000 training epochs
net.trainParam.epochs=2000;
%target error 0.00001
net.trainParam.goal=0.00001;
%train
net=train(net,x,y1);
%simulate
y2=sim(net,x);
%Euclidean norm of the error, used to judge performance
err=y2-y1;
res=norm(err);
plot(x,y1);
hold on
plot(x,y2,'r+');

The training error curve (left) and the network simulation curve (right).

In the simulation plot, 'r+' marks the network output and the solid line is the target curve.

 3) trainrp algorithm

%range of variable x
x=-4:0.01:4;
%target function
y1=sin((1/2)*pi*x)+sin(pi*x);
%hidden-layer size (example value from the 9:16 range tested above)
s=16;
%create a feed-forward BP network; first two layers use tansig, output layer purelin
%training function is trainrp
net=newff(minmax(x),[1,s,1],{'tansig','tansig','purelin'},'trainrp');
%maximum of 2000 training epochs
net.trainParam.epochs=2000;
%target error 0.00001
net.trainParam.goal=0.00001;
%train
net=train(net,x,y1);
%simulate
y2=sim(net,x);
%Euclidean norm of the error, used to judge performance
err=y2-y1;
res=norm(err);
plot(x,y1);
hold on
plot(x,y2,'r+');

The training error curve (left) and the network simulation curve (right).

In the simulation plot, 'r+' marks the network output and the solid line is the target curve.

4) traingdx algorithm

%range of variable x
x=-4:0.01:4;
%target function
y1=sin((1/2)*pi*x)+sin(pi*x);
%hidden-layer size (example value from the 9:16 range tested above)
s=16;
%create a feed-forward BP network; first two layers use tansig, output layer purelin
%training function is traingdx
net=newff(minmax(x),[1,s,1],{'tansig','tansig','purelin'},'traingdx');
%maximum of 2000 training epochs
net.trainParam.epochs=2000;
%target error 0.00001
net.trainParam.goal=0.00001;
%train
net=train(net,x,y1);
%simulate
y2=sim(net,x);
%Euclidean norm of the error, used to judge performance
err=y2-y1;
res=norm(err);
plot(x,y1);
hold on
plot(x,y2,'r+');

The training error curve (left) and the network simulation curve (right).

In the simulation plot, 'r+' marks the network output and the solid line is the target curve.

 5) traincgf algorithm

%range of variable x
x=-4:0.01:4;
%target function
y1=sin((1/2)*pi*x)+sin(pi*x);
%hidden-layer size (example value from the 9:16 range tested above)
s=16;
%create a feed-forward BP network; first two layers use tansig, output layer purelin
%training function is traincgf
net=newff(minmax(x),[1,s,1],{'tansig','tansig','purelin'},'traincgf');
%maximum of 2000 training epochs
net.trainParam.epochs=2000;
%target error 0.00001
net.trainParam.goal=0.00001;
%train
net=train(net,x,y1);
%simulate
y2=sim(net,x);
%Euclidean norm of the error, used to judge performance
err=y2-y1;
res=norm(err);
plot(x,y1);
hold on
plot(x,y2,'r+');

The training error curve (left) and the network simulation curve (right).

In the simulation plot, 'r+' marks the network output and the solid line is the target curve.

Random factors such as weight initialization affect training differently on each run, so repeated runs can produce different results.

Source: blog.csdn.net/qq_54508596/article/details/127140031