Matlab Mathematical Modeling - Classic Application of Neural Networks: Approximating Nonlinear Functions

Table of contents

Code: First draw the function to be approximated, and then use the untrained neural network to approximate it

Next step: Increase the value of n (the number of neurons in the hidden layer)

Change the frequency parameter k as follows:


Goal: Design a BP network to approximate nonlinear functions

Code: First draw the function to be approximated, and then use the untrained neural network to approximate it

clear all
clc
k=2;
p=[-1:0.05:9];
t=1+sin(k*pi/2*p);
%%%% Build the network structure %%%%
n=5;
net=newff(minmax(p),[n,1],{'tansig','purelin'},'trainlm');
y1=sim(net,p); % observe the network output with sim
figure;
plot(p,t,'-',p,y1,':');
title('Output before training');
xlabel('Time');
ylabel('Simulated output -- original function --');

As the plot shows, the untrained network approximates the function poorly.
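Note that newff is deprecated in recent releases of the Deep Learning Toolbox. A minimal sketch of the same experiment with the newer fitnet interface (an assumption about your toolbox version, not part of the original post):

```matlab
% Sketch: the same setup using fitnet instead of newff.
clear all
clc
k = 2;
p = -1:0.05:9;
t = 1 + sin(k*pi/2*p);
net = fitnet(5, 'trainlm');   % 5 hidden neurons, Levenberg-Marquardt training
net.trainParam.epochs = 200;
net.trainParam.goal = 0.2;
net = train(net, p, t);       % train configures the network automatically
y2 = net(p);                  % simulate the trained network
plot(p, t, '-', p, y2, '--');
```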

Approximating nonlinear functions with trained neural networks

clear all
clc
k=2;
p=[-1:0.05:9];
t=1+sin(k*pi/2*p);
%%%% Build the network structure %%%%
n=3;
net=newff(minmax(p),[n,1],{'tansig','purelin'},'trainlm');
y1=sim(net,p); % observe the network output with sim
%%% Train the network first, then simulate %%%
net.trainParam.epochs=200;
net.trainParam.goal=0.2;
net=train(net,p,t);
y2=sim(net,p);
figure;
plot(p,t,'-',p,y1,':',p,y2,'--');
title('Output after training');
xlabel('Time');
ylabel('Simulated output');

The dotted line (':') is the untrained output; the dashed line ('--') is the trained output.

The trained network approximates the function significantly better than the untrained one.

Next step: Increase the value of n (the number of neurons in the hidden layer)

When n=5:

clear all
clc
k=2;
p=[-1:0.05:9];
t=1+sin(k*pi/2*p);
%%%% Build the network structure %%%%
n=5;
net=newff(minmax(p),[n,1],{'tansig','purelin'},'trainlm');
y1=sim(net,p); % observe the network output with sim
%%% Train the network first, then simulate %%%
net.trainParam.epochs=200;
net.trainParam.goal=0.2;
net=train(net,p,t);
y2=sim(net,p);
figure;
plot(p,t,'-',p,y1,':',p,y2,'--');
title('Output after training');
xlabel('Time');
ylabel('Simulated output');

The approximation is better than with n=3.

When n=20:

clear all
clc
k=2;
p=[-1:0.05:9];
t=1+sin(k*pi/2*p);
%%%% Build the network structure %%%%
n=20;
net=newff(minmax(p),[n,1],{'tansig','purelin'},'trainlm');
y1=sim(net,p); % observe the network output with sim
%%% Train the network first, then simulate %%%
net.trainParam.epochs=200;
net.trainParam.goal=0.2;
net=train(net,p,t);
y2=sim(net,p);
figure;
plot(p,t,'-',p,y1,':',p,y2,'--');
title('Output after training');
xlabel('Time');
ylabel('Simulated output');

This shows that increasing n (the number of neurons in the hidden layer) improves the accuracy of the BP network's approximation.

Change the frequency parameter k as follows:

k=2,n=3 and k=2,n=10

k=3,n=3 and k=3,n=10

k=6,n=3 and k=6,n=10
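The six cases above can be generated in one script rather than six copies; a minimal sketch using the same setup as the listings above:

```matlab
% Sketch: compare hidden-layer sizes n at several frequencies k.
clear all
clc
for k = [2 3 6]
    p = -1:0.05:9;
    t = 1 + sin(k*pi/2*p);
    for n = [3 10]
        net = newff(minmax(p), [n,1], {'tansig','purelin'}, 'trainlm');
        net.trainParam.epochs = 200;
        net.trainParam.goal = 0.2;
        net = train(net, p, t);
        y2 = sim(net, p);
        figure;
        plot(p, t, '-', p, y2, '--');
        title(sprintf('k=%d, n=%d', k, n));
        xlabel('Time');
        ylabel('Simulated output');
    end
end
```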

This shows that, across different frequencies alike, enlarging the hidden layer improves the approximation of the BP neural network.

Origin blog.csdn.net/qq_54508596/article/details/127139482