Genetic Algorithm (GA) Optimized BP Neural Network (GA-BP) Regression Prediction: MATLAB Implementation

I. Introduction

    The genetic algorithm (GA) and the backpropagation neural network (BPNN) are a widely used optimization algorithm and model, respectively; they can be combined to solve regression prediction problems.

    Using a genetic algorithm to optimize a BP neural network for regression combines the global search ability of the GA with the learning ability of the BP network, exploiting the strengths of both to obtain better regression predictions.

    The genetic algorithm simulates natural selection in biological evolution: through selection, crossover, and mutation, it continuously improves the fitness of a population and eventually converges toward an optimal solution. When optimizing a BP neural network with a GA, the network's parameters are encoded into a chromosome, with each gene representing the value of one parameter. Updating the chromosomes of the population therefore updates the parameters of the BP network, driving the search toward better regression predictions.

    The BP neural network is a widely used artificial neural network model with strong learning ability and adaptability. It adjusts its weights and biases via the backpropagation algorithm, gradually reducing the prediction error to produce more accurate regression results. In the GA-BP scheme, the parameters of the BP network are the decision variables of the GA and the prediction error serves as the fitness function; the GA repeatedly updates the parameters to minimize this error.


II. Partial Code

net=newff(inputn,outputn,hiddennum_best,{'tansig','purelin'},'trainlm');   % build the model

% Network training parameters
net.trainParam.epochs=1000;         % maximum number of training epochs
net.trainParam.lr=0.01;             % learning rate
net.trainParam.goal=0.00001;        % target (minimum) training error
net.trainParam.show=25;             % display interval
net.trainParam.mc=0.01;             % momentum factor
net.trainParam.min_grad=1e-6;       % minimum performance gradient
net.trainParam.max_fail=6;          % maximum validation failures

% Initialize GA parameters
PopulationSize_Data=30;       % initial population size
MaxGenerations_Data=60;       % maximum number of generations
CrossoverFraction_Data=0.8;   % crossover fraction
MigrationFraction_Data=0.2;   % migration fraction (not the mutation rate; ga's mutation is configured via MutationFcn)
nvars=inputnum*hiddennum_best+hiddennum_best+hiddennum_best*outputnum+outputnum;   % number of variables: W1 + b1 + W2 + b2
lb=repmat(-3,nvars,1);    % lower bounds of the decision variables
ub=repmat(3,nvars,1);     % upper bounds of the decision variables

% Configure and call the genetic algorithm
options = optimoptions('ga');
options = optimoptions(options,'PopulationSize', PopulationSize_Data);
options = optimoptions(options,'CrossoverFraction', CrossoverFraction_Data);
options = optimoptions(options,'MigrationFraction', MigrationFraction_Data);
options = optimoptions(options,'MaxGenerations', MaxGenerations_Data);
options = optimoptions(options,'SelectionFcn', @selectionroulette);    % roulette-wheel selection
options = optimoptions(options,'CrossoverFcn', @crossovertwopoint);    % two-point crossover
options = optimoptions(options,'MutationFcn', {@mutationgaussian [] []});   % Gaussian mutation
options = optimoptions(options,'Display', 'iter');    % 'off' hides the iteration log; 'iter' shows it
options = optimoptions(options,'PlotFcn', {@gaplotbestf});    % plot the best fitness per generation
% Solve
[x,fval] = ga(@fitness,nvars,[],[],[],[],lb,ub,[],options);
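The call above minimizes `@fitness`, which is not shown in the excerpt. A minimal sketch of such a fitness function, assuming the network `net`, the training data `inputn`/`outputn`, and the layer sizes are shared via globals (a common pattern in scripts like this; the variable names follow the code above), could look like:

```matlab
function err = fitness(x)
% Decode the chromosome x into BP network weights/biases and
% return the training error as the fitness value (lower is better).
global net inputn outputn inputnum hiddennum_best outputnum

% Split the chromosome into: input-to-hidden weights, hidden biases,
% hidden-to-output weights, output biases (same layout as nvars above)
n1 = inputnum*hiddennum_best;
w1 = x(1 : n1);
b1 = x(n1+1 : n1+hiddennum_best);
w2 = x(n1+hiddennum_best+1 : n1+hiddennum_best+hiddennum_best*outputnum);
b2 = x(end-outputnum+1 : end);

% Load the decoded parameters into the network
net.iw{1,1} = reshape(w1, hiddennum_best, inputnum);
net.b{1}    = reshape(b1, hiddennum_best, 1);
net.lw{2,1} = reshape(w2, outputnum, hiddennum_best);
net.b{2}    = reshape(b2, outputnum, 1);

% Fitness: sum of absolute prediction errors on the training set
an  = sim(net, inputn);
err = sum(abs(an(:) - outputn(:)));
end
```

After `ga` returns, the best chromosome `x` is decoded the same way to initialize the final network before a last round of `train`.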

III. Simulation Results

(1) Using an empirical formula, the optimal number of hidden-layer nodes is determined from the numbers of input and output nodes:
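A common form of this empirical formula is hiddennum = sqrt(m + n) + a, where m and n are the numbers of input and output nodes and a is an integer in [1, 10]; each candidate size is trained briefly and the one with the lowest training error is kept. A sketch under that assumption, reusing `inputn`/`outputn` and `inputnum`/`outputnum` from the code above:

```matlab
% Search hidden-layer sizes suggested by the empirical formula
% hiddennum = sqrt(inputnum + outputnum) + a,  a = 1..10
MSE_best = inf;
for hiddennum = fix(sqrt(inputnum + outputnum)) + (1:10)
    net = newff(inputn, outputn, hiddennum, {'tansig','purelin'}, 'trainlm');
    net.trainParam.epochs = 1000;
    net.trainParam.lr = 0.01;
    net.trainParam.goal = 0.00001;
    net.trainParam.showWindow = false;   % suppress the training GUI
    net = train(net, inputn, outputn);
    an   = sim(net, inputn);
    mse0 = mse(outputn - an);            % training MSE for this size
    if mse0 < MSE_best
        MSE_best = mse0;
        hiddennum_best = hiddennum;      % keep the best-performing size
    end
end
```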

(2) Prediction comparison chart and error chart

(3) Various error indicators of BP and GA-BP
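The indicators typically compared here are RMSE, MAE, MAPE, and R². As a sketch, assuming `T` holds the true test values and `Y` one model's predictions (both row vectors; the helper name is hypothetical):

```matlab
% Common regression error metrics for comparing BP and GA-BP
computeMetrics = @(T, Y) struct( ...
    'RMSE', sqrt(mean((T - Y).^2)), ...                   % root mean squared error
    'MAE',  mean(abs(T - Y)), ...                         % mean absolute error
    'MAPE', mean(abs((T - Y) ./ T)) * 100, ...            % mean absolute percentage error
    'R2',   1 - sum((T - Y).^2) / sum((T - mean(T)).^2)); % coefficient of determination
```

Calling `computeMetrics` once per model makes the BP vs. GA-BP comparison table straightforward to assemble.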

(4) Genetic algorithm GA fitness value evolution curve

(5) Regression diagram of BP and GA-BP models

(6) Error histograms of BP and GA-BP models
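Figures (5) and (6) can be produced with the Deep Learning Toolbox plotting functions. A sketch, where `T_test` holds the true test values and `Y_bp`/`Y_gabp` the two models' predictions (placeholder names, not from the original code):

```matlab
% Regression plots and error histograms for the two models
figure; plotregression(T_test, Y_bp, 'BP', T_test, Y_gabp, 'GA-BP');
figure; ploterrhist(T_test - Y_bp, 'BP', T_test - Y_gabp, 'GA-BP');
```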

IV. Conclusion

    Note that both the genetic algorithm and the BP neural network are stochastic, so optimizing the same set of parameters may yield different results on different runs; repeated experiments are required to verify the robustness and reliability of the model.


Origin blog.csdn.net/baoliang12345/article/details/130331210