Classification prediction | MATLAB implementation of GAPSO-BP: a genetic algorithm combined with particle swarm optimization to optimize a BP neural network for multi-input classification prediction

Prediction results

(Figures omitted: visual display of the classification results and accuracy.)

Basic introduction

1. Classification prediction | MATLAB implementation of GAPSO-BP, a genetic algorithm combined with particle swarm optimization to optimize a BP neural network for multi-input classification prediction; requires MATLAB 2018b or above.
2. Classification accuracy is displayed visually; the data and program can be obtained in the download area.
3. The model takes 15 input features and outputs 4 class labels (a minimal data-preparation sketch follows this list).
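
As a hedged illustration of how the input matrix res used in the code below might be assembled (the file name, column layout, and 80/20 split ratio are assumptions, not taken from the original program):

%%  Data preparation (sketch; file name, layout, and split ratio are assumed)
res = xlsread('data.xlsx');               % hypothetical file: 15 feature columns + 1 label column (classes 1-4)
res = res(randperm(size(res, 1)), :);     % shuffle the samples before splitting

f_ = 15;                                  % number of input features
num_train_s = round(0.8 * size(res, 1));  % assumed 80/20 train/test split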

Model description

A genetic algorithm (GA) performs well on complex global optimization problems but is weaker than particle swarm optimization (PSO) at local search, while PSO is better suited to high-dimensional, multimodal, continuous problems. Combining the two therefore lets each compensate for the other's weaknesses and improves overall optimization performance.
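
As a hedged illustration only (the full program's operators may differ), one common way to hybridize the two is to run a GA crossover/mutation pass over the particle population after each PSO update; pc and pm here are assumed parameters:

%%  GAPSO hybrid step (illustrative sketch)
pc = 0.8;                                 % assumed crossover probability
pm = 0.05;                                % assumed mutation probability
for k = 1 : 2 : sizepop - 1
    if rand < pc                          % single-point crossover on a particle pair
        pos = randi(numsum);
        tmp               = pop(k,   pos:end);
        pop(k,   pos:end) = pop(k+1, pos:end);
        pop(k+1, pos:end) = tmp;
    end
end
mask      = rand(sizepop, numsum) < pm;   % random mutation mask
pop(mask) = popmin + (popmax - popmin) * rand(nnz(mask), 1);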

Programming

  • Complete program and data, acquisition method 1: exchange for a program of equal value;
  • Complete program and data, acquisition method 2: send the blogger a private message with the reply "GAPSO-BP genetic algorithm combined with particle swarm optimization to optimize BP neural network multi-input classification prediction".
%%  Split into training and test sets
P_train = res(1: num_train_s, 1: f_)';
T_train = res(1: num_train_s, f_ + 1: end)';
M = size(P_train, 2);    % number of training samples

P_test = res(num_train_s + 1: end, 1: f_)';
T_test = res(num_train_s + 1: end, f_ + 1: end)';
N = size(P_test, 2);     % number of test samples

%%  Normalize the data to [0, 1]
[p_train, ps_input] = mapminmax(P_train, 0, 1);
p_test = mapminmax('apply', P_test, ps_input);

[t_train, ps_output] = mapminmax(T_train, 0, 1);
t_test = mapminmax('apply', T_test, ps_output);
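
The PSO loop below references hiddennum, numsum, and net, which are set up between normalization and the search. A minimal sketch of that step, assuming a single hidden layer (the size 6 is an illustrative choice, not taken from the original program):

%%  Network setup (sketch; hidden-layer size is an assumption)
inputnum  = size(p_train, 1);              % 15 input features
outputnum = size(t_train, 1);              % output dimension
hiddennum = 6;                             % assumed hidden-layer size
% particle length = total number of BP weights and biases
numsum = inputnum * hiddennum + hiddennum + hiddennum * outputnum + outputnum;
net = newff(p_train, t_train, hiddennum);  % build the BP network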
%%  Individual and global extrema
[fitnesszbest, bestindex] = min(fitness);
zbest = pop(bestindex, :);     % global best position
gbest = pop;                   % individual best positions
fitnessgbest = fitness;        % individual best fitness values
BestFit = fitnesszbest;        % global best fitness value

%%  Iterative optimization
for i = 1 : maxgen
    for j = 1 : sizepop
        
        % Velocity update
        V(j, :) = V(j, :) + c1 * rand * (gbest(j, :) - pop(j, :)) + c2 * rand * (zbest - pop(j, :));
        V(j, (V(j, :) > Vmax)) = Vmax;
        V(j, (V(j, :) < Vmin)) = Vmin;
        
        % Position update
        pop(j, :) = pop(j, :) + 0.2 * V(j, :);
        pop(j, (pop(j, :) > popmax)) = popmax;
        pop(j, (pop(j, :) < popmin)) = popmin;
        
        % Adaptive mutation
        pos = unidrnd(numsum);
        if rand > 0.95
            pop(j, pos) = rands(1, 1);
        end
        
        % Fitness evaluation
        fitness(j) = fun(pop(j, :), hiddennum, net, p_train, t_train);

    end
    
    for j = 1 : sizepop

        % Update individual best
        if fitness(j) < fitnessgbest(j)
            gbest(j, :) = pop(j, :);
            fitnessgbest(j) = fitness(j);
        end

        % Update global best
        if fitness(j) < fitnesszbest
            zbest = pop(j, :);
            fitnesszbest = fitness(j);
        end

    end

    BestFit = [BestFit, fitnesszbest];    % record the best-fitness curve
end
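
The fitness function fun called inside the loop is not part of this excerpt. A hedged sketch of what such a function typically looks like in GAPSO-BP programs of this kind (the particle layout and the error measure are assumptions):

function error = fun(pop, hiddennum, net, p_train, t_train)
% Sketch: decode a particle into BP weights/biases and return the training error.
inputnum  = size(p_train, 1);
outputnum = size(t_train, 1);

% Split the particle into weight and bias segments (assumed layout)
w1 = pop(1 : inputnum * hiddennum);
B1 = pop(inputnum * hiddennum + 1 : inputnum * hiddennum + hiddennum);
w2 = pop(inputnum * hiddennum + hiddennum + 1 : ...
         inputnum * hiddennum + hiddennum + hiddennum * outputnum);
B2 = pop(inputnum * hiddennum + hiddennum + hiddennum * outputnum + 1 : end);

% Load them into the network
net.IW{1, 1} = reshape(w1, hiddennum, inputnum);
net.LW{2, 1} = reshape(w2, outputnum, hiddennum);
net.b{1}     = reshape(B1, hiddennum, 1);
net.b{2}     = reshape(B2, outputnum, 1);

% Fitness: total absolute training error of the resulting network
t_sim = sim(net, p_train);
error = sum(sum(abs(t_sim - t_train)));
end

After the loop finishes, zbest holds the best particle found; decoding it the same way and then training the network with train(net, p_train, t_train) yields the final optimized BP classifier.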
————————————————
Copyright notice: This is an original article by CSDN blogger "机器学习之心" and follows the CC 4.0 BY-SA copyright agreement. Please attach the original source link and this notice when reposting.
Original link: https://blog.csdn.net/kjm13182345320/article/details/130462492


