Classification prediction | Matlab implements HPO-GRU [new 2023 algorithm]: data classification prediction with a gated recurrent unit (GRU) optimized by the Hunter-Prey Optimization (HPO) algorithm

Classification results

(Figures: HPO iteration/convergence curve, confusion matrices, and prediction result plots, as described below.)

Basic description

1. HPO-GRU [new 2023 algorithm]: data classification prediction with a gated recurrent unit (GRU) optimized by the Hunter-Prey Optimization (HPO) algorithm. The program runs directly in Matlab (complete source code and data included).
2. For multi-input classification prediction, HPO optimizes three GRU hyperparameters, namely the learning rate, the number of hidden-layer nodes, and the regularization coefficient, avoiding blind manual parameter selection. The program produces an iteration (convergence) curve, confusion matrices, and prediction result plots, as shown in the figures; a sketch of such a fitness function is given after this list. The code is of high quality.
3. The HPO algorithm is a new optimization algorithm released in 2022. It simulates animal hunting behavior, converges quickly, and has strong search ability, making it well suited for innovative research.
4. To use your own data, simply replace the Excel file; suitable for beginners.
5. Test data is included: run main to produce all figures in one click.
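As a concrete illustration of point 2, below is a minimal sketch of a GRU fitness function that HPO could minimize. The function name, layer stack, and training options are assumptions for illustration only (requiring the Deep Learning Toolbox) and are not the blogger's original code; x(1), x(2), and x(3) carry the learning rate, the number of hidden-layer nodes, and the L2 regularization coefficient.

function fitness = gruFitness(x, p_train, t_train, p_test, t_test)
% Hypothetical fitness function for HPO-GRU (illustrative sketch only).
% x(1): learning rate, x(2): number of hidden-layer nodes, x(3): L2 regularization.
numFeatures = size(p_train{1}, 1);
numClasses  = numel(categories(t_train));

layers = [ ...
    sequenceInputLayer(numFeatures)
    gruLayer(round(x(2)), 'OutputMode', 'last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'InitialLearnRate', x(1), ...
    'L2Regularization', x(3), ...
    'MaxEpochs', 100, ...
    'Verbose', false);

net = trainNetwork(p_train, t_train, layers, options);

% Fitness to be minimized by HPO: classification error on the held-out set
t_sim   = classify(net, p_test);
fitness = 1 - sum(t_sim == t_test) / numel(t_test);
end

HPO would then evaluate this function for each candidate position x and keep the position with the lowest classification error.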

Programming

  • For the complete program and data, send the blogger a private message with the reply "Matlab implements HPO-GRU [new 2023 algorithm]: data classification prediction with a gated recurrent unit optimized by the Hunter-Prey Optimization algorithm".
%% Parameter settings
% Number of parameters to optimize: in this scenario dim = 2.
% Lower and upper bounds of the parameters, e.g. if the range of c is
% [0.01, 1] and the range of g is [2^-5, 2^5], then lb = [0.01, 2^-5]
% and ub = [1, 2^5].

% Objective function
fun = @getObjValue; 
% Number of parameters to optimize (c, g)
dim = 2;
% Lower and upper bounds of the parameters
lb = [10^-1, 1];
ub = [10^2, 2^8];

%% Optimization algorithm settings
pop = 6;        % population size
maxgen = 100;   % maximum number of iterations

%% Optimization (the optimizer is called here; the call itself is not shown in this excerpt)
c = Best_pos(1, 1);  % optimized penalty parameter c
g = Best_pos(1, 2);  % optimized RBF kernel parameter g
toc

% Train and test with the optimized c and g
cmd = ['-s 0 -t 2 ', '-c ', num2str(c), ' -g ', num2str(g), ' -q'];
model = libsvmtrain(T_train, P_train, cmd);
Copyright notice: this is an original article by the CSDN blogger "机器学习之心", published under the CC 4.0 BY-SA license; reposts must include the original source link and this notice.
Original link: https://blog.csdn.net/kjm13182345320/article/details/134843675
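The excerpt above optimizes the two LIBSVM parameters c (penalty) and g (RBF kernel width) and refers to an objective function getObjValue that is not shown. Below is a minimal sketch of what such a function could look like, assuming the training data are shared via global variables and using LIBSVM's built-in k-fold cross-validation (the -v option); the actual implementation shipped with the complete source code may differ.

function obj = getObjValue(parameter)
% Hypothetical sketch of the objective function referenced above (illustration only).
% parameter(1): penalty parameter c, parameter(2): RBF kernel parameter g.
global T_train P_train          % assumed to be shared with the main script

c = parameter(1);
g = parameter(2);

% With -v set, libsvmtrain returns the cross-validation accuracy directly
cmd = ['-s 0 -t 2 -v 5 ', '-c ', num2str(c), ' -g ', num2str(g), ' -q'];
accuracy = libsvmtrain(T_train, P_train, cmd);

% The optimizer minimizes the objective, so return the error rate
obj = 100 - accuracy;
end

After training the final model, test-set predictions are typically obtained with [T_sim, test_acc, ~] = libsvmpredict(T_test, P_test, model);.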


Origin blog.csdn.net/kjm13182345320/article/details/134984012