Time series prediction | MATLAB implementation of SSA-ELM: sparrow search algorithm-optimized extreme learning machine for time series forecasting
Effects overview
Basic introduction
1. MATLAB implementation of SSA-ELM: the sparrow search algorithm (SSA) optimizes an extreme learning machine (ELM) for time series forecasting.
2. Univariate time series forecasting.
3. Runs on MATLAB 2018 and later. Run the main program main; the remaining files are functions and need not be run separately. Keep all programs in one folder; data is the data set.
4. The SSA optimizes the weights and biases of the extreme learning machine; the command window prints evaluation metrics such as RMSE, MAE, R2, and MAPE.
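Per point 4, SSA searches over the ELM's input weights and biases while the output weights are solved in closed form with a pseudoinverse. For context, here is a minimal NumPy sketch of the plain ELM being optimized (function names such as elm_fit are illustrative, not from the packaged program):

```python
import numpy as np

def elm_fit(X, y, n_hidden, rng):
    """Train a basic ELM: random hidden layer, least-squares output layer."""
    W = rng.uniform(-1, 1, (X.shape[1], n_hidden))  # input weights
    b = rng.uniform(-1, 1, n_hidden)                # hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))          # sigmoid hidden-layer output
    beta = np.linalg.pinv(H) @ y                    # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy data: a noisy linear target, just to exercise the fit.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.01 * rng.normal(size=200)
W, b, beta = elm_fit(X, y, n_hidden=50, rng=rng)
rmse = np.sqrt(np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```

An SSA wrapper would flatten (W, b) into one position vector per sparrow and use an RMSE like this as the fitness function fobj.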
Program design
- Complete program and data, download method 1 (direct resource download): MATLAB implements SSA-ELM sparrow algorithm to optimize extreme learning machine time series prediction
- Complete program and data, download method 2 (subscribe to the "ELM Extreme Learning Machine" column, which also gives you access to everything in that column; message me privately for the data after subscribing): MATLAB implements SSA-ELM sparrow algorithm to optimize extreme learning machine time series forecasting
- Complete program and data, download method 3 (subscribe to the "Smart Learning" column, which also gives you 5 of the programs in that column; message me privately for the data after subscribing): MATLAB implements SSA-ELM sparrow algorithm to optimize extreme learning machine time series prediction
function [Best_pos, Best_score, curve, avcurve] = SSA(pop, Max_iter, lb, ub, dim, fobj)
%% Parameter settings
ST = 0.8; % safety (alarm) threshold
PD = 0.2; % proportion of producers (discoverers); the rest are scroungers (joiners)
PDNumber = round(pop * PD); % number of producers (rounded to an integer for indexing)
SDNumber = pop - PDNumber; % number of sparrows aware of danger
%% Expand scalar bounds to per-dimension vectors
if(max(size(ub)) == 1)
ub = ub .* ones(1, dim);
lb = lb .* ones(1, dim);
end
%% Initialize the population
pop_lsat = initialization(pop, dim, ub, lb);
pop_new = pop_lsat;
%% Evaluate initial fitness values
fitness = zeros(1, pop);
for i = 1 : pop
fitness(i) = fobj(pop_new(i, :));
end
%% Global best fitness value
[fitness, index]= sort(fitness);
GBestF = fitness(1);
%% Sort the population by fitness
for i = 1 : pop
pop_new(i, :) = pop_lsat(index(i), :);
end
GBestX = pop_new(1, :);
X_new = pop_new;
curve = zeros(1, Max_iter); % best-so-far fitness per iteration
avcurve = zeros(1, Max_iter); % running mean of curve
fitness_new = zeros(1, pop); % preallocate fitness buffer
%% Main optimization loop
for i = 1: Max_iter
BestF = fitness(1);
R2 = rand(1); % alarm value for this iteration
% Producer update: exploit near the current position when safe, random walk otherwise
for j = 1 : PDNumber
if(R2 < ST)
X_new(j, :) = pop_new(j, :) .* exp(-j / (rand(1) * Max_iter));
else
X_new(j, :) = pop_new(j, :) + randn() * ones(1, dim);
end
end
% Scrounger update: the worse half forages elsewhere, the better half follows the best producer
for j = PDNumber + 1 : pop
if(j > (pop - PDNumber) / 2 + PDNumber)
X_new(j, :) = randn() .* exp((pop_new(end, :) - pop_new(j, :)) / j^2);
else
A = ones(1, dim);
for a = 1 : dim
if(rand() > 0.5)
A(a) = -1;
end
end
AA = A' / (A * A');
X_new(j, :) = pop_new(1, :) + abs(pop_new(j, :) - pop_new(1, :)) .* AA';
end
end
% Scout update: randomly chosen sparrows react to danger
Temp = randperm(pop);
SDchooseIndex = Temp(1 : SDNumber);
for j = 1 : SDNumber
if(fitness(SDchooseIndex(j)) > BestF)
X_new(SDchooseIndex(j), :) = pop_new(1, :) + randn() .* abs(pop_new(SDchooseIndex(j), :) - pop_new(1, :));
elseif(fitness(SDchooseIndex(j)) == BestF)
K = 2 * rand() -1;
X_new(SDchooseIndex(j), :) = pop_new(SDchooseIndex(j), :) + K .* (abs(pop_new(SDchooseIndex(j), :) - ...
pop_new(end, :)) ./ (fitness(SDchooseIndex(j)) - fitness(end) + 10^-8));
end
end
%% Clamp positions to the bounds
for j = 1 : pop
for a = 1 : dim
if(X_new(j, a) > ub(a))
X_new(j, a) = ub(a);
end
if(X_new(j, a) < lb(a))
X_new(j, a) = lb(a);
end
end
end
%% Evaluate new fitness values
for j = 1 : pop
fitness_new(j) = fobj(X_new(j, :));
end
%% Update the global best
for j = 1 : pop
if(fitness_new(j) < GBestF)
GBestF = fitness_new(j);
GBestX = X_new(j, :);
end
end
%% Update the population and fitness values
pop_new = X_new;
fitness = fitness_new;
%% Re-sort the population by fitness
[fitness, index] = sort(fitness);
pop_new = pop_new(index, :); % vectorized reorder; copying row-by-row in place would clobber rows
%% Record convergence curves
curve(i) = GBestF;
avcurve(i) = sum(curve) / length(curve);
end
%% Return the best solution found
Best_pos = GBestX;
Best_score = curve(end);
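The update rules in the loop above map almost line-for-line to NumPy. Below is an illustrative one-iteration port exercised on a toy sphere objective (the scout step is omitted for brevity; apart from ST and PD, all names and constants are mine, not the packaged program's):

```python
import numpy as np

def ssa_step(pop_s, lb, ub, max_iter, rng, ST=0.8, PD=0.2):
    """One sparrow-search iteration on a population sorted by ascending fitness."""
    pop, dim = pop_s.shape
    pd_n = round(pop * PD)  # number of producers
    new = pop_s.copy()
    worst = pop_s[-1]
    # Producers: shrink multiplicatively when safe, random walk when alarmed.
    if rng.random() < ST:
        for j in range(pd_n):
            new[j] = pop_s[j] * np.exp(-(j + 1) / (rng.random() * max_iter + 1e-12))
    else:
        new[:pd_n] = pop_s[:pd_n] + rng.standard_normal() * np.ones(dim)
    # Scroungers: the worse half flies elsewhere, the better half follows the best producer.
    for j in range(pd_n, pop):
        if j + 1 > (pop - pd_n) / 2 + pd_n:
            new[j] = rng.standard_normal() * np.exp((worst - pop_s[j]) / (j + 1) ** 2)
        else:
            A = rng.choice([-1.0, 1.0], size=dim)
            new[j] = pop_s[0] + np.abs(pop_s[j] - pop_s[0]) * (A / dim)
    return np.clip(new, lb, ub)  # boundary control

# Toy run: minimize the sphere function sum(x^2) over [-5, 5]^3.
rng = np.random.default_rng(1)
lb, ub, dim, pop, iters = -5.0, 5.0, 3, 20, 50
f = lambda x: float(np.sum(x ** 2))
P = rng.uniform(lb, ub, (pop, dim))
best = np.inf
for _ in range(iters):
    fit = np.array([f(x) for x in P])
    P = P[np.argsort(fit)]             # sort so row 0 is the current best
    best = min(best, float(np.min(fit)))
    P = ssa_step(P, lb, ub, iters, rng)
```

Note that the multiplicative producer update (pop_new .* exp(...) in the MATLAB version) pulls positions toward zero, so this toy problem, whose optimum sits at the origin, converges especially quickly.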