Regression Prediction | MATLAB implements WOA-ELM: whale optimization algorithm tuning an extreme learning machine for multi-input single-output regression prediction
Basic Introduction
1. MATLAB implements WOA-ELM, the whale optimization algorithm tuning an extreme learning machine for multi-input single-output regression prediction (complete source code and data);
2. Multi-input, single-output regression prediction;
3. Multi-indicator evaluation; indicators include R2, MAE, MSE, RMSE, etc.;
4. The parameters optimized by the whale algorithm are the ELM's connection weights and thresholds;
5. Excel data, easy to replace; runs on MATLAB 2018 and above.
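For context on point 1: the defining trait of an extreme learning machine is that the input weights and hidden-layer thresholds are drawn at random, and only the output weights are solved in closed form by least squares. A minimal Python/NumPy sketch of this idea (illustrative only — the article's actual code is MATLAB, and the names `elm_fit`/`elm_predict` are made up here):

```python
import numpy as np

def elm_fit(X, y, n_hidden=20, seed=0):
    """Train a basic ELM regressor: random input weights and hidden
    thresholds, output weights solved via the pseudoinverse."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, (X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1, 1, n_hidden)                # random hidden thresholds
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))          # sigmoid hidden layer output
    beta = np.linalg.pinv(H) @ y                    # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Predict with a trained ELM."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

In a WOA-ELM hybrid, the otherwise random `W` and `b` become the search space of the optimizer, which is exactly what point 4 above refers to.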
Programming
- Complete program and data, download method 1 (direct resource download): MATLAB implements WOA-ELM whale algorithm to optimize extreme learning machine multi-input single-output regression prediction
- Complete program and data, download method 2 (subscribe to the "ELM Extreme Learning Machine" column, which also gives access to all content in that column; private-message me for the data after subscribing): MATLAB implements WOA-ELM whale algorithm to optimize extreme learning machine multi-input single-output regression prediction
- Complete program and data, download method 3 (subscribe to the "Smart Learning" column, which also includes 3 of the programs in that column; private-message me for the data after subscribing): MATLAB implements WOA-ELM whale algorithm to optimize extreme learning machine multi-input single-output regression prediction
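The optimizer code below repeatedly calls a fitness function `fobj` on candidate parameter vectors. A sketch of what such a function might look like, as a Python/NumPy analogue (hypothetical — `make_fobj` is not from the source; it assumes the candidate vector concatenates the ELM input weights and hidden thresholds, with training RMSE as the fitness to minimize):

```python
import numpy as np

def make_fobj(X, y, n_hidden):
    """Build a fitness function: decode a candidate vector into ELM
    input weights W and hidden thresholds b, solve the output weights
    by least squares, and return the training RMSE (minimized)."""
    n_in = X.shape[1]
    def fobj(candidate):
        candidate = np.asarray(candidate, float)
        W = candidate[:n_in * n_hidden].reshape(n_in, n_hidden)
        b = candidate[n_in * n_hidden:]
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # sigmoid hidden layer
        beta = np.linalg.pinv(H) @ y            # closed-form output weights
        err = H @ beta - y
        return float(np.sqrt(np.mean(err ** 2)))
    return fobj
```

Under this encoding the search-space dimension `dim` used by the optimizer would be `n_in * n_hidden + n_hidden`.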
```matlab
%% Compute initial fitness values
fitness = zeros(1, pop);
for i = 1 : pop
    fitness(i) = fobj(pop_new(i, :));
end

%% Get the global best fitness value
[fitness, index] = sort(fitness);
GBestF = fitness(1);

%% Sort the population by fitness (pop_lsat is the population from the initialization step)
for i = 1 : pop
    pop_new(i, :) = pop_lsat(index(i), :);
end
GBestX = pop_new(1, :);
X_new  = pop_new;

%% Main optimization loop
for i = 1 : Max_iter
    BestF = fitness(1);
    R2 = rand(1);                      % random alarm value (not the R2 metric)
    % Update the first PDNumber (leading) individuals
    for j = 1 : PDNumber
        if (R2 < ST)
            X_new(j, :) = pop_new(j, :) .* exp(-j / (rand(1) * Max_iter));
        else
            X_new(j, :) = pop_new(j, :) + randn() * ones(1, dim);
        end
    end
    % Update the remaining (following) individuals
    for j = PDNumber + 1 : pop
        if (j > (pop - PDNumber) / 2 + PDNumber)
            X_new(j, :) = randn() .* exp((pop_new(end, :) - pop_new(j, :)) / j^2);
        else
            A = ones(1, dim);
            for a = 1 : dim
                if (rand() > 0.5)
                    A(a) = -1;
                end
            end
            AA = A' / (A * A');
            X_new(j, :) = pop_new(1, :) + abs(pop_new(j, :) - pop_new(1, :)) .* AA';
        end
    end
    % Randomly pick SDNumber individuals for a vigilance update
    Temp = randperm(pop);
    SDchooseIndex = Temp(1 : SDNumber);
    for j = 1 : SDNumber
        if (fitness(SDchooseIndex(j)) > BestF)
            X_new(SDchooseIndex(j), :) = pop_new(1, :) + randn() .* abs(pop_new(SDchooseIndex(j), :) - pop_new(1, :));
        elseif (fitness(SDchooseIndex(j)) == BestF)
            K = 2 * rand() - 1;
            X_new(SDchooseIndex(j), :) = pop_new(SDchooseIndex(j), :) + K .* (abs(pop_new(SDchooseIndex(j), :) - ...
                pop_new(end, :)) ./ (fitness(SDchooseIndex(j)) - fitness(end) + 10^-8));
        end
    end

    %% Boundary control: clamp each variable to [lb, ub]
    for j = 1 : pop
        for a = 1 : dim
            if (X_new(j, a) > ub(a))
                X_new(j, a) = ub(a);
            end
            if (X_new(j, a) < lb(a))
                X_new(j, a) = lb(a);
            end
        end
    end

    %% Evaluate the new fitness values
    fitness_new = zeros(1, pop);
    for j = 1 : pop
        fitness_new(j) = fobj(X_new(j, :));
    end

    %% Update the global best
    for j = 1 : pop
        if (fitness_new(j) < GBestF)
            GBestF = fitness_new(j);
            GBestX = X_new(j, :);
        end
    end

    %% Replace the population and fitness values
    pop_new = X_new;
    fitness = fitness_new;

    %% Re-sort the population by fitness
    [fitness, index] = sort(fitness);
    pop_new = pop_new(index, :);   % vectorized: avoids overwriting rows while they are still being read

    %% Record the convergence curves
    curve(i) = GBestF;
    avcurve(i) = sum(curve) / length(curve);
end

%% Return the best solution
Best_pos = GBestX;
Best_score = curve(end);
```
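The evaluation indicators listed in the introduction (R2, MAE, MSE, RMSE) have standard definitions; a small Python sketch of how they could be computed for a single-output prediction (illustrative — `regression_metrics` is not a function from the source code):

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute R2, MAE, MSE, and RMSE for single-output regression."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    err = y_true - y_pred
    mse = np.mean(err ** 2)
    return {
        "R2":   1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2),
        "MAE":  np.mean(np.abs(err)),      # mean absolute error
        "MSE":  mse,                       # mean squared error
        "RMSE": np.sqrt(mse),              # root mean squared error
    }
```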