Regression prediction | MATLAB implementation of GA-RF: genetic-algorithm-optimized random forest for multi-input, single-output regression prediction (multiple metrics, multiple plots)

Overview of results

[Figures 1-3 from the original post are not reproduced in this excerpt.]

Basic introduction

MATLAB implementation of GA-RF: a genetic algorithm is used to optimize a random forest for multi-input, single-output regression prediction, reported with multiple evaluation metrics and multiple result plots. The model takes several input features and predicts a single output variable. The code is cleanly organized; the data are read from an Excel file and are easy to replace with your own (the hard-coded sample and column counts to adjust are sketched below). The code runs on MATLAB R2018 and later.
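The sample count and column indices are hard-coded in the script, so replacing data.xlsx with your own file also means updating a few lines. Below is a minimal sketch of the values to adjust, assuming a hypothetical replacement file with 120 samples, 9 feature columns, and the target in column 10 (these numbers are illustrative, not from the original post):

num_samples  = 120;                                  % rows in the replacement data.xlsx
num_features = 9;                                    % feature columns 1..num_features; target in the next column
res  = xlsread('data.xlsx');                         % same file name as in the script below
temp = randperm(num_samples);                        % replaces randperm(103)
n_tr = round(0.8 * num_samples);                     % keep roughly the original 80-sample training share
P_train = res(temp(1: n_tr), 1: num_features)';      % replaces columns 1:7
T_train = res(temp(1: n_tr), num_features + 1)';     % target column replaces column 8
P_test  = res(temp(n_tr + 1: end), 1: num_features)';
T_test  = res(temp(n_tr + 1: end), num_features + 1)';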

Programming

  • To obtain the complete source code and data, send the author a private message with the reply "GA-RF genetic algorithm optimization random forest algorithm multi-input single-output regression prediction (multi-indicator, multi-graph)".
%% Clear the environment
warning off             % suppress warning messages
close all               % close all open figure windows
clear                   % clear workspace variables
clc                     % clear the command window

%% Import data
res = xlsread('data.xlsx');     % 103 samples: columns 1-7 are features, column 8 is the target

%% Split into training and test sets
temp = randperm(103);                   % random permutation of the 103 sample indices

P_train = res(temp(1: 80), 1: 7)';      % training inputs  (7 features x 80 samples)
T_train = res(temp(1: 80), 8)';         % training targets (1 x 80)
M = size(P_train, 2);                   % number of training samples

P_test = res(temp(81: end), 1: 7)';     % test inputs  (7 features x 23 samples)
T_test = res(temp(81: end), 8)';        % test targets (1 x 23)
N = size(P_test, 2);                    % number of test samples

%% Normalize data to [0, 1]
[p_train, ps_input] = mapminmax(P_train, 0, 1);        % fit scaling on training inputs
p_test = mapminmax('apply', P_test, ps_input);         % apply the same scaling to test inputs

[t_train, ps_output] = mapminmax(T_train, 0, 1);       % fit scaling on training targets
t_test = mapminmax('apply', T_test, ps_output);        % apply the same scaling to test targets
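
%% GA-RF model training (not included in the posted excerpt)
% The public excerpt skips the genetic-algorithm optimization and random-forest
% training step (the full source is distributed by private message). The block
% below is only a minimal sketch of that step, assuming MATLAB's built-in ga
% (Global Optimization Toolbox) and TreeBagger (Statistics and Machine Learning
% Toolbox) are used to tune the number of trees and the minimum leaf size; the
% variable names and search ranges here are illustrative, not the author's.
fitness = @(x) oobError( ...                            % out-of-bag MSE as the GA fitness
    TreeBagger(round(x(1)), p_train', t_train', ...
               'Method', 'regression', ...
               'MinLeafSize', round(x(2)), ...
               'OOBPrediction', 'on'), ...
    'Mode', 'ensemble');

lb = [ 50,  1];                                         % lower bounds: [numTrees, minLeafSize]
ub = [500, 20];                                         % upper bounds
opts = optimoptions('ga', 'PopulationSize', 20, 'MaxGenerations', 30, 'Display', 'iter');
best = ga(fitness, 2, [], [], [], [], lb, ub, [], [1 2], opts);   % variables 1 and 2 are integers

% Train the final random forest with the GA-selected hyperparameters
net = TreeBagger(round(best(1)), p_train', t_train', ...
                 'Method', 'regression', 'MinLeafSize', round(best(2)));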



%% Model prediction
% 'net' comes from the (elided) GA-RF training step; sim() is the original call.
% With the TreeBagger sketch above, predict(net, p_train')' would be used instead.
t_sim1 = sim(net, p_train);
t_sim2 = sim(net, p_test);

%% Denormalize predictions back to the original scale
T_sim1 = mapminmax('reverse', t_sim1, ps_output);
T_sim2 = mapminmax('reverse', t_sim2, ps_output);

%% Root mean square error (RMSE)
error1 = sqrt(sum((T_sim1 - T_train).^2) ./ M);
error2 = sqrt(sum((T_sim2 - T_test ).^2) ./ N);
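% The excerpt stops at the RMSE computation; displaying the values below mirrors
% the disp pattern used for the other metrics (added here for completeness).
disp(['Training set RMSE: ', num2str(error1)])
disp(['Test set RMSE: ', num2str(error2)])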



%% Evaluation metrics
% Coefficient of determination (R^2)
R1 = 1 - norm(T_train - T_sim1)^2 / norm(T_train - mean(T_train))^2;
R2 = 1 - norm(T_test  - T_sim2)^2 / norm(T_test  - mean(T_test ))^2;

disp(['Training set R2: ', num2str(R1)])
disp(['Test set R2: ', num2str(R2)])

% Mean absolute error (MAE)
mae1 = sum(abs(T_sim1 - T_train)) ./ M ;
mae2 = sum(abs(T_sim2 - T_test )) ./ N ;

disp(['Training set MAE: ', num2str(mae1)])
disp(['Test set MAE: ', num2str(mae2)])

% Mean bias error (MBE)
mbe1 = sum(T_sim1 - T_train) ./ M ;
mbe2 = sum(T_sim2 - T_test ) ./ N ;

disp(['Training set MBE: ', num2str(mbe1)])
disp(['Test set MBE: ', num2str(mbe2)])
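
%% Result plots (sketch)
% The "multi-plot" figures referenced in the post title are not included in this
% excerpt. The block below is a minimal sketch of typical true-vs-predicted
% comparison plots, assuming only the variables defined above; it is not the
% author's original plotting code.
figure
plot(1: M, T_train, 'r-*', 1: M, T_sim1, 'b-o', 'LineWidth', 1)
legend('True value', 'Predicted value')
xlabel('Training sample index'), ylabel('Target value')
title(['Training set prediction (RMSE = ', num2str(error1), ')'])
grid on

figure
plot(1: N, T_test, 'r-*', 1: N, T_sim2, 'b-o', 'LineWidth', 1)
legend('True value', 'Predicted value')
xlabel('Test sample index'), ylabel('Target value')
title(['Test set prediction (RMSE = ', num2str(error2), ')'])
grid on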
