Regression prediction | MATLAB implements ELM-AdaBoost (extreme learning machine combined with AdaBoost) multi-input single-output regression prediction


Prediction results

(Prediction result plots 1 and 2 omitted.)

Basic introduction

1. MATLAB implements ELM-AdaBoost (extreme learning machine combined with AdaBoost) multi-input single-output regression prediction;
2. The operating environment is MATLAB 2018b;
3. Input multiple features, output a single variable: multivariate regression prediction;
4. data is the dataset, an Excel file: the first 7 columns are the inputs and the last column is the output; main.m is the main program; run it with all files placed in one folder;
5. The command window outputs R2, MAE, MAPE, MBE, and other evaluation metrics.
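Assuming the Excel layout described above (first 7 columns as inputs, last column as the output), loading and normalizing the data might look like the following sketch. The file name `data.xlsx` and the 80/20 train/test split are assumptions, not taken from the source; `mapminmax` requires the Deep Learning Toolbox.

```matlab
% Sketch: load and normalize the dataset (file name and split ratio are assumptions)
res = xlsread('data.xlsx');           % first 7 columns: inputs, last column: output
num = size(res, 1);
M = round(0.8 * num);                 % number of training samples (assumed 80%)
N = num - M;                          % number of test samples

P_train = res(1:M, 1:7)';      T_train = res(1:M, 8)';
P_test  = res(M+1:end, 1:7)';  T_test  = res(M+1:end, 8)';

% Normalize inputs and outputs to [-1, 1]; keep ps_output to reverse later
[p_train, ps_input]  = mapminmax(P_train, -1, 1);
p_test               = mapminmax('apply', P_test, ps_input);
[t_train, ps_output] = mapminmax(T_train, -1, 1);
```

The `ps_output` structure saved here is what the evaluation code below passes to `mapminmax('reverse', ...)` to map predictions back to the original scale.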

Model description

ELM-AdaBoost multi-input single-output regression prediction is a forecasting method that combines machine learning with ensemble learning. The workflow is as follows:

1. Data preprocessing: clean, normalize, and split the raw data.
2. Feature extraction: use the ELM model to extract features from the data, producing multiple feature vectors as inputs to the AdaBoost algorithm.
3. AdaBoost model training: use the AdaBoost algorithm to weight and combine multiple weak predictors to obtain the final prediction.
4. Model evaluation: evaluate the predictions with metrics such as mean squared error (MSE) and mean absolute error (MAE).
5. Model optimization: tune the model based on the evaluation results, for example by adjusting the parameters of the ELM model or of the AdaBoost algorithm.
6. Forecasting application: apply the optimized model to real forecasting tasks.

The advantage of this method is that the ELM model extracts useful data features while the AdaBoost algorithm effectively combines multiple weak learners with weights, improving prediction accuracy. It also applies not only to prediction from a single data source but to ensemble prediction over multiple data sources. The drawback is that the method is demanding on data volume and computing resources: it requires a large amount of training data and substantial computing power.
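The training loop described above can be sketched as follows. This is an illustrative AdaBoost.R2-style implementation with a sigmoid ELM as the weak learner, not the exact source code; the function name, the hidden-layer size, and the error-normalization details are assumptions.

```matlab
function [models, alpha] = elm_adaboost_train(X, Y, T, nHidden)
% Sketch of ELM-AdaBoost regression training (AdaBoost.R2 style).
% X: nFeat x nSamp inputs, Y: 1 x nSamp targets,
% T: number of weak ELMs, nHidden: hidden neurons per ELM.
nSamp  = size(X, 2);
w      = ones(1, nSamp) / nSamp;          % sample weights, uniform at start
models = cell(1, T);
alpha  = zeros(1, T);
for t = 1:T
    % Train one ELM: random input weights, analytic output weights
    IW   = rand(nHidden, size(X, 1)) * 2 - 1;   % uniform in [-1, 1]
    b    = rand(nHidden, 1);
    H    = 1 ./ (1 + exp(-(IW * X + b)));       % sigmoid hidden layer
    beta = pinv(H') * Y';                       % least-squares output weights
    pred = (H' * beta)';

    % AdaBoost.R2-style relative error and model weight
    err    = abs(pred - Y);
    err    = err / (max(err) + eps);            % relative error in [0, 1]
    eps_t  = sum(w .* err);                     % weighted error
    if eps_t >= 0.5, break; end                 % weak learner too poor: stop
    beta_t   = max(eps_t, eps) / (1 - eps_t);
    alpha(t) = log(1 / beta_t);                 % confidence of this learner
    w = w .* beta_t .^ (1 - err);               % down-weight well-fitted samples
    w = w / sum(w);                             % renormalize
    models{t} = struct('IW', IW, 'b', b, 'beta', beta);
end
end
```

At prediction time, the outputs of the stored weak ELMs would be combined using the `alpha` weights (for example a weighted average, or the weighted median that AdaBoost.R2 prescribes).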

Programming

  • Complete source code and data acquisition: send a private message with "ELM-Adaboost extreme learning machine combined with AdaBoost multi-input single-output regression prediction".
%% Prediction
t_sim1 = predict(net, p_train); 
t_sim2 = predict(net, p_test ); 

%%  Reverse normalization
T_sim1 = mapminmax('reverse', t_sim1, ps_output);
T_sim2 = mapminmax('reverse', t_sim2, ps_output);

%%  Root mean square error (RMSE)
error1 = sqrt(sum((T_sim1' - T_train).^2) ./ M);
error2 = sqrt(sum((T_sim2' - T_test ).^2) ./ N);


%%  Other evaluation metrics
%  R2
R1 = 1 - norm(T_train - T_sim1')^2 / norm(T_train - mean(T_train))^2;
R2 = 1 - norm(T_test  - T_sim2')^2 / norm(T_test  - mean(T_test ))^2;

disp(['Training set R2: ', num2str(R1)])
disp(['Test set R2: ', num2str(R2)])

%  MAE
mae1 = sum(abs(T_sim1' - T_train)) ./ M ;
mae2 = sum(abs(T_sim2' - T_test )) ./ N ;

disp(['Training set MAE: ', num2str(mae1)])
disp(['Test set MAE: ', num2str(mae2)])

%% Mean absolute percentage error (MAPE)
MAPE1 = mean(abs((T_train - T_sim1')./T_train));
MAPE2 = mean(abs((T_test - T_sim2')./T_test));

disp(['Training set MAPE: ', num2str(MAPE1)])
disp(['Test set MAPE: ', num2str(MAPE2)])

%  MBE (mean bias error: signed, so no abs)
mbe1 = sum(T_sim1' - T_train) ./ M ;
mbe2 = sum(T_sim2' - T_test ) ./ N ;

disp(['Training set MBE: ', num2str(mbe1)])
disp(['Test set MBE: ', num2str(mbe2)])

%% Mean squared error (MSE)
mse1 = sum((T_sim1' - T_train).^2)./M;
mse2 = sum((T_sim2' - T_test).^2)./N;

disp(['Training set MSE: ', num2str(mse1)])
disp(['Test set MSE: ', num2str(mse2)])

