Multidimensional time series | MATLAB implements SCNGO-BiGRU-Attention multivariate time series prediction

Prediction results


Basic introduction

Multidimensional time series | MATLAB implements SCNGO-BiGRU-Attention multivariate time series forecasting.

Model description

MATLAB implements SCNGO-BiGRU-Attention multivariate time series forecasting.
1. The version without attention runs on MATLAB 2020 and above; the version with fused attention requires MATLAB 2023 or above.
2. A 24-step multivariate time series regression prediction algorithm that combines the Northern Goshawk Optimizer improved with sine-cosine and refraction-based opposition learning (SCNGO), a bidirectional gated recurrent unit network (BiGRU), and an attention mechanism.
3. Multivariate feature input with a single output sequence: the previous day's features are used to forecast the next day, 24 steps ahead. The two key hyperparameters, the learning rate and the number of neurons, are tuned by the SCNGO algorithm with minimum MAPE as the objective function. The program provides polar plots of the loss and RMSE over iterations, a network feature visualization, a test-set comparison plot, and the fitness curve (if the first iteration already achieves the best accuracy, the fitness curve is a horizontal line), and reports evaluation metrics such as MAPE, RMSE, and MAE.
4. Northern Goshawk Optimization (NGO) was proposed by Mohammad Dehghani et al. in 2022; it simulates the goshawk's hunting process (prey identification and attack, then chase and escape). The improvement strategy is borrowed from the improved sparrow optimization algorithm:
① Refraction-based opposition learning is used to initialize the goshawk population: the opposite of each candidate solution is also computed, expanding the search range in order to find better starting points.
② A sine-cosine strategy replaces the position-update formula of the original goshawk algorithm's exploration (survey) phase.
③ The step-size search factor of the sine-cosine strategy is improved: the original factor decreases linearly, which does not balance the algorithm's global exploration and local exploitation capabilities well, so a nonlinear decay is used instead.
5. Applicable fields:
wind speed forecasting, photovoltaic power forecasting, power generation forecasting, carbon price forecasting, and similar applications.
6. Easy to use:
data are imported directly from an Excel sheet without major changes to the program, which contains detailed, easy-to-follow comments.
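The three SCNGO improvement points above can be sketched as follows. This is a minimal illustration, not the blogger's actual implementation: the bounds, population size, fitness handling, and the exact form of the nonlinear step-size factor are all assumptions made for the example.

```matlab
% Sketch of the three SCNGO improvements (illustrative values throughout).
dim = 2;                 % decision variables: learning rate, neuron count
lb  = [1e-4, 10];        % assumed lower bounds
ub  = [1e-1, 100];       % assumed upper bounds
N   = 20;                % population size
T   = 50;                % maximum iterations

% (1) Refraction-based opposition learning for initialization: alongside each
% random individual, its "opposite" point is formed; keeping the better of the
% two widens the initial search coverage.
X    = lb + rand(N, dim) .* (ub - lb);   % random population
Xopp = (lb + ub) - X;                    % opposite solutions
% keep = fitness(Xopp) < fitness(X);  X(keep,:) = Xopp(keep,:);

% (3) Nonlinear step-size factor replacing the original linear decay, to better
% balance global exploration and local exploitation.
t  = 10;                                 % current iteration (example)
r1 = 2 * (1 - (t / T)^2);                % assumed nonlinear decay

% (2) Sine-cosine position update replacing the exploration-phase formula.
Xbest = X(1, :);                         % current best individual (placeholder)
for i = 1:N
    r2 = 2*pi*rand; r3 = 2*rand; r4 = rand;
    if r4 < 0.5
        X(i,:) = X(i,:) + r1*sin(r2) .* abs(r3*Xbest - X(i,:));
    else
        X(i,:) = X(i,:) + r1*cos(r2) .* abs(r3*Xbest - X(i,:));
    end
    X(i,:) = min(max(X(i,:), lb), ub);   % clamp to the search bounds
end
```

In the full program, each candidate position would encode a (learning rate, neuron count) pair whose fitness is the MAPE of a trained BiGRU-Attention network.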

Programming

  • Complete program and data, acquisition method 1: exchange for a program of equal value;
  • Complete program and data, acquisition method 2: send the blogger a private message with "MATLAB implements SCNGO-BiGRU-Attention multivariate time series prediction".
 
        gruLayer(32,'OutputMode',"last",'Name','bil4','RecurrentWeightsInitializer','He','InputWeightsInitializer','He')
        dropoutLayer(0.25,'Name','drop2')
        % fully connected layer
        fullyConnectedLayer(numResponses,'Name','fc')
        regressionLayer('Name','output')    ];

    layers = layerGraph(layers);
    layers = connectLayers(layers,'fold/miniBatchSize','unfold/miniBatchSize');
%-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
%% Training options
if gpuDeviceCount>0
    mydevice = 'gpu';
else
    mydevice = 'cpu';
end
    options = trainingOptions('adam', ...
        'MaxEpochs',MaxEpochs, ...
        'MiniBatchSize',MiniBatchSize, ...
        'GradientThreshold',1, ...
        'InitialLearnRate',learningrate, ...
        'LearnRateSchedule','piecewise', ...
        'LearnRateDropPeriod',56, ...
        'LearnRateDropFactor',0.25, ...
        'L2Regularization',1e-3,...
        'GradientDecayFactor',0.95,...
        'Verbose',false, ...
        'Shuffle',"every-epoch",...
        'ExecutionEnvironment',mydevice,...
        'Plots','training-progress');
%% Model training
rng(0);
net = trainNetwork(XrTrain,YrTrain,layers,options);
%-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
%% Test data prediction
% predict on the test set
YPred = predict(net,XrTest,"ExecutionEnvironment",mydevice,"MiniBatchSize",numFeatures);
YPred = YPred';
% de-normalize the data
YPred = sig.*YPred + mu;
YTest = sig.*YTest + mu;
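The post mentions that MAPE, RMSE, and MAE are reported. A minimal sketch of computing them from the de-normalized predictions is shown below; `YTest` and `YPred` here are small placeholder vectors for illustration only, not real data, and the formulas are the standard definitions rather than the blogger's exact code.

```matlab
% Compute the reported error metrics (placeholder data, standard formulas).
YTest = [100 105  98 110];               % illustrative ground-truth values
YPred = [102 103  99 108];               % illustrative predictions
errs  = YPred - YTest;
MAE   = mean(abs(errs));                 % mean absolute error
RMSE  = sqrt(mean(errs.^2));             % root mean squared error
MAPE  = mean(abs(errs ./ YTest)) * 100;  % mean absolute percentage error
fprintf('MAE=%.4f  RMSE=%.4f  MAPE=%.2f%%\n', MAE, RMSE, MAPE);
```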
————————————————
Copyright notice: this is an original article by the CSDN blogger "机器学习之心", released under the CC 4.0 BY-SA license. Please include a link to the original source and this notice when reposting.


Origin blog.csdn.net/kjm13182345320/article/details/132418272