Classification prediction | MATLAB implements SCNGO-CNN-LSTM-Attention data classification prediction

Classification effect

[Figure: classification effect diagram]

Basic description

1. SCNGO-CNN-LSTM-Attention data classification and prediction program. The improved algorithm is a Northern Goshawk Optimization algorithm that incorporates sine-cosine and refraction-based opposition learning (SCNGO).
2. Program platform: the version without Attention runs on MATLAB R2020 and above; the version with the fused Attention mechanism requires MATLAB R2023 or above.
3. Data classification and prediction program based on the Northern Goshawk Optimization algorithm improved with sine-cosine and refraction-based opposition learning (SCNGO), a convolutional neural network (CNN), and a long short-term memory network (LSTM) fused with an attention mechanism.
Northern Goshawk Optimization (NGO) was proposed by Mohammad Dehghani et al. in 2022. The algorithm simulates the hunting process of the northern goshawk (prey identification and attack, then chase and escape).
The improvement strategy follows the sparrow optimization algorithm; the improvements are as follows:
① Use a refraction-based opposition learning strategy to initialize the individuals of the Northern Goshawk algorithm. The basic idea is to enlarge the search range by computing the opposite solution of the current solution, so as to find a better one.
② Use a sine-cosine strategy to replace the position-update formula of the original goshawk algorithm in the exploration (prey identification) stage.
③ Improve the step-size search factor of the sine-cosine strategy: the original factor decreases linearly, which does not help balance the global exploration and local exploitation capabilities of the Northern Goshawk algorithm, so a nonlinearly decreasing factor is used instead.
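The three improvement strategies above can be sketched in MATLAB as follows. This is a minimal illustration, not the author's actual program: the refraction factor `k`, the nonlinear form of the step-size factor, and all variable names are assumptions based on common formulations in the literature.

```matlab
% ① Refraction-based opposition learning initialization:
% for each random position x in [lb, ub], also form the refracted opposite
% candidate x' = (lb+ub)/2 + (lb+ub)/(2*k) - x/k and keep the fitter of the two.
pop = 30; dim = 5; lb = -10; ub = 10; k = 1;   % k: assumed refraction factor
X  = lb + (ub - lb) .* rand(pop, dim);          % random initial population
Xo = (lb + ub)/2 + (lb + ub)/(2*k) - X./k;      % refracted opposite solutions
% keep whichever of each row of X / Xo gives the better objective value

% ② + ③ Sine-cosine position update with a nonlinear step-size factor:
t = 10; Max_iter = 100; a = 2;
r1 = a * (1 - (t/Max_iter)^2);   % nonlinear decrease (vs. linear a - a*t/Max_iter)
r2 = 2*pi*rand; r3 = 2*rand; r4 = rand;
i = 1; Best_pos = zeros(1, dim);                % current best individual
if r4 < 0.5
    X(i,:) = X(i,:) + r1*sin(r2) .* abs(r3*Best_pos - X(i,:));
else
    X(i,:) = X(i,:) + r1*cos(r2) .* abs(r3*Best_pos - X(i,:));
end
```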
The program language is MATLAB. The program produces classification effect diagrams, iterative optimization curves, and confusion matrix plots, and reports evaluation metrics such as accuracy, recall, precision, and F1 score.
4. The SCNGO algorithm optimizes the learning rate, convolution kernel size, and number of neurons; these three key parameters are tuned with the highest test-set accuracy as the objective function. The program draws polar plots of the loss and accuracy over iterations, a scatter plot, a confusion matrix, and a fitness curve, and displays evaluation metrics such as accuracy, recall, precision, and F1 score.
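The objective function for the three tuned parameters can be sketched as follows. This is a hypothetical outline, not the author's code: `trainModel` is an assumed helper that builds and trains the CNN-LSTM-Attention network, and the particle layout `x = [learning rate, kernel size, hidden units]` is an assumption.

```matlab
% Decode a candidate solution, train the model, and return 1 - test accuracy,
% so that maximizing test-set accuracy becomes a minimization problem for SCNGO.
function cost = fitnessFcn(x, XTrain, YTrain, XTest, YTest)
    lr         = x(1);          % learning rate
    kernelSize = round(x(2));   % convolution kernel size (integer)
    numHidden  = round(x(3));   % number of LSTM neurons (integer)
    net   = trainModel(XTrain, YTrain, lr, kernelSize, numHidden); % assumed helper
    YPred = classify(net, XTest);        % Deep Learning Toolbox
    acc   = mean(YPred == YTest);        % test-set accuracy
    cost  = 1 - acc;                     % value the optimizer minimizes
end
```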
5. Fields of application:
Suitable for various data classification scenarios, such as identification, diagnosis, and classification of rolling-bearing faults, transformer oil-and-gas faults, power-system transmission-line fault regions, insulators, distribution networks, power-quality disturbances, and other fields.
Easy to use:
Data are imported directly from an EXCEL spreadsheet, without major changes to the program. The code contains detailed comments and is easy to understand.
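Importing the data from Excel can be sketched as follows. The file name `data.xlsx` and the column layout (features first, class label last) are assumptions for illustration.

```matlab
% Read the spreadsheet into a table, split off features and labels,
% then make a random 80/20 train/test split.
T = readtable('data.xlsx');          % hypothetical file name
X = T{:, 1:end-1};                   % feature matrix (numeric columns)
Y = categorical(T{:, end});          % class labels

n   = size(X, 1);
idx = randperm(n);                   % shuffle the rows
nTr = round(0.8 * n);
XTrain = X(idx(1:nTr), :);     YTrain = Y(idx(1:nTr));
XTest  = X(idx(nTr+1:end), :); YTest  = Y(idx(nTr+1:end));
```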

Programming

  • Complete program and data acquisition: send the blogger a private message with the reply "MATLAB implements SCNGO-CNN-LSTM-Attention data classification prediction";
% The Whale Optimization Algorithm
function [Best_Cost,Best_pos,curve]=WOA(pop,Max_iter,lb,ub,dim,fobj)

% initialize position vector and score for the leader
Best_pos=zeros(1,dim);
Best_Cost=inf; %change this to -inf for maximization problems


%Initialize the positions of search agents
Positions=initialization(pop,dim,ub,lb);

curve=zeros(1,Max_iter);

t=0;% Loop counter

% Main loop
while t<Max_iter
    for i=1:size(Positions,1)
        
        % Return back the search agents that go beyond the boundaries of the search space
        Flag4ub=Positions(i,:)>ub;
        Flag4lb=Positions(i,:)<lb;
        Positions(i,:)=(Positions(i,:).*(~(Flag4ub+Flag4lb)))+ub.*Flag4ub+lb.*Flag4lb;
        
        % Calculate objective function for each search agent
        fitness=fobj(Positions(i,:));
        
        % Update the leader
        if fitness<Best_Cost % Change this to > for maximization problem
            Best_Cost=fitness; % Update alpha
            Best_pos=Positions(i,:);
        end
        
    end
    
    a=2-t*((2)/Max_iter); % a decreases linearly from 2 to 0 in Eq. (2.3)
    
    % a2 linearly decreases from -1 to -2 to calculate t in Eq. (3.12)
    a2=-1+t*((-1)/Max_iter);
    
    % Update the Position of search agents 
    for i=1:size(Positions,1)
        r1=rand(); % r1 is a random number in [0,1]
        r2=rand(); % r2 is a random number in [0,1]
        
        A=2*a*r1-a;  % Eq. (2.3) in the paper
        C=2*r2;      % Eq. (2.4) in the paper
        
        
        b=1;               %  parameters in Eq. (2.5)
        l=(a2-1)*rand+1;   %  parameters in Eq. (2.5)
        
        p = rand();        % p in Eq. (2.6)
        
        for j=1:size(Positions,2)
            
            if p<0.5   
                if abs(A)>=1
                    rand_leader_index = floor(pop*rand()+1);
                    X_rand = Positions(rand_leader_index, :);
                    D_X_rand=abs(C*X_rand(j)-Positions(i,j)); % Eq. (2.7)
                    Positions(i,j)=X_rand(j)-A*D_X_rand;      % Eq. (2.8)
                    
                elseif abs(A)<1
                    D_Leader=abs(C*Best_pos(j)-Positions(i,j)); % Eq. (2.1)
                    Positions(i,j)=Best_pos(j)-A*D_Leader;      % Eq. (2.2)
                end
                
            elseif p>=0.5
              
                distance2Leader=abs(Best_pos(j)-Positions(i,j));
                % Eq. (2.5)
                Positions(i,j)=distance2Leader*exp(b.*l).*cos(l.*2*pi)+Best_pos(j);
                
            end
            
        end
    end
    t=t+1;
    curve(t)=Best_Cost;
    disp([t Best_Cost]) % display current iteration and best cost
end
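A usage sketch for the `WOA` function above (the sphere test function and all parameter values are illustrative; the `initialization` helper, which fills a `pop`-by-`dim` matrix with random positions in `[lb, ub]`, is assumed to be on the path):

```matlab
% Minimize the sphere function with 30 agents over 500 iterations in [-100, 100]^10.
pop      = 30;
Max_iter = 500;
dim      = 10;
lb = -100; ub = 100;
fobj = @(x) sum(x.^2);               % sphere test function

[Best_Cost, Best_pos, curve] = WOA(pop, Max_iter, lb, ub, dim, fobj);

figure; semilogy(curve);             % convergence on a log scale
xlabel('Iteration'); ylabel('Best cost');
title('WOA convergence curve');
```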


Origin blog.csdn.net/kjm13182345320/article/details/132418555