[LSSVM prediction model] LSSVM prediction based on a bat-algorithm-improved least squares support vector machine [Matlab issue 109] [Prediction model 7]

Features of LSSVM
  1) Like the SVM, it solves the dual of the original problem, but the equality constraints in the optimization objective reduce it to a set of linear equations instead of the SVM's QP problem, which simplifies training; it applies to both classification and regression tasks in high-dimensional input spaces;
  2) It essentially amounts to solving a linear matrix equation, and is closely connected to kernel versions of Gaussian processes, regularization networks and Fisher discriminant analysis;
  3) Sparse approximation (to overcome the algorithm's loss of sparsity) and robust regression (robust statistics) can be applied;
  4) Bayesian inference can be used;
  5) It can be extended to unsupervised learning: kernel principal component analysis (kernel PCA) or density clustering;
  6) It can be extended to recurrent neural networks.

LSSVM is used for classification tasks
  1) Optimization objective

$$\min_{w,b,e}\; J(w,e)=\frac{1}{2}w^{T}w+\frac{\gamma}{2}\sum_{i=1}^{N}e_i^{2}
\quad\text{s.t.}\quad y_i\,[w^{T}\varphi(x_i)+b]=1-e_i,\qquad i=1,\dots,N$$
  2) Lagrange multiplier method

$$L(w,b,e;\alpha)=J(w,e)-\sum_{i=1}^{N}\alpha_i\left\{y_i\,[w^{T}\varphi(x_i)+b]-1+e_i\right\}$$

where $\alpha_i$ is the Lagrange multiplier, which is also called the support value.
  3) Optimality conditions

$$\frac{\partial L}{\partial w}=0 \Rightarrow w=\sum_{i}\alpha_i y_i\varphi(x_i),\qquad
\frac{\partial L}{\partial b}=0 \Rightarrow \sum_{i}\alpha_i y_i=0,$$
$$\frac{\partial L}{\partial e_i}=0 \Rightarrow \alpha_i=\gamma e_i,\qquad
\frac{\partial L}{\partial \alpha_i}=0 \Rightarrow y_i\,[w^{T}\varphi(x_i)+b]-1+e_i=0.$$
4) Solve the dual problem (as in the SVM, $w$ and $e$ are eliminated)

$$\begin{bmatrix}0 & y^{T}\\ y & \Omega+\gamma^{-1}I\end{bmatrix}
\begin{bmatrix}b\\ \alpha\end{bmatrix}=
\begin{bmatrix}0\\ 1_N\end{bmatrix},\qquad
\Omega_{ij}=y_i y_j K(x_i,x_j)$$

LSSVM solves this set of linear equations for the optimization variables $\alpha$ and $b$, which is easier than solving the QP problem.
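To make the "linear system instead of QP" point concrete, here is a minimal MATLAB sketch (not from the original post) that builds and solves the classification system above for a toy two-class dataset; the RBF kernel, the data, and the hyperparameter values gam and sig2 are all assumptions for illustration.

% Minimal sketch (assumed): LSSVM classification via one linear solve
N = 20;
X = [randn(N/2,2)-1.5; randn(N/2,2)+1.5];    % toy two-class data
y = [-ones(N/2,1); ones(N/2,1)];
gam = 10; sig2 = 1;                          % assumed hyperparameters
D2 = sum(X.^2,2) + sum(X.^2,2)' - 2*(X*X');  % squared distances (R2016b+ expansion)
Omega = (y*y').*exp(-D2/(2*sig2));           % Omega_ij = y_i y_j K(x_i,x_j)
Asys = [0, y'; y, Omega + eye(N)/gam];       % the KKT linear system
sol = Asys \ [0; ones(N,1)];
b = sol(1); alpha = sol(2:end);              % support values
xnew = [1 1];                                % classify a new point
Kx = exp(-(sum(xnew.^2) + sum(X.^2,2)' - 2*xnew*X')/(2*sig2));
label = sign(Kx*(alpha.*y) + b)

One backslash solve replaces the iterative QP of the standard SVM, which is the simplification claimed above.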

5) Difference from standard SVM

  a. It uses equality constraints instead of inequality constraints;
  b. Because an equality constraint is imposed on every sample point, no constraint is placed on the slack variables, which is also an important reason why LSSVM loses sparsity;
  c. The equality constraints and the least-squares loss together reduce the optimization to a linear system, simplifying the problem further.

LSSVM is used for regression tasks
  1) Problem description

$$\min_{w,b,e}\; J(w,e)=\frac{1}{2}w^{T}w+\frac{\gamma}{2}\sum_{i=1}^{N}e_i^{2}
\quad\text{s.t.}\quad y_i=w^{T}\varphi(x_i)+b+e_i,\qquad i=1,\dots,N$$

Eliminating $w$ and $e$ again yields a linear system

$$\begin{bmatrix}0 & 1_N^{T}\\ 1_N & \Omega+\gamma^{-1}I\end{bmatrix}
\begin{bmatrix}b\\ \alpha\end{bmatrix}=
\begin{bmatrix}0\\ y\end{bmatrix},\qquad
\Omega_{ij}=K(x_i,x_j),$$

and the resulting model is $f(x)=\sum_{i}\alpha_i K(x,x_i)+b$.
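As a hedged illustration (not from the original post), the following MATLAB sketch fits a noisy sine curve by solving the regression system above; the data and the hyperparameters gam and sig2 are assumed.

% Minimal sketch (assumed): LSSVM regression on a noisy sine curve
N = 40;
x = linspace(0,2*pi,N)';
y = sin(x) + 0.1*randn(N,1);
gam = 50; sig2 = 0.5;                        % assumed hyperparameters
D2 = x.^2 + (x.^2)' - 2*(x*x');              % squared distances
K = exp(-D2/(2*sig2));                       % RBF kernel matrix
Asys = [0, ones(1,N); ones(N,1), K + eye(N)/gam];
sol = Asys \ [0; y];
b = sol(1); alpha = sol(2:end);
xs = linspace(0,2*pi,200)';                  % evaluation grid
Ks = exp(-(xs.^2 + (x.^2)' - 2*xs*x')/(2*sig2));
plot(x,y,'o',xs,Ks*alpha + b,'-')            % f(x) = sum_i alpha_i K(x,x_i) + b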
The drawbacks of LSSVM
  Notice that when solving the classification task, the optimality conditions give $\alpha_i=\gamma e_i$. Because the Lagrange multipliers attached to equality constraints satisfy $\alpha_i\neq 0$, every training sample is treated as a support vector, so LSSVM loses the sparsity of the original SVM. However, the training set can be pruned according to the support values to restore sparsity (see the sketch below); this step can also be viewed as a sparse-approximation operation.
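A hedged sketch of that pruning idea (the data, the keep ratio, and all details below are assumptions for illustration): drop the points with the smallest $|\alpha_i|$ and retrain on the remainder.

% Sketch (assumed): pruning an LSSVM to restore sparsity
N = 50;
x = linspace(-3,3,N)';
y = sin(2*x) + 0.1*randn(N,1);
gam = 100; sig2 = 0.3;
K = exp(-(x.^2 + (x.^2)' - 2*(x*x'))/(2*sig2));
sol = [0 ones(1,N); ones(N,1) K+eye(N)/gam] \ [0; y];
b = sol(1); alpha = sol(2:end);              % dense: all alpha_i are nonzero
[~,idx] = sort(abs(alpha),'descend');        % rank samples by support value
keep = sort(idx(1:ceil(0.8*N)));             % keep the 80% largest (assumed ratio)
M = numel(keep); Kk = K(keep,keep);
solk = [0 ones(1,M); ones(M,1) Kk+eye(M)/gam] \ [0; y(keep)];
bk = solk(1); alphak = solk(2:end);          % sparser model on the pruned set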

%=====================================================================
% Initialisation
clc
close all
clear
format long
tic
%==============================================================
%% Import data
data=xlsread('1.xlsx');
[row,col]=size(data);
x=data(:,1:col-1);
y=data(:,col);
set=1;                    % number of test samples
row1=row-set;             % number of training samples
train_x=x(1:row1,:);
train_y=y(1:row1,:);
test_x=x(row1+1:row,:);   % prediction inputs
test_y=y(row1+1:row,:);   % prediction targets
train_x=train_x';
train_y=train_y';
test_x=test_x';
test_y=test_y';
 
%% Normalise the data to [-1,1]
[train_x,minx,maxx,train_yy,miny,maxy]=premnmx(train_x,train_y);
test_x=tramnmx(test_x,minx,maxx);
train_x=train_x';
train_yy=train_yy';
train_y=train_y';
test_x=test_x';
test_y=test_y';
%% Parameter initialisation
eps = 10^(-6);
%% LSSVM parameters
type='f';                  % 'f' = function estimation (regression)
kernel = 'RBF_kernel';
proprecess='preprocess';   % LS-SVMlab preprocessing flag
lb=[0.01 0.02];  % lower bounds of the parameters c and g
ub=[1000 100];   % upper bounds of the parameters c and g
dim=2;           % dimension: two optimised parameters (c and g)
SearchAgents_no=20; % Number of search agents (bats)
Max_iter=100;    % Maximum number of iterations
n=10;            % Population size, typically 10 to 25 (unused; SearchAgents_no is used below)
A=0.25;          % Loudness  (constant or decreasing)
r=0.5;           % Pulse rate (constant or decreasing)
% This frequency range determines the scalings
Qmin=0;          % Frequency minimum
Qmax=2;          % Frequency maximum
% Iteration parameters
tol=10^(-10);    % Stop tolerance
Leader_pos=zeros(1,dim);
Leader_score=inf; % change this to -inf for maximization problems
%Initialize the positions of search agents
for i=1:SearchAgents_no
    % random initial position within [lb,ub]
    Positions(i,1)=ceil(rand(1)*(ub(1)-lb(1))+lb(1));
    Positions(i,2)=ceil(rand(1)*(ub(2)-lb(2))+lb(2));
    Fitness(i)=Fun(Positions(i,:),train_x,train_yy,type,kernel,proprecess,miny,maxy,train_y,test_x,test_y);
    v(i,:)=rand(1,dim);   % initial velocities
end
[fmin,I]=min(Fitness);
best=Positions(I,:);
Convergence_curve=zeros(1,Max_iter);
t=0;% Loop counter
% Start the iterations -- Bat Algorithm
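% ------------------------------------------------------------------
% Main loop (sketch): the loop body is not shown in the original post;
% this is the standard bat algorithm (Yang, 2010), assuming Fun returns
% the prediction error to be minimised.
for t=1:Max_iter
    for i=1:SearchAgents_no
        Q=Qmin+(Qmax-Qmin)*rand;                 % random frequency
        v(i,:)=v(i,:)+(Positions(i,:)-best)*Q;   % velocity update
        S=Positions(i,:)+v(i,:);                 % candidate position
        if rand>r                                % local random walk around best
            S=best+0.001*randn(1,dim);
        end
        S=max(S,lb); S=min(S,ub);                % clamp to [lb,ub]
        Fnew=Fun(S,train_x,train_yy,type,kernel,proprecess,miny,maxy,train_y,test_x,test_y);
        if (Fnew<=Fitness(i)) && (rand<A)        % accept with loudness A
            Positions(i,:)=S;
            Fitness(i)=Fnew;
        end
        if Fnew<=fmin                            % update global best
            best=S; fmin=Fnew;
        end
    end
    Convergence_curve(t)=fmin;
end
Leader_pos=best;     % optimised parameters (c,g)
Leader_score=fmin;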
 
 
%% Results analysis
plot(Convergence_curve,'LineWidth',2);
title(['Bat algorithm fitness curve','  (c=',num2str(Leader_pos(1)),', g=',num2str(Leader_pos(2)),', max iterations=',num2str(Max_iter),')'],'FontSize',13);
xlabel('Iteration');ylabel('Error fitness');
 
bestc = Leader_pos(1);
bestg = Leader_pos(2);
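% ------------------------------------------------------------------
% Final model: this part is omitted in the original post. The sketch
% below assumes the LS-SVMlab toolbox (trainlssvm/simlssvm) and an
% RMSE error metric.
[alpha,b] = trainlssvm({train_x,train_yy,type,bestc,bestg,kernel,proprecess});
train_predict = simlssvm({train_x,train_yy,type,bestc,bestg,kernel,proprecess},{alpha,b},train_x);
test_predict  = simlssvm({train_x,train_yy,type,bestc,bestg,kernel,proprecess},{alpha,b},test_x);
% undo the premnmx normalisation of the targets
train_predict = postmnmx(train_predict',miny,maxy)';
test_predict  = postmnmx(test_predict',miny,maxy)';
D = sqrt(mean((test_predict-test_y).^2));   % test error (assumed RMSE)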
 
 
disp(['Prediction error of the bat-algorithm-optimised LSSVM = ',num2str(D)])
 
% figure
% plot(test_predict,':og')
% hold on
% plot(test_y,'- *')
% legend('Predicted output','Expected output')
% title('Network prediction output','fontsize',12)
% ylabel('Function output','fontsize',12)
% xlabel('Sample','fontsize',12)
figure
plot(train_predict,':og')
hold on
plot(train_y,'- *')
legend('Predicted output','Expected output')
title('Bat-algorithm-optimised LSSVM prediction output','fontsize',12)
ylabel('Function output','fontsize',12)
xlabel('Sample','fontsize',12)
 
toc   % elapsed time
 

Note: for the complete code or custom development, add QQ 2449341593.

Previous issues >>>>>>
[LSSVM prediction] LSSVM data prediction based on the whale optimization algorithm, Matlab source code [Matlab issue 104] [Prediction model 2]
[LSTM prediction] Improved LSTM prediction based on the whale optimization algorithm, Matlab source code [Matlab issue 105] [Prediction model 3]
[SVM prediction] Improved SVM prediction based on the bat algorithm [Matlab issue 106] [Prediction model 4]
[SVM prediction] SVM support vector machine prediction optimised by the grey wolf algorithm, Matlab source code [Matlab issue 107] [Prediction model 5]
BP neural network prediction [Matlab issue 108] [Prediction model 6]

Origin blog.csdn.net/TIQCmatlab/article/details/112908010