[LSSVM prediction] Bat algorithm improved least squares support vector machine (LSSVM) prediction [Matlab 010]

1. Introduction

Features of LSSVM
  1) Like SVM, it solves the dual of the original problem, but the equality constraints in the optimization objective reduce the dual to a set of linear equations instead of the QP problem in SVM, which simplifies the solution process; it applies equally to classification and regression tasks in high-dimensional input spaces;
  2) It is essentially the solution of a linear matrix equation, and is closely connected with kernel versions of Gaussian processes, regularization networks, and Fisher discriminant analysis;
  3) Sparse approximation (to overcome the drawback of this algorithm, the loss of sparsity) and robust regression (robust statistics) can be used;
  4) Bayesian inference can be applied;
  5) It can be extended to unsupervised learning: kernel principal component analysis (kernel PCA) or density clustering;
  6) It can be extended to recurrent neural networks.
LSSVM is used for classification tasks
  1) Optimization objective

$$\min_{w,b,e}\; J(w,e)=\frac{1}{2}w^Tw+\frac{\gamma}{2}\sum_{i=1}^{N}e_i^2 \qquad \text{s.t.}\quad y_i\left[w^T\varphi(x_i)+b\right]=1-e_i,\; i=1,\dots,N$$
  2) Lagrangian multiplier method

$$L(w,b,e;\alpha)=J(w,e)-\sum_{i=1}^{N}\alpha_i\left\{y_i\left[w^T\varphi(x_i)+b\right]-1+e_i\right\}$$
  where $\alpha_i$ is the Lagrange multiplier, which is also called the support value
  3) Solving the optimization conditions

$$\frac{\partial L}{\partial w}=0\Rightarrow w=\sum_{i=1}^{N}\alpha_iy_i\varphi(x_i),\qquad \frac{\partial L}{\partial b}=0\Rightarrow \sum_{i=1}^{N}\alpha_iy_i=0,$$

$$\frac{\partial L}{\partial e_i}=0\Rightarrow \alpha_i=\gamma e_i,\qquad \frac{\partial L}{\partial \alpha_i}=0\Rightarrow y_i\left[w^T\varphi(x_i)+b\right]-1+e_i=0$$
4) Solving the dual problem (as in SVM, $w$ and $e$ are eliminated rather than computed explicitly)

$$\begin{bmatrix}0 & y^T\\ y & \Omega+\gamma^{-1}I\end{bmatrix}\begin{bmatrix}b\\ \alpha\end{bmatrix}=\begin{bmatrix}0\\ 1_v\end{bmatrix},\qquad \Omega_{ij}=y_iy_jK(x_i,x_j)$$
  LSSVM obtains the values of the optimization variables $\alpha$ and $b$ by solving the above linear system, which is easier than solving a QP problem (a MATLAB sketch follows the list below).
  5) Differences from standard SVM
    a. Equality constraints are used instead of inequality constraints;
    b. Since an equality constraint is imposed on every sample point, no constraint is placed on the slack variables, which is also an important reason why LSSVM loses sparsity;
    c. Solving a least squares problem under equality constraints further simplifies the solution.
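
To make step 4) concrete, here is a minimal MATLAB sketch (not from the original post) that trains an LSSVM classifier by solving the dual linear system above with an RBF kernel; X, Y, gam, and sig2 are assumed inputs.

% Minimal LSSVM classification sketch: solve the dual linear system.
% Assumed inputs: X (N-by-d inputs), Y (N-by-1 labels in {-1,+1}),
% gam (regularization gamma), sig2 (RBF kernel width).
N = size(X,1);
K = exp(-pdist2(X,X).^2/(2*sig2));   % RBF kernel matrix
Omega = (Y*Y').*K;                   % Omega_ij = y_i*y_j*K(x_i,x_j)
sol = [0, Y'; Y, Omega+eye(N)/gam] \ [0; ones(N,1)];
b = sol(1);  alpha = sol(2:end);     % bias and support values
% Classify a new point x: sign( sum_i alpha_i*y_i*K(x,x_i) + b )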

LSSVM is used for regression tasks
  1) Problem description

$$\min_{w,b,e}\; J(w,e)=\frac{1}{2}w^Tw+\frac{\gamma}{2}\sum_{i=1}^{N}e_i^2 \qquad \text{s.t.}\quad y_i=w^T\varphi(x_i)+b+e_i,\; i=1,\dots,N$$
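
Eliminating $w$ and $e$ as in the classification case yields the same kind of linear system, now with $\Omega_{ij}=K(x_i,x_j)$ and right-hand side $[0;\,y]$, and the prediction $\hat{y}(x)=\sum_{i}\alpha_iK(x,x_i)+b$. A minimal MATLAB sketch (not from the original post), assuming training data X, y, test inputs Xt, and hyperparameters gam, sig2:

% Minimal LSSVM regression (function estimation) sketch.
N = size(X,1);
K = exp(-pdist2(X,X).^2/(2*sig2));   % Omega_ij = K(x_i,x_j)
sol = [0, ones(1,N); ones(N,1), K+eye(N)/gam] \ [0; y];
b = sol(1);  alpha = sol(2:end);
Kt = exp(-pdist2(Xt,X).^2/(2*sig2)); % kernel between test and training points
yhat = Kt*alpha + b;                 % predictions at Xt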
The disadvantages of LSSVM
  As noted when solving classification tasks, the optimality conditions give $\alpha_i=\gamma e_i$. Since the Lagrange multipliers corresponding to equality constraints satisfy $\alpha_i\neq 0$, all training samples are treated as support vectors, which makes LSSVM lose the sparsity of the original SVM. However, the training set can be pruned according to the support values to achieve sparseness; this step can also be seen as a sparse approximation operation (a sketch follows).
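
A minimal sketch of such support-value pruning (not from the original post), continuing from the regression sketch above (the same idea applies to classification); the pruning fraction frac is a hypothetical choice.

% Sparse approximation by pruning small support values (sketch).
frac = 0.1;                               % fraction of samples to drop (assumed)
[~, idx] = sort(abs(alpha), 'descend');   % rank samples by |alpha_i|
keep = idx(1:round((1-frac)*numel(alpha)));
X = X(keep,:);  y = y(keep);              % pruned training set
% ... retrain the LSSVM on (X, y) as above; repeat until sparse enough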

2. Source code

%=====================================================================
% Initialization
clc
close all
clear
format long
tic
%==============================================================
%% Load the data
data=xlsread('1.xlsx');
[row,col]=size(data);
x=data(:,1:col-1);
y=data(:,col);
set=1; % number of test samples
row1=row-set; % number of training samples
train_x=x(1:row1,:);
train_y=y(1:row1,:);
test_x=x(row1+1:row,:); % prediction inputs
test_y=y(row1+1:row,:); % prediction targets
train_x=train_x';
train_y=train_y';
test_x=test_x';
test_y=test_y';
 
%% Normalize the data
[train_x,minx,maxx, train_yy,miny,maxy] =premnmx(train_x,train_y);
test_x=tramnmx(test_x,minx,maxx);
train_x=train_x';
train_yy=train_yy';
train_y=train_y';
test_x=test_x';
test_y=test_y';
%% Parameter initialization
eps = 10^(-6);
%% Define the LSSVM parameters
type='f';               % 'f' = function estimation (regression)
kernel = 'RBF_kernel';
proprecess='preprocess';
lb=[0.01 0.02]; % lower bounds of the parameters c and g
ub=[1000 100];  % upper bounds of the parameters c and g
dim=2; % dimension, i.e. the number of parameters to optimize (c and g)
SearchAgents_no=20; % Number of search agents
Max_iter=100; % Maximum number of iterations
n=10;      % Population size, typically 10 to 25
A=0.25;      % Loudness  (constant or decreasing)
r=0.5;      % Pulse rate (constant or decreasing)
% This frequency range determines the scalings
Qmin=0;         % Frequency minimum
Qmax=2;         % Frequency maximum
% Iteration parameters
tol=10^(-10);    % Stop tolerance
Leader_pos=zeros(1,dim);
Leader_score=inf; %change this to -inf for maximization problems
%Initialize the positions of search agents
for i=1:SearchAgents_no
    Positions(i,1)=ceil(rand(1)*(ub(1)-lb(1))+lb(1));
    Positions(i,2)=ceil(rand(1)*(ub(2)-lb(2))+lb(2));
    Fitness(i)=Fun(Positions(i,:),train_x,train_yy,type,kernel,proprecess,miny,maxy,train_y,test_x,test_y);
    v(i,:)=rand(1,dim); % initial velocities
end
[fmin,I]=min(Fitness);
best=Positions(I,:);
Convergence_curve=zeros(1,Max_iter);
t=0;% Loop counter
% Start the iterations -- Bat Algorithm
% (the main loop is omitted in the original post; a sketch is given after this listing)
 
 
%% Results analysis
plot( Convergence_curve,'LineWidth',2);
title(['Bat algorithm fitness curve','  (c1=',num2str(Leader_pos(1)),', c2=',num2str(Leader_pos(2)),', generations=',num2str(Max_iter),')'],'FontSize',13);
xlabel('Generation');ylabel('Error fitness');
 
bestc = Leader_pos(1);
bestg = Leader_pos(2);
 
RD=RD' % relative error, computed in the omitted main loop
disp(['Bat algorithm optimized LSSVM prediction error = ',num2str(D)])
 
% figure
% plot(test_predict,':og')
% hold on
% plot(test_y,'- *')
% legend('Predicted output','Expected output')
% title('Network prediction output','fontsize',12)
% ylabel('Function output','fontsize',12)
% xlabel('Sample','fontsize',12)
figure
plot(train_predict,':og') % train_predict is computed in the omitted main loop
hold on
plot(train_y,'- *')
legend('Predicted output','Expected output')
title('Bat algorithm optimized LSSVM prediction output','fontsize',12)
ylabel('Function output','fontsize',12)
xlabel('Sample','fontsize',12)
 
toc   % elapsed time
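
For reference, here is a minimal sketch of the omitted main loop, following Xin-She Yang's standard bat algorithm. It reuses the variables initialized above (Positions, v, Fitness, best, fmin, Qmin, Qmax, A, r) and the fitness function Fun (not shown in the post, but expected to train the LSSVM with a candidate (c, g) pair and return its prediction error); the loop in the complete code may differ in detail.

% Sketch of the bat-algorithm main loop (standard form, assumed).
for t=1:Max_iter
    for i=1:SearchAgents_no
        Q=Qmin+(Qmax-Qmin)*rand;               % frequency
        v(i,:)=v(i,:)+(Positions(i,:)-best)*Q; % velocity update
        S=Positions(i,:)+v(i,:);               % candidate position
        if rand>r                              % local random walk around best
            S=best+0.01*randn(1,dim);
        end
        S=max(min(S,ub),lb);                   % keep inside the bounds
        Fnew=Fun(S,train_x,train_yy,type,kernel,proprecess,miny,maxy,train_y,test_x,test_y);
        if (Fnew<=Fitness(i)) && (rand<A)      % accept (loudness condition)
            Positions(i,:)=S;
            Fitness(i)=Fnew;
        end
        if Fnew<=fmin                          % update the global best
            best=S;
            fmin=Fnew;
        end
    end
    Convergence_curve(t)=fmin;
end
Leader_pos=best;  Leader_score=fmin;           % best (c, g) found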
 

3. Running results

(Figures from the original post: the fitness convergence curve and the prediction vs. expected output plot.)

4. Remarks

For the complete code or custom code-writing, add QQ 912100926. Past reviews:
>>>>>>
[Prediction model] Particle swarm optimized LSSVM prediction model [Matlab 005]
[lssvm prediction] Whale optimization algorithm LSSVM prediction [Matlab 006]
[SVM prediction] Bat algorithm SVM prediction model [Matlab 007]
[SVM prediction] Grey wolf algorithm optimized SVM prediction model [Matlab 008]
[Prediction model] Prediction model based on BP neural network [Matlab 009]

Origin blog.csdn.net/m0_54742769/article/details/112920191