Bayesian-Optimized Kernel Extreme Learning Machine (KELM) for Regression Prediction

0. Foreword

        The kernel extreme learning machine (KELM) is favored for classification and regression prediction because of its strong learning ability and generalization performance. This post compares the standard KELM, the hybrid kernel extreme learning machine (HKELM), and a Bayesian-optimized KELM on a regression task.

1. Basic principles

 1.1 Principle of KELM    

        The principle of KELM was covered in detail in a previous post.

        KELM is a kernel method: after the original data is mapped into a high-dimensional kernel space, the dot product between samples can be computed directly through the kernel function. The similarity relationships among samples therefore depend directly on the choice of kernel. Commonly used kernels are the linear kernel (lin_kernel), the polynomial kernel (poly_kernel), the radial basis function kernel (RBF_kernel), and the wavelet kernel (wav_kernel); each choice of kernel comes with its own set of kernel parameters to determine.

       Once the kernel function is chosen, the kernel matrix is directly affected by the kernel parameter settings, so the most suitable parameters are usually found with an optimization algorithm.
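As a minimal sketch of the idea (variable names here are illustrative, not the post's actual code; it assumes n-by-d training inputs X, n-by-1 targets T, an RBF kernel parameter ker1, and a regularization coefficient lambda playing the role of C in the common ELM formulation β = (Ω + I/C)⁻¹T), KELM training reduces to a regularized linear solve on the kernel matrix:

```matlab
% Minimal KELM sketch (illustrative names, not the post's actual code)
% X: n-by-d training inputs, T: n-by-1 targets, Xt: m-by-d test inputs
K    = exp(-pdist2(X, X).^2 / ker1);          % RBF kernel matrix of the training set
beta = (K + eye(size(K,1)) / lambda) \ T;     % output weights: (Omega + I/C)^-1 * T
Kt   = exp(-pdist2(Xt, X).^2 / ker1);         % kernel values between test and training samples
Yp   = Kt * beta;                             % predictions for the test inputs
```

There is no iterative training here: once the kernel and its parameters are fixed, the model is obtained in a single linear solve, which is why parameter selection dominates the final accuracy.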

1.2 Principle of the hybrid kernel extreme learning machine (HKELM)

     HKELM, as the name implies, combines at least two kernel functions instead of a single one to enhance the generalization performance of the model. When building the kernel matrix, the two kernel functions are therefore evaluated separately and combined by a weighted sum.
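For example, with an RBF kernel and a polynomial kernel, the hybrid kernel matrix might be formed as follows (an illustrative sketch, not the post's actual code; ker1, ker2, and the weight w follow the parameter snippet in section 3.2):

```matlab
% Hybrid kernel sketch: weighted sum of an RBF kernel and a polynomial kernel
% w is the weight of the RBF kernel; the polynomial kernel gets 1 - w
K_rbf  = exp(-pdist2(X, X).^2 / ker1);        % RBF part, kernel parameter ker1
K_poly = (X * X' + ker2(1)).^ker2(2);         % polynomial part, offset ker2(1), degree ker2(2)
K      = w * K_rbf + (1 - w) * K_poly;        % hybrid kernel matrix
```

A weighted sum of valid kernels is itself a valid kernel (for 0 ≤ w ≤ 1), so the hybrid matrix can be plugged into the same KELM solve unchanged.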

3. Results

3.1 KELM regression prediction

Select the kernel function and set the kernel parameters:

%% Set the regularization coefficient and kernel parameters
kernel='RBF_kernel';   % kernel type: 1. RBF_kernel  2. lin_kernel  3. poly_kernel
ker1=1;                % kernel parameter of the RBF kernel
lambda=10;             % regularization coefficient

The results are as follows:

root mean square error rmse = 0.0115
mean absolute error mae = 0.0091

3.2 HKELM regression prediction

Select two kernel functions, and set their kernel parameters and the weight ratio between them:

%% Set the regularization coefficient and kernel parameters
kernel1='RBF_kernel';  % kernel type: 1. RBF_kernel  2. lin_kernel  3. poly_kernel
kernel2='poly_kernel'; % kernel type: 1. RBF_kernel  2. lin_kernel  3. poly_kernel
ker1=1;                % kernel parameter of the RBF kernel
ker2=[1 2];            % kernel parameters of the polynomial kernel
lambda=10;             % regularization coefficient
w=0.5;                 % weight of the RBF kernel in the hybrid kernel; the polynomial kernel gets 1-w

The results are as follows:

root mean square error rmse = 0.0110
mean absolute error mae = 0.0085

3.3 Bayesian-optimized KELM regression prediction

    Select the kernel function and define the search range of each kernel parameter to be optimized, as in the snippet below:

% Kernel parameter setup; see kernel_matrix for details
if strcmp(kernel1,'lin_kernel')
    % linear kernel: no kernel parameters to optimize
elseif strcmp(kernel1,'RBF_kernel')
    optimVars=[optimVars;
                optimizableVariable('rbf',[1e-3 1e3])]; % RBF kernel: one parameter, range [1e-3, 1e3]
elseif strcmp(kernel1,'poly_kernel')
    optimVars=[optimVars;
                optimizableVariable('poly1',[1e-3 1e3]);
                optimizableVariable('poly2',[1 10],'Type','integer')]; % polynomial kernel: two parameters; the second is an exponent, so it is restricted to integers
elseif strcmp(kernel1,'wav_kernel')
    optimVars=[optimVars;
                optimizableVariable('wav1',[1e-3 1e3]);
                optimizableVariable('wav2',[1e-3 1e3]);
                optimizableVariable('wav3',[1e-3 1e3])]; % wavelet kernel: three parameters
end
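The optimVars assembled above can then be passed to MATLAB's bayesopt. A hedged sketch of what that call might look like (the objective kelm_cv_rmse is hypothetical, not part of the post's code: it would train a KELM with the candidate kernel parameters and return a validation RMSE):

```matlab
% Hypothetical objective: trains a KELM with the candidate parameters
% and returns a validation RMSE to minimize (not part of the post's code)
objFcn = @(params) kelm_cv_rmse(params, Xtrain, Ytrain);
results = bayesopt(objFcn, optimVars, ...
    'MaxObjectiveEvaluations', 30, ...
    'AcquisitionFunctionName', 'expected-improvement-plus');
bestParams = results.XAtMinObjective;   % best kernel parameters found
```

bayesopt builds a Gaussian-process surrogate of the objective and chooses each new candidate via the acquisition function, so it typically needs far fewer evaluations than a grid search over the same ranges.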

The results are as follows:

root mean square error rmse = 0.0106
mean absolute error mae = 0.0082

4. Comparative analysis

 Comparison of performance metrics:

KELM
root mean square error (RMSE): 0.011518
mean absolute error (MAE): 0.0090813
mean absolute percentage error (MAPE): 1.8215%
coefficient of determination (R2): 0.95494

HKELM
root mean square error (RMSE): 0.010991
mean absolute error (MAE): 0.0085007
mean absolute percentage error (MAPE): 1.6877%
coefficient of determination (R2): 0.96041

BYS-KELM (Bayesian-optimized KELM)
root mean square error (RMSE): 0.01063
mean absolute error (MAE): 0.008195
mean absolute percentage error (MAPE): 1.6523%
coefficient of determination (R2): 0.96162

Both the hybrid kernel and Bayesian optimization improve on the baseline KELM across all four metrics, with the Bayesian-optimized model performing best.

Origin blog.csdn.net/m0_61363749/article/details/126159515