ML | LS & OLS: a detailed guide to the LS & OLS algorithm, covering its introduction, papers, improvements (best subset selection, forward stepwise regression), and code implementation

Copyright notice: this is an original article by the blogger and may not be reposted without permission. https://blog.csdn.net/qq_41185868/article/details/84791342


Introduction to the LS & OLS algorithm

OLS was proposed roughly 200 years ago (in 1806) by Gauss and the French mathematician Adrien-Marie Legendre.
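For reference, the standard textbook statement of the OLS fit (added here, not quoted from the post) is that the coefficient vector minimizes the residual sum of squares, with a closed-form solution whenever X^T X is invertible:

\hat{\beta} = \arg\min_{\beta} \lVert y - X\beta \rVert_2^2 = (X^{\top} X)^{-1} X^{\top} y

The two improvements described below keep this fitting step unchanged and instead decide which columns of X are passed to it.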

Papers on the LS & OLS algorithm

 

Improvements to the LS & OLS algorithm (best subset selection, forward stepwise regression)

1. Pseudocode implementation of best subset selection

Initialize: Out_of_sample_error = NULL
    Break X and Y into test and training sets
for i in range(number of columns in X):
    for each subset of X having i+1 columns:
        fit ordinary least squares model
    Out_of_sample_error.append(least error among subsets containing i+1 columns)
Pick the subset corresponding to least overall error
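
As a hedged illustration of the pseudocode above, a minimal runnable sketch might look like this (assuming X and y are NumPy arrays and scikit-learn is available; the function name best_subset, the 70/30 split, and mean squared error as the test metric are illustrative choices, not the original post's code):

from itertools import combinations

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

def best_subset(X, y):
    """Try every column subset of X and keep the one with the lowest out-of-sample error."""
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    n_cols = X.shape[1]
    out_of_sample_error = []                      # best test error for each subset size
    best_cols, best_err = None, np.inf
    for i in range(n_cols):                       # subsets of size i + 1
        size_err, size_cols = np.inf, None
        for cols in combinations(range(n_cols), i + 1):
            cols = list(cols)                     # NumPy column indexing wants a list
            model = LinearRegression().fit(X_train[:, cols], y_train)
            err = mean_squared_error(y_test, model.predict(X_test[:, cols]))
            if err < size_err:
                size_err, size_cols = err, cols
        out_of_sample_error.append(size_err)      # least error among subsets of this size
        if size_err < best_err:
            best_err, best_cols = size_err, size_cols
    return best_cols, out_of_sample_error

Note that the inner loop is exhaustive, so the cost grows as 2^p in the number of columns p; that is exactly what motivates the greedy forward stepwise variant in the next subsection.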

2. Pseudocode implementation of forward stepwise regression

Initialize: ColumnList = NULL
    Out-of-sample-error = NULL
    Break X and Y into test and training sets
For number of columns in X:
    For each trialColumn (column not in ColumnList):
        Build submatrix of X using ColumnList + trialColumn
        Train OLS on submatrix and store RSS Error on test data
    ColumnList.append(trialColumn that minimizes RSS Error)
    Out-of-sample-error.append(minimum RSS Error)
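
A corresponding sketch for forward stepwise regression, under the same assumptions (NumPy arrays, scikit-learn available; the name forward_stepwise is illustrative), using RSS on a held-out split as in the pseudocode:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

def forward_stepwise(X, y):
    """Greedily add, one at a time, the column that most reduces the test-set RSS."""
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    n_cols = X.shape[1]
    column_list, out_of_sample_error = [], []
    for _ in range(n_cols):
        best_err, best_col = np.inf, None
        for trial_col in range(n_cols):
            if trial_col in column_list:
                continue                          # only consider columns not yet selected
            cols = column_list + [trial_col]
            model = LinearRegression().fit(X_train[:, cols], y_train)
            resid = y_test - model.predict(X_test[:, cols])
            rss = float(np.sum(resid ** 2))       # residual sum of squares on the test split
            if rss < best_err:
                best_err, best_col = rss, trial_col
        column_list.append(best_col)              # keep the column that minimized RSS
        out_of_sample_error.append(best_err)
    return column_list, out_of_sample_error

Unlike best subset selection, this greedy search fits only O(p^2) models, at the price of possibly missing the truly best subset.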

Code implementation of the LS & OLS algorithm
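
A minimal, self-contained sketch of fitting OLS with NumPy's least-squares solver (the helpers ols_fit / ols_predict and the synthetic data are illustrative assumptions, not the original post's code):

import numpy as np

def ols_fit(X, y):
    """Return OLS coefficients (intercept first) by solving min ||Xb - y||^2."""
    X_aug = np.column_stack([np.ones(len(X)), X])     # prepend a column of ones for the intercept
    beta, *_ = np.linalg.lstsq(X_aug, y, rcond=None)  # numerically stabler than forming (X'X)^-1 X'y
    return beta

def ols_predict(X, beta):
    """Predict with previously fitted coefficients."""
    X_aug = np.column_stack([np.ones(len(X)), X])
    return X_aug @ beta

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 2.0 + X @ np.array([1.0, -0.5, 3.0]) + rng.normal(scale=0.1, size=100)
print(ols_fit(X, y))   # expect roughly [2.0, 1.0, -0.5, 3.0]

Either this direct solver or sklearn's LinearRegression can serve as the fitting step inside the two subset-selection procedures above.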
