【Linear Regression Algorithm】(4) — Building Our Own Multivariate Linear Regression Model

In this post we will build a multivariate linear regression model from scratch.


1. First, write a metrics module in the playML folder for computing R Squared:

import numpy as np

def mean_squared_error(y_true, y_predict):
    """Compute the MSE between y_true and y_predict"""
    return np.sum((y_true - y_predict) ** 2) / len(y_true)

def r2_score(y_true, y_predict):
    """Compute the R Squared between y_true and y_predict"""
    return 1 - mean_squared_error(y_true, y_predict) / np.var(y_true)
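
A quick sanity check of r2_score (a toy example of my own, assuming the module above is saved as playML/metrics.py):

import numpy as np
from playML.metrics import r2_score

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_predict = np.array([2.5, 0.0, 2.0, 8.0])

# a perfect prediction gives 1.0; this one comes out to roughly 0.949
print(r2_score(y_true, y_predict))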

2. Next, add a model_selection module (playML/model_selection.py) for splitting the dataset:

import numpy as np

def train_test_split(X, y, test_ratio=0.2, seed=None):
    """Split X and y into X_train, X_test, y_train, y_test according to test_ratio"""
    assert X.shape[0] == y.shape[0], \
        "the size of X must be equal to the size of y"
    assert 0.0 <= test_ratio <= 1.0, \
        "test_ratio must be a valid ratio"

    if seed is not None:
        np.random.seed(seed)

    # shuffle the sample indexes, then carve off the first test_size of them as the test set
    shuffled_indexes = np.random.permutation(len(X))

    test_size = int(len(X) * test_ratio)
    test_indexes = shuffled_indexes[:test_size]
    train_indexes = shuffled_indexes[test_size:]

    X_train = X[train_indexes]
    y_train = y[train_indexes]

    X_test = X[test_indexes]
    y_test = y[test_indexes]

    return X_train, X_test, y_train, y_test
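
A small sketch of how the split behaves (a toy example of my own, assuming the code above is saved as playML/model_selection.py):

import numpy as np
from playML.model_selection import train_test_split

X = np.arange(20).reshape(10, 2)   # 10 samples, 2 features
y = np.arange(10)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_ratio=0.2, seed=666)

# with test_ratio=0.2, 2 of the 10 samples go to the test set
print(X_train.shape, X_test.shape)   # (8, 2) (2, 2)
print(y_train.shape, y_test.shape)   # (8,) (2,)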

3. Now create the multivariate linear regression model itself (playML/LinearRegression.py):

import numpy as np
from .metrics import r2_score

class LinearRegression:

    def __init__(self):
        """Initialize the Linear Regression model"""
        self.coef_ = None
        self.intercept_ = None
        self._theta = None

    def fit_normal(self, X_train, y_train):
        """Train the Linear Regression model on X_train, y_train using the normal equation"""
        assert X_train.shape[0] == y_train.shape[0], \
            "the size of X_train must be equal to the size of y_train"

        # prepend a column of ones so the intercept can be folded into theta
        X_b = np.hstack([np.ones((len(X_train), 1)), X_train])
        # normal equation: theta = (X_b^T . X_b)^(-1) . X_b^T . y
        self._theta = np.linalg.inv(X_b.T.dot(X_b)).dot(X_b.T).dot(y_train)

        self.intercept_ = self._theta[0]
        self.coef_ = self._theta[1:]

        return self

    def predict(self, X_predict):
        """Given a dataset X_predict, return the vector of predicted values"""
        assert self.intercept_ is not None and self.coef_ is not None, \
            "must fit before predict!"

        X_b = np.hstack([np.ones((len(X_predict), 1)), X_predict])
        return X_b.dot(self._theta)

    def score(self, X_test, y_test):
        """Compute the R Squared of the model on the test set X_test, y_test"""
        y_predict = self.predict(X_test)
        return r2_score(y_test, y_predict)

    def __repr__(self):
        return "LinearRegression()"

4. Finally, use the LinearRegression model we just built to make predictions on the Boston housing dataset:

import numpy as np
from sklearn import datasets
from playML.model_selection import train_test_split
from playML.LinearRegression import LinearRegression

# note: load_boston was removed in scikit-learn 1.2,
# so this example assumes an older scikit-learn version
boston = datasets.load_boston()

X = boston.data
y = boston.target

# drop samples whose target is capped at the maximum value of 50.0
X = X[y < 50.0]
y = y[y < 50.0]

X_train, X_test, y_train, y_test = train_test_split(X, y, seed=666)

reg = LinearRegression()
reg.fit_normal(X_train, y_train)

print(reg.coef_)
print(reg.intercept_)

Output:


Take a look at the true values and the predicted values:

print(y_test)
print(reg.predict(X_test))

Output:


Compute the R Squared:

print(reg.score(X_test, y_test))

Output:


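As a final sanity check (my own addition, not part of the original post), the hand-rolled model can be compared against scikit-learn's LinearRegression on the same split; since both solve the same least-squares problem, the coefficients, intercept and R Squared should essentially match:

from sklearn.linear_model import LinearRegression as SKLinearRegression

sk_reg = SKLinearRegression()
sk_reg.fit(X_train, y_train)

print(sk_reg.coef_)                  # should be close to reg.coef_
print(sk_reg.intercept_)             # should be close to reg.intercept_
print(sk_reg.score(X_test, y_test))  # R Squared on the test set, close to reg.score(...)
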
Reference: bobo老师's machine learning course

Reposted from blog.csdn.net/weixin_45961774/article/details/105153990