Linear Model

1. Linear Model

Bilibili video tutorial: PyTorch Deep Learning Practice - Linear Model

1.1 Introduction to the problem


Suppose a student who spends x hours on the PyTorch tutorial gets y points on the final exam, with the data below. If I study for 4 hours, what score can I expect?

x (hours):  1, 2, 3, 4
y (points): 2, 4, 6, ? (to be predicted)

This is a supervised learning problem. For background, see the post on the two approaches to machine learning - supervised learning and unsupervised learning (an intuitive explanation).

1.2 Select model

We choose a linear model: $\hat{y} = x * w$

The parameter w is the weight. It starts out as an arbitrary (random) value, and different values of w give different predictions and therefore different errors, as the short example below shows.

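For intuition, here is a minimal sketch (plain Python, using the same toy data as the code later in this post) that prints the predictions $\hat{y} = x * w$ of a few candidate weights next to the true values:

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

for w in [0.0, 1.0, 2.0, 3.0]:           # a few candidate weights
    preds = [x * w for x in x_data]      # y_hat = x * w for every sample
    print('w =', w, 'predictions =', preds, 'targets =', y_data)

Only w = 2 reproduces the targets exactly; every other choice leaves an error, and the loss introduced next quantifies that error.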

1.3 Loss

For a single training sample, the loss is the squared error between prediction and target:

$$\mathrm{loss} = (\hat{y} - y)^2 = (x * w - y)^2$$

Taking w = 0, 1, 2, 3, 4 in turn and computing the loss on every training sample, the loss changes as follows:

w = 0: per-sample losses 4, 16, 36  -> mean loss 18.67
w = 1: per-sample losses 1, 4, 9    -> mean loss 4.67
w = 2: per-sample losses 0, 0, 0    -> mean loss 0
w = 3: per-sample losses 1, 4, 9    -> mean loss 4.67
w = 4: per-sample losses 4, 16, 36  -> mean loss 18.67

We find that at w = 2 the loss is 0 on every sample, i.e. there is no loss at all; this is the ideal state (in practice it is rarely reached exactly).

1.4 Mean Square Error (MSE)

In practice we usually report the MSE (mean square error) over the whole training set rather than a single sample's loss, because it summarizes the fit more intuitively:

$$\mathrm{MSE} = \frac{1}{N}\sum_{n=1}^{N}\left(\hat{y}_n - y_n\right)^2 = \frac{1}{N}\sum_{n=1}^{N}\left(x_n * w - y_n\right)^2$$

MSE: see the post on prediction evaluation metrics in machine learning - MSE, RMSE, MAE, MAPE, SMAPE.
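As a quick sanity check of the formula (a minimal sketch assuming only NumPy), here is the MSE for a single candidate weight, say w = 3, on the data above:

import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])

w = 3.0
y_pred = x * w                     # y_hat = x * w
mse = np.mean((y_pred - y) ** 2)   # (1/N) * sum((y_hat_n - y_n)^2)
print(mse)                         # 14/3 ≈ 4.67, the same value as in the table above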

1.5 Code

import numpy as np
import matplotlib.pyplot as plt

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]


def forward(x):
    # linear model without bias: y_hat = x * w (w is taken from the enclosing scope)
    return x * w


def loss(x, y):
    # squared error for a single training sample
    y_pred = forward(x)
    return (y_pred - y) ** 2


w_list = []    # candidate weights
mse_list = []  # mean squared error for each candidate weight
for w in np.arange(0.0, 4.1, 0.1):  # sweep w from 0.0 to 4.0 in steps of 0.1
    print("w=", round(w, 2))
    l_sum = 0
    for x_val, y_val in zip(x_data, y_data):
        y_pred_val = forward(x_val)
        loss_val = loss(x_val, y_val)
        l_sum += loss_val
        print('\t', round(x_val, 2), round(y_val, 2), round(y_pred_val, 2), round(loss_val, 2))
    print('MSE=', l_sum / 3)
    w_list.append(w)
    mse_list.append(l_sum / 3)

plt.plot(w_list, mse_list)
plt.ylabel('MSE')
plt.xlabel('W')
plt.show()
Output:

w= 0.0
	 1.0 2.0 0.0 4.0
	 2.0 4.0 0.0 16.0
	 3.0 6.0 0.0 36.0
MSE= 18.666666666666668
w= 0.1
	 1.0 2.0 0.1 3.61
	 2.0 4.0 0.2 14.44
	 3.0 6.0 0.3 32.49
MSE= 16.846666666666668
w= 0.2
	 1.0 2.0 0.2 3.24
	 2.0 4.0 0.4 12.96
	 3.0 6.0 0.6 29.16
MSE= 15.120000000000003
...

(Plot: MSE as a function of w; a parabola with its minimum, 0, at w = 2.)
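Since the loop above keeps w_list and mse_list, the best weight in the sweep can also be read off numerically rather than from the plot; a small sketch that can be appended after plt.show():

best = int(np.argmin(mse_list))   # index of the smallest MSE in the sweep
print('best w =', round(w_list[best], 2), 'MSE =', round(mse_list[best], 6))
# expected: best w = 2.0, MSE = 0.0 (up to floating-point rounding)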

1.6 Changing the model

At the beginning we used $\hat{y} = x * w$. If we add an intercept (bias) $b$, so that $\hat{y} = x * w + b$, what happens to the result?
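The cost now depends on both parameters, so instead of a curve over w we get a surface over the (w, b) plane:

$$\mathrm{MSE}(w, b) = \frac{1}{N}\sum_{n=1}^{N}\left(x_n * w + b - y_n\right)^2$$

The code below evaluates this surface on a grid of (w, b) values and draws it in 3D.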

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]


def forward(x):
    # linear model with bias: y_hat = x * w + b (w and b are the meshgrid arrays defined below)
    return x * w + b


def loss(x, y):
    # squared error, evaluated element-wise over the (w, b) grid
    y_pred = forward(x)
    return (y_pred - y) ** 2


mse_list = []
W = np.arange(0.0, 4.1, 0.1)    # candidate weights
B = np.arange(-2.0, 2.1, 0.1)   # candidate biases
[w, b] = np.meshgrid(W, B)      # 2D grids so the cost can be evaluated for every (w, b) pair

l_sum = 0
for x_val, y_val in zip(x_data, y_data):
    y_pred_val = forward(x_val)  # a full (b, w) grid of predictions for this sample
    print(y_pred_val)
    loss_val = loss(x_val, y_val)
    l_sum += loss_val            # accumulate the squared-error surface over the samples

fig = plt.figure()
ax = fig.add_subplot(projection='3d')  # Axes3D(fig) alone no longer attaches the axes in recent Matplotlib

ax.set_xlabel("w")
ax.set_ylabel("b")
ax.text(0.2, 2, 43, "Cost Value")

surf = ax.plot_surface(w, b, l_sum / 3, cmap=plt.get_cmap('rainbow'))
fig.colorbar(surf, shrink=0.5, aspect=5)
plt.show()
Output (one prediction grid per training sample):

[[-2.  -1.9 -1.8 ...  1.8  1.9  2. ]
 [-1.9 -1.8 -1.7 ...  1.9  2.   2.1]
 [-1.8 -1.7 -1.6 ...  2.   2.1  2.2]
 ...
 [ 1.8  1.9  2.  ...  5.6  5.7  5.8]
 [ 1.9  2.   2.1 ...  5.7  5.8  5.9]
 [ 2.   2.1  2.2 ...  5.8  5.9  6. ]]
[[-2.  -1.8 -1.6 ...  5.6  5.8  6. ]
 [-1.9 -1.7 -1.5 ...  5.7  5.9  6.1]
 [-1.8 -1.6 -1.4 ...  5.8  6.   6.2]
 ...
 [ 1.8  2.   2.2 ...  9.4  9.6  9.8]
 [ 1.9  2.1  2.3 ...  9.5  9.7  9.9]
 [ 2.   2.2  2.4 ...  9.6  9.8 10. ]]
[[-2.  -1.7 -1.4 ...  9.4  9.7 10. ]
 [-1.9 -1.6 -1.3 ...  9.5  9.8 10.1]
 [-1.8 -1.5 -1.2 ...  9.6  9.9 10.2]
 ...
 [ 1.8  2.1  2.4 ... 13.2 13.5 13.8]
 [ 1.9  2.2  2.5 ... 13.3 13.6 13.9]
 [ 2.   2.3  2.6 ... 13.4 13.7 14. ]]

(Plot: the cost surface over the (w, b) grid, drawn with plot_surface and a rainbow colormap; the minimum lies near w = 2, b = 0.)
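To read the minimum off the grid rather than the surface plot, a small sketch that can be appended after plt.show() (it reuses the w, b and l_sum arrays from the code above; the other names are arbitrary):

mse = l_sum / 3                                      # cost surface over the (b, w) grid
i, j = np.unravel_index(np.argmin(mse), mse.shape)   # grid indices of the smallest cost
print('best w =', round(w[i, j], 2), 'best b =', round(b[i, j], 2))
# expected: w close to 2 and b close to 0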

Original post: blog.csdn.net/m0_70885101/article/details/128126416