Linear Regression with PyTorch

Problem Description

Initialize a set of data points \((x, y)\) that satisfy the linear relationship \(y = w x + b\). Then fit this data via backpropagation, using mean squared error as the loss.
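
For reference, with \(N\) samples \((x_i, y_i)\) and predictions \(\hat{y}_i\), the mean squared error is

\[\mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N}\left(\hat{y}_i - y_i\right)^2\]

which is what torch.nn.MSELoss computes with its default mean reduction.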

Notice

self.prediction = torch.nn.Linear(1, 1)

This single line actually maintains two trainable tensors, describing the relationship

\[\text{prediction}_{1\times1} = \text{weight}_{1\times1} \times \text{input}_{1\times1} + \text{bias}_{1\times1}\]

where each of these quantities holds a single value (the weight is stored as a \(1\times1\) tensor and the bias as a \(1\)-element tensor).
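
A quick way to confirm this is to instantiate such a layer on its own and inspect its parameters (a minimal sketch, separate from the full program below):

import torch

layer = torch.nn.Linear(1, 1)          # maintains a weight and a bias
print(layer.weight.shape)              # torch.Size([1, 1])
print(layer.bias.shape)                # torch.Size([1])
print(list(layer.named_parameters()))  # both are registered as trainable parameters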

Code

import torch


epoch = 10000  # number of training iterations
lr = 0.01      # learning rate
w = 10         # true slope used to generate the data
b = 5          # true intercept used to generate the data

x = torch.unsqueeze(torch.linspace(1, 10, 20), 1)  # 20 points in [1, 10], shape (20, 1)
y = w*x + b + torch.rand(x.size())                 # targets with uniform noise in [0, 1)


class Net(torch.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.prediction = torch.nn.Linear(1, 1)

    def forward(self, x):
        out = self.prediction(x)
        return out


net = Net()
optimizer = torch.optim.Adam(net.parameters(), lr=lr)
criterion = torch.nn.MSELoss()  # mean squared error loss


for i in range(epoch):
    y_pred = net(x)
    loss = criterion(y_pred, y)  # argument order matters: prediction first, then target

    optimizer.zero_grad()  # clear gradients from the previous iteration
    loss.backward()        # backpropagate to compute gradients
    optimizer.step()       # update weight and bias

print(loss.data)                              # final training loss
print(net.state_dict()['prediction.weight'])  # learned weight
print(net.state_dict()['prediction.bias'])    # learned bias

Output:

tensor(1.00000e-07 *
       5.3597)
tensor([[ 1.2002]])
tensor([ 0.9984])
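
Once training finishes, the fitted net can be used for inference on new inputs; a minimal sketch (the input value 3.0 is just an arbitrary example):

with torch.no_grad():              # no gradient tracking needed for inference
    x_new = torch.tensor([[3.0]])  # shape (1, 1), matching Linear(1, 1)
    print(net(x_new))              # predicted y for x = 3.0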

Reposted from www.cnblogs.com/fengyubo/p/9164970.html