PyTorch 4.6 Hands-On Practice

Copyright notice: this is an original article by the blogger and may not be reposted without permission. https://blog.csdn.net/vancooler/article/details/89061016

A PyTorch code example working toward linear regression. The environment was already set up earlier, so the installation steps are not repeated here.

'''
Implement a fully connected (linear) layer by subclassing nn.Module
'''
import torch as t
from torch import nn

class Linear(nn.Module):
    def __init__(self, in_features, out_features):
        #Define the learnable parameters in the constructor and wrap them as nn.Parameter
        #nn.Parameter has requires_grad = True by default
        super(Linear, self).__init__() #equivalent to nn.Module.__init__(self)
        self.w = nn.Parameter(t.randn(in_features, out_features))
        self.b = nn.Parameter(t.randn(out_features))

    def forward(self, x):
        output = x.mm(self.w) + self.b  # affine transform: y = xW + b
        return output

# main
if __name__ == "__main__":
    layer = Linear(4, 3)       # 4 input features, 3 output features
    X = t.randn(2, 4)          # a batch of 2 random samples
    print(X)
    output = layer(X)          # layer(X) calls forward(X) via nn.Module.__call__
    print(output)
    for name, parameter in layer.named_parameters():
        print(name, parameter) # prints "w" and "b" with their values
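
The snippet above only pushes random inputs through the layer; to actually do the linear regression mentioned at the top, the same Linear class can be trained with a loss and an optimizer. Below is a minimal sketch reusing the class and imports from the code above, fitting it to synthetic one-dimensional data with MSE loss and plain SGD; the target relation y = 2x + 3, the noise level, and hyperparameters such as lr=0.1 are chosen only for illustration.

'''
Minimal training sketch: fit the custom Linear layer to y = 2x + 3 + noise.
Assumes the Linear class and the imports (torch as t, nn) defined above;
the data-generating relation and hyperparameters are illustrative only.
'''
from torch import optim

model = Linear(1, 1)                          # one input feature, one output
optimizer = optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

for step in range(1000):
    x = t.rand(64, 1)                         # inputs uniform in [0, 1)
    y = 2 * x + 3 + 0.1 * t.randn(64, 1)      # noisy linear targets
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

print(model.w.item(), model.b.item())         # learned values should approach 2 and 3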
