a leaf Variable that requires grad has been used in an in-place operation


This happens because the code writes x += 2 on a leaf tensor.

Change it to y = x + 2.

After that, y += 2 works, because y is not a leaf. In other words, a torch tensor that is a leaf with requires_grad=True cannot be the target of an in-place operation such as +=.

import torch
from torch.autograd import Variable  # Variable is deprecated; a plain tensor behaves the same

x = Variable(torch.ones(2, 2), requires_grad=True)
# x += 2               # RuntimeError: a leaf Variable that requires grad
#                      # has been used in an in-place operation
y = x + 2              # out-of-place addition is fine
y += 2                 # in-place on a non-leaf tensor is also fine
# print(x.grad_fn)     # None: created directly by the user, so no grad_fn ("creator" in old PyTorch)
# print(y.grad_fn)     # <AddBackward0 object at 0x...>

z = y * y * 3
out = z.mean()

out.backward()

print(x, y, z)
print(x.grad)          # gradient of out with respect to x
print(y.grad)          # None: y is not a leaf, so its gradient is not retained
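If you genuinely need to update the leaf in place (for example, to re-initialize it before training), one common workaround is to perform the update with gradient tracking disabled. A minimal sketch using the standard torch.no_grad() context manager:

import torch

x = torch.ones(2, 2, requires_grad=True)
with torch.no_grad():   # operations in this block are not recorded by autograd
    x += 2              # in-place update of the leaf is now allowed
print(x)                # tensor of 3s; x is still a leaf with requires_grad=True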

The following also raises the error:

import numpy as np
import torch

x = torch.ones(2, 2, requires_grad=True)
y = torch.ones(2, 2, requires_grad=False)

mask = np.array(y, dtype=bool)  # index arrays must be bool (or integer), not float
x[mask] = 0                     # RuntimeError: a leaf Variable that requires grad
                                # has been used in an in-place operation
print(x)

Reason: you cannot assign directly into a leaf tensor that requires autograd; a tensor that does not require grad can be assigned to freely. To make such an assignment legal, take the write out of the autograd graph, as in the sketch below.
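A minimal sketch of the working version of the masked assignment, again assuming torch.no_grad() to keep the in-place write out of the graph:

import numpy as np
import torch

x = torch.ones(2, 2, requires_grad=True)
y = torch.ones(2, 2, requires_grad=False)

mask = np.array(y, dtype=bool)
with torch.no_grad():    # the write below is not recorded by autograd
    x[mask] = 0          # now allowed, even though x is a leaf that requires grad
print(x)                 # all zeros; x still has requires_grad=True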
