Detailed Explanation of Variable

Variable

PyTorch has officially deprecated Variable; a tensor can now set requires_grad=True directly.

Source : torch.autograd.Variable()

(1) Features

  • Variable wraps a tensor so that it can change continuously during training: it supports backpropagation, automatic differentiation, and parameter updates, and apart from that it is essentially no different from a tensor.
  • A tensor is converted to a Variable via Variable(tensor)
  • A Variable does not require a gradient by default (the requires_grad attribute defaults to False)

(2) Composition attributes

  • data : the tensor value held by the Variable
  • grad : the gradient accumulated by backpropagation
  • requires_grad : whether a gradient needs to be computed for this Variable

(3) Code display

import torch
from torch.autograd import Variable

x = Variable(torch.Tensor([3]), requires_grad=True)
a = Variable(torch.Tensor([5]), requires_grad=True)
bias = Variable(torch.Tensor([9]), requires_grad=True)
c = Variable(torch.Tensor([12]), requires_grad=False)  # one Variable that does not require a gradient, for comparison

# Build a computation graph
y = a * x + bias * c  # y = a * x + bias * c = 5 * 3 + 9 * 12

# Backpropagation
y.backward()  # same as y.backward(torch.FloatTensor([1])), since y has a single element

print(x.data, x.grad, x.requires_grad)           # tensor([3.]) tensor([5.]) True
print(a.data, a.grad, a.requires_grad)           # tensor([5.]) tensor([3.]) True
print(bias.data, bias.grad, bias.requires_grad)  # tensor([9.]) tensor([12.]) True
print(c.data, c.grad, c.requires_grad)           # tensor([12.]) None False
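Since Variable is deprecated, the same computation can be written with plain tensors in current PyTorch (0.4 and later). This is a minimal sketch of the equivalent code; requires_grad is passed directly to torch.tensor, and the gradients come out identical to the Variable version above.

```python
import torch

# Modern equivalent: no Variable wrapper, set requires_grad on the tensor itself.
x = torch.tensor([3.0], requires_grad=True)
a = torch.tensor([5.0], requires_grad=True)
bias = torch.tensor([9.0], requires_grad=True)
c = torch.tensor([12.0])  # requires_grad defaults to False

# Same computation graph: y = a * x + bias * c
y = a * x + bias * c
y.backward()

print(x.grad)     # tensor([5.])  -> dy/dx = a
print(a.grad)     # tensor([3.])  -> dy/da = x
print(bias.grad)  # tensor([12.]) -> dy/dbias = c
print(c.grad)     # None (no gradient tracked for c)
```

Note that .data and .grad still exist on plain tensors, so the attribute access shown in section (2) works unchanged.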

Origin blog.csdn.net/m0_52910424/article/details/126942482