PyTorch (3) ---- Computation Graph Basics and Autograd

Since PyTorch 0.4, torch.autograd.Variable and torch.Tensor have been merged: a Tensor itself carries the autograd machinery, so wrapping it in Variable is no longer necessary.
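
As a minimal sketch of what this merge means in practice, wrapping a Tensor in Variable now simply returns the Tensor itself, and the autograd flag lives directly on the Tensor:

import torch
from torch.autograd import Variable

# since the 0.4 merge, Variable is a thin shim that returns a plain Tensor
t = torch.ones(2, 2, requires_grad=True)
v = Variable(t)
print(type(v))            # <class 'torch.Tensor'> -- no separate Variable type
print(t.requires_grad)    # True: the Tensor itself carries the autograd flag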

Basics

1. The requires_grad attribute marks a Tensor for automatic differentiation; it defaults to False.

2. Two attributes of a Tensor:

      grad: records the gradient computed for this Tensor

      grad_fn: points to the Function object that records the operation that produced this Tensor

3. The computation graph consists of a root node, leaf nodes, and intermediate nodes; use .is_leaf to check whether a node is a leaf.


4. To compute the gradients of the leaf nodes, call .backward() on the root node.

Sample code

1.

import torch

# requires_grad attribute: marks a Tensor for automatic differentiation
a = torch.randn(2, 2, requires_grad=True)
b = torch.randn(2, 2)                      # requires_grad defaults to False

print(a.requires_grad, b.requires_grad)    # True False
# .requires_grad_() sets requires_grad to True in place,
# equivalent to b.requires_grad = True
b.requires_grad_()
print(b.requires_grad)                     # True

2.

# Two attributes of a Tensor
# grad: records the gradient computed for this Tensor
# grad_fn: points to the Function object recording the operation that produced this Tensor

print(a)
print(b)
c = a + b
print(c)
print(c.requires_grad)   # True: an input requires grad, so the result does too

print(a.grad_fn)   # None: a is a leaf created by the user
print(b.grad_fn)   # None
print(c.grad_fn)   # <AddBackward0 ...>: c was produced by an addition

d = c.detach()  # returns the same data as .data, but is safer: autograd still notices in-place changes to it (see the sketch below)
print(d)
print(d.requires_grad)   # False
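
To see why .detach() is the safer choice, consider a short sketch (the tensors e, f, g here are new, illustrative ones): a detached tensor shares storage with its source, and autograd notices in-place changes to it through a version counter, while changes made through .data go unnoticed and can silently corrupt gradients.

# sketch: autograd catches in-place edits on a detached tensor
e = torch.ones(2, requires_grad=True)
f = e.exp()           # exp saves its output for the backward pass
g = f.detach()
g.zero_()             # in-place edit bumps the shared version counter
# f.sum().backward()  # would raise a RuntimeError: a variable needed for
#                     # gradient computation has been modified in place
# with f.data.zero_() instead, backward would run but return wrong gradients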

3.

# computation graph
x = torch.randn(1)                       # input: a leaf that does not require grad
w = torch.ones(1, requires_grad=True)    # parameters: leaves that require grad
b = torch.ones(1, requires_grad=True)

print(x.is_leaf, w.is_leaf, b.is_leaf)   # True True True

y = w * x                                # intermediate node
z = y + b                                # root node
print(y.is_leaf, z.is_leaf)              # False False
print(y.grad_fn, z.grad_fn)              # <MulBackward0 ...> <AddBackward0 ...>

# calling .backward() on the root node yields the gradients of the leaf nodes
z.backward(retain_graph=True)
print(w.grad)   # dz/dw = x
print(b.grad)   # dz/db = 1
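
Because retain_graph=True kept the graph alive, .backward() can be called a second time; this short follow-up sketch shows that gradients accumulate in .grad and must be zeroed explicitly between passes:

# sketch: gradients accumulate across backward calls
z.backward()      # the graph survived the first call thanks to retain_graph=True
print(w.grad)     # now 2 * x: the second pass added onto the first
print(b.grad)     # likewise 2 instead of 1
w.grad.zero_()    # reset the accumulated gradients in place
b.grad.zero_()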


Source: www.cnblogs.com/feihu-h/p/12305683.html