3. Automatic differentiation (derivatives, gradients)

Once y has been computed from x, two things exist: the derivative dy/dx, and the mapping x → y (a record of which operations produced y from x).
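To make dy/dx concrete, here is a minimal sketch with a non-trivial gradient (the input values are hypothetical, chosen for illustration): the mapping is y = x1² + x2², so the gradient is 2x.

import torch as t

x = t.tensor([2.0, 3.0], requires_grad=True)  # hypothetical example input
y = (x ** 2).sum()  # the mapping x → y: y = x1^2 + x2^2
y.backward()  # back-propagation
print(x.grad)  # dy/dx = 2x: tensor([4., 6.])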

Obtaining dy/dx takes two steps: y.backward() and then x.grad, i.e., back-propagate first, then read the gradient.

The x → y mapping is stored in the grad_fn attribute of the Tensor object: y.grad_fn.
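A small sketch of inspecting grad_fn (the printed memory addresses will differ per run); note that a leaf tensor created by the user has no grad_fn:

import torch as t

x = t.ones(2, 2, requires_grad=True)
y = x.sum()
print(y.grad_fn)  # <SumBackward0 object at 0x...>: y was produced by a sum
print(x.grad_fn)  # None: x is a leaf tensor created by the user, not by an operation
print(y.grad_fn.next_functions)  # upstream graph nodes, here AccumulateGrad for x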

Note that back-propagated gradients accumulate, so the gradient should be cleared before the next back-propagation, as the code below demonstrates.

import torch as t

x = t.ones(2, 2, requires_grad=True)  # track all operations on x
y = x.sum()  # y = 4.; note that y is a scalar
y.grad_fn  # which operation produced y: <SumBackward0 at 0x2598370d948>
# differentiating y with respect to x gives the gradient dy/dx
y.backward()  # back-propagation
print(x.grad)  # print the gradient: tensor([[1., 1.], [1., 1.]])
# back-propagated gradients accumulate, so the gradient should be cleared
# before the next back-propagation
# (a repeated backward() happens to work for this simple sum; graphs that
# save intermediate tensors would need backward(retain_graph=True))
y.backward()
print(x.grad)  # tensor([[2., 2.], [2., 2.]])
y.backward()
print(x.grad)  # tensor([[3., 3.], [3., 3.]])
# clear the gradient
x.grad.zero_()
y.backward()  # back-propagation
print(x.grad)  # back to tensor([[1., 1.], [1., 1.]])
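This is why training code zeroes the gradient at every iteration before calling backward(). A minimal sketch of that pattern (the loop, loss, and learning rate below are hypothetical, not from the original post):

import torch as t

w = t.tensor(5.0, requires_grad=True)  # hypothetical parameter
for step in range(3):
    loss = (w - 1.0) ** 2  # scalar loss; dloss/dw = 2 * (w - 1)
    if w.grad is not None:
        w.grad.zero_()  # clear the accumulated gradient first
    loss.backward()  # back-propagation
    with t.no_grad():  # update the parameter without tracking the operation
        w -= 0.1 * w.grad  # plain gradient-descent step
    print(step, w.item())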
