[PyTorch] Getting started

[Using automatic differentiation]

For each input variable x that needs a gradient, set requires_grad=True

For the final scalar output y, run backpropagation with y.backward()

Afterwards, dy/dx is available as x.grad
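The three steps above can be sketched as follows (a minimal example; the variable names and the function y = x² are illustrative, not from the original post):

```python
import torch

# Step 1: mark x as requiring a gradient
x = torch.tensor(2.0, requires_grad=True)

# Step 2: build the function and backpropagate from the scalar output
y = x ** 2
y.backward()

# Step 3: read off dy/dx = 2x at x = 2
print(x.grad)  # dy/dx = 2 * 2.0 = 4.0
```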

How do you obtain the gradient of an intermediate quantity?
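One answer (a sketch, assuming standard PyTorch behavior: non-leaf tensors do not keep their .grad by default) is to call retain_grad() on the intermediate tensor before backpropagating. The names x, y, z below are illustrative:

```python
import torch

x = torch.tensor(3.0, requires_grad=True)
y = x * 2          # intermediate quantity; its .grad is discarded by default
y.retain_grad()    # ask autograd to keep the gradient of this non-leaf tensor
z = y ** 2
z.backward()

print(y.grad)      # dz/dy = 2y = 12.0
print(x.grad)      # dz/dx = dz/dy * dy/dx = 12 * 2 = 24.0
```

Registering a hook with y.register_hook(...) is an alternative when you only need to observe or modify the gradient as it flows through.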

 

【High-Dimensional Differentiation】

Assuming f is a function from R^n to R^m, its derivative can be regarded as an m×n Jacobian matrix

To back-propagate through such a y, pass an m-dimensional vector dz/dy into backward:

y.backward(dz_dy)

In this case, x.grad holds dz/dx, i.e. the vector-Jacobian product (dz/dy)ᵀ · J
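A concrete sketch (the function y = 2x and the vector dz_dy are illustrative; here the Jacobian is simply 2·I):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * 2                                # f: R^3 -> R^3, Jacobian J = 2 * I
dz_dy = torch.tensor([0.1, 1.0, 0.01])  # m-dimensional vector fed into backward
y.backward(dz_dy)                        # computes (dz/dy)^T @ J

print(x.grad)                            # dz/dx = 2 * dz_dy = [0.2, 2.0, 0.02]
```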

Why does calling backward() a second time fail by default? What is the mechanism behind this?
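A sketch of the likely answer, assuming standard PyTorch behavior: backward() frees the graph's saved intermediate buffers after use, so a second call fails unless the first call passes retain_graph=True; gradients then accumulate into .grad across calls. The example below is illustrative:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 2
y.backward(retain_graph=True)  # keep the graph's buffers for a second pass
print(x.grad)                  # dy/dx = 4.0
y.backward()                   # succeeds because the graph was retained
print(x.grad)                  # gradients accumulate: 4.0 + 4.0 = 8.0
```

This accumulation is also why training loops call optimizer.zero_grad() (or x.grad.zero_()) between iterations.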

 

【Neural Networks】

 


Origin www.cnblogs.com/yesuuu/p/12744716.html