The role of PyTorch's detach()

The official documentation describes detach() as follows: it returns a new tensor that is detached from the current computation graph. The result never requires gradients, and it shares storage with the original tensor, so in-place modifications to either can affect the other.
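A minimal sketch of that behavior: the detached tensor carries the same values but is cut off from autograd.

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x * 2          # y is part of the computation graph
z = y.detach()     # z has the same values, but is outside the graph

print(y.requires_grad)    # True
print(z.requires_grad)    # False
print(torch.equal(y, z))  # True: identical values, shared storage
```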

Suppose we have models A and B, and we need to feed A's output into B, but during training we only want to update model B. We can do this:

input_B = output_A.detach()

This severs the gradient flow between the two computation graphs, which is exactly the behavior we want.
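The pattern above can be sketched end to end. The two `nn.Linear` models and the hyperparameters below are hypothetical stand-ins for A and B; the point is that after `backward()`, only B's parameters have gradients.

```python
import torch
import torch.nn as nn

# Hypothetical tiny models standing in for A and B.
model_A = nn.Linear(4, 8)
model_B = nn.Linear(8, 2)

# Optimize only B's parameters.
optimizer = torch.optim.SGD(model_B.parameters(), lr=0.1)

x = torch.randn(16, 4)
target = torch.randn(16, 2)

output_A = model_A(x)
input_B = output_A.detach()  # cut the graph: no gradients flow back into A

loss = nn.functional.mse_loss(model_B(input_B), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# A received no gradients; B did.
print(model_A.weight.grad)              # None
print(model_B.weight.grad is not None)  # True
```

Without the `.detach()` call, `loss.backward()` would also populate `model_A.weight.grad`, even though the optimizer ignores A's parameters.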

Clearly explained: PyTorch's .detach() function

.detach() in PyTorch (with two related links)

PyTorch: viewing network parameters

Pytorch (6) (traversal of model parameters) - net.parameters() & net.named_parameters() & net.state_dict()

The difference between Batch Norm and Layer Norm

Origin blog.csdn.net/missgrass/article/details/125790119