UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach()

This warning is a reminder that when constructing a new tensor from an existing one, you should use sourceTensor.clone().detach() (or sourceTensor.clone().detach().requires_grad_(True) if the copy should track gradients) instead of torch.tensor(sourceTensor).

PyTorch issues this warning because torch.tensor() always creates a brand-new tensor that does not share memory with its input, even when the input is already a PyTorch tensor. If the original tensor is part of an autograd computation graph, torch.tensor() silently discards that history, so the copy no longer shares gradients or computation history with the original. To make this intent explicit, PyTorch recommends tensor.clone().detach() (or, equivalently, tensor.detach().clone()): clone() copies the values into new memory, and detach() deliberately cuts the result out of the computation graph.
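A minimal sketch of the difference (variable names here are illustrative, not from the original post): the detached clone has the same values as the source, lives in separate memory, and does not require gradients.

```python
import torch

# A tensor that participates in an autograd graph
src = torch.ones(3, requires_grad=True) * 2.0

# The recommended way to copy it
copy = src.clone().detach()

# Same shape and values...
print(torch.equal(copy, src))        # True
# ...but separate memory and no gradient tracking
print(copy.data_ptr() != src.data_ptr())  # True
print(copy.requires_grad)            # False
```

Calling torch.tensor(src) here would produce an equivalent result but emit the UserWarning, because the implicit detach-and-copy is easy to overlook.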

Specifically, to create a new tensor b as a safe, detached copy of an existing tensor a, use:

b = a.clone().detach()  # or: b = a.detach().clone()

instead of

b = torch.tensor(a)

The resulting tensor b has the same shape and values as a, but it shares no gradients or computation history with a, so it can be used safely in other operations.
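The independence of the copy can be checked directly. A short sketch (names are illustrative) using the requires_grad_(True) variant mentioned above, which turns the copy into a fresh leaf tensor with its own gradient:

```python
import torch

a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Detached copy that tracks gradients as a new, independent leaf
b = a.clone().detach().requires_grad_(True)

# Backpropagating through b does not touch a at all
(b * b).sum().backward()

print(a.grad)                 # None -- a is unaffected
print(b.grad)                 # tensor([2., 4., 6.]) -- d(sum(b^2))/db = 2b
```

Had b been created with a plain clone() (no detach()), the backward pass would have flowed through to a.grad as well; the detach() is what severs that link.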


Source: blog.csdn.net/djdjdhch/article/details/130628522