What the commonly used is_contiguous means for PyTorch tensors

is_contiguous

As the name suggests, is_contiguous tells you whether a tensor's elements are stored contiguously in memory. In PyTorch, a tensor of any dimensionality is backed by a one-dimensional block of memory; the shape and strides determine how that block is read. A freshly created tensor stores its elements contiguously in row-major order. If we transpose a matrix, no data is moved: only the strides change, so the logical element order no longer matches the memory order. Calling is_contiguous() then returns False, and a non-contiguous tensor cannot use view() to change its shape; you must use reshape() instead (or call contiguous() first).
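To see what "depends on how you read it" means in memory terms, here is a minimal sketch (the variable names a and b are just for illustration): transposing changes the strides but leaves the underlying storage untouched.

import torch

a = torch.arange(6).reshape(2, 3)    # contiguous: memory order matches row-major reading order
print(a.stride())                    # (3, 1)
b = a.t()                            # transpose returns a view; no data is moved
print(b.stride())                    # (1, 3) -- only the strides changed
print(a.data_ptr() == b.data_ptr())  # True: both read the same underlying memory
print(b.is_contiguous())             # False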

code demo

import torch

a = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
a = torch.tensor(a)   # freshly created tensor: row-major, contiguous

print(a)
'''
tensor([[1, 2, 3],
        [4, 5, 6],
        [7, 8, 9]])
'''
a = a.t()             # transpose returns a view; only the strides change
print(a)
'''
tensor([[1, 4, 7],
        [2, 5, 8],
        [3, 6, 9]])
'''

print(a.is_contiguous())
# False
a = a.contiguous()    # copies the data into a new contiguous block
print(a.is_contiguous())
# True
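To tie this back to the claim about view versus reshape, here is a minimal sketch continuing the demo above (b is just an illustrative name for the same 3x3 transposed tensor): view() fails on a non-contiguous tensor, while reshape() works.

b = torch.tensor([[1, 2, 3], [4, 5, 6], [7, 8, 9]]).t()
print(b.is_contiguous())       # False

try:
    b.view(9)                  # view needs contiguous memory, so this raises
except RuntimeError as e:
    print("view failed:", e)

print(b.reshape(9))            # reshape falls back to copying when a view is impossible
# tensor([1, 4, 7, 2, 5, 8, 3, 6, 9])
print(b.contiguous().view(9))  # equivalent: copy to contiguous memory first, then view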
