PyTorch's permute function

1. Main purpose: rearranging tensor dimensions

Example:

import torch
x = torch.randn(2, 3, 5)
print(x.size())
print(x.permute(2, 0, 1).size())

>>>torch.Size([2, 3, 5])
>>>torch.Size([5, 2, 3])
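
Concretely, permute(2, 0, 1) means: new dimension 0 is old dimension 2, new dimension 1 is old dimension 0, and new dimension 2 is old dimension 1, so the element at x[i, j, k] ends up at position [k, i, j]. A quick sketch to confirm this (the concrete indices are just for illustration):

import torch
x = torch.randn(2, 3, 5)
y = x.permute(2, 0, 1)
print(bool(x[1, 2, 4] == y[4, 1, 2]))    # True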

2. Similarities and differences between transpose and permute

Similarity: both rearrange the dimensions of a tensor;

Difference: permute can reorder all the dimensions of a high-dimensional tensor in one call; note that (at the time of writing) there is no torch.permute() function form, only the Tensor method:

torch.randn(2, 3, 4, 5).permute(3, 2, 0, 1).shape

>>>torch.Size([5, 4, 2, 3])

transpose, on the other hand, swaps exactly two dimensions per call (it is not limited to 2D tensors, but each call exchanges only two axes), so a single call cannot reorder more than two dimensions. To rearrange several dimensions, you can either use one permute call or chain multiple transpose calls;

torch.randn(2, 3, 4, 5).transpose(3, 0).transpose(2, 1).transpose(3, 2).shape
>>>torch.Size([5, 4, 2, 3])
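
The chained transposes above end up with the same dimension order as the single permute; a quick sketch to verify that the two results are element-wise identical:

import torch
a = torch.randn(2, 3, 4, 5)
b = a.permute(3, 2, 0, 1)
c = a.transpose(3, 0).transpose(2, 1).transpose(3, 2)
print(torch.equal(b, c))    # True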

3. How permute relates to the contiguous and view functions

contiguous: view can only be applied to a contiguous tensor, so if you call transpose, permute, etc. before view, you first need to call contiguous() to obtain a contiguous copy.

That is to say, operations such as transpose and permute leave the tensor non-contiguous in memory, so before calling view you must make the tensor contiguous again.

The explanation is as follows: after such an operation, the tensor's logical element order no longer matches the order in which the data sits in the underlying storage, while the view() operation requires the data to occupy one matching, contiguous block of memory. Executing the contiguous() function copies the tensor into a contiguously laid-out form in memory.

To check whether a tensor is contiguous, call the torch.Tensor.is_contiguous() function:

import torch 
x = torch.ones(10, 10) 
x.is_contiguous()                                 # True 
x.transpose(0, 1).is_contiguous()                 # False
x.transpose(0, 1).contiguous().is_contiguous()    # True
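
In practice, calling view on a non-contiguous tensor raises a RuntimeError, which is exactly what contiguous() fixes. A minimal sketch (the exact error message varies between versions):

import torch
y = torch.randn(3, 4).transpose(0, 1)    # non-contiguous after transpose
try:
    y.view(-1)
except RuntimeError as e:
    print("view failed:", e)
print(y.contiguous().view(-1).shape)     # torch.Size([12])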

In addition: PyTorch version 0.4 added torch.reshape(), which is similar in function to numpy.reshape() and roughly equivalent to tensor.contiguous().view(); it saves you the trouble of calling contiguous() before the view() transformation.
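
A short sketch of this convenience (behavior as of PyTorch 0.4 and later):

import torch
t = torch.randn(3, 4).transpose(0, 1)   # non-contiguous
print(t.reshape(-1).shape)               # torch.Size([12]); no explicit contiguous() needed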

4. The difference between the permute and view functions

import torch
import numpy as np

a = np.array([[[1, 2, 3], [4, 5, 6]]])
unpermuted = torch.tensor(a)
print(unpermuted.size())              #  ——>  torch.Size([1, 2, 3])

permuted = unpermuted.permute(2, 0, 1)
print(permuted.size())                #  ——>  torch.Size([3, 1, 2])

view_test = unpermuted.view(1, 3, 2)
print(view_test.size())               #  ——>  torch.Size([1, 3, 2])

>>>torch.Size([1, 2, 3])
torch.Size([3, 1, 2])
torch.Size([1, 3, 2])
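
Although both results have shapes that are rearrangements of the original, the element layouts differ: permute reorders the logical positions of the elements, while view keeps the original linear order and only re-chunks it. Printing the two results from the example above makes this visible:

print(permuted)
# tensor([[[1, 4]],
#         [[2, 5]],
#         [[3, 6]]])
print(view_test)
# tensor([[[1, 2],
#          [3, 4],
#          [5, 6]]])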

 


Origin: blog.csdn.net/weixin_36670529/article/details/105226803