02 Tensor Operations and Linear Regression

1. Tensor Operations

1.1 Tensor Concatenation and Splitting

1.1.1 torch.cat()

torch.cat(tensors, dim=0, out=None)

Function: concatenates a sequence of tensors along an existing dimension dim

  • tensors: the sequence of tensors to concatenate
  • dim: the dimension along which to concatenate
t = torch.ones((2, 3))

t_0 = torch.cat([t, t], dim=0)
t_1 = torch.cat([t, t, t], dim=1)

print("t_0:{} shape:{}\nt_1:{} shape:{}".format(t_0, t_0.shape, t_1, t_1.shape))

(Output: t_0 has shape torch.Size([4, 3]); t_1 has shape torch.Size([2, 9]))

1.1.2 torch.stack()

torch.stack(tensors, dim=0, out=None)

Function: concatenates a sequence of tensors along a newly created dimension dim

  • tensors: the sequence of tensors to concatenate
  • dim: the dimension at which to insert the new axis
t = torch.ones((2, 3))

t_stack = torch.stack([t, t], dim=2)

print("\nt_stack:{} shape:{}".format(t_stack, t_stack.shape))

(Output: t_stack has shape torch.Size([2, 3, 2]))

note:

  • cat() does not add a dimension, while stack() creates a new one
  • When dim is 0, the existing dimensions shift back: for t of shape (2, 3), t1 = torch.stack([t, t], dim=0) has shape (2, 2, 3)

1.1.3 torch.chunk()

torch.chunk(input, chunks, dim=0)

Function: splits a tensor into an equal number of chunks along dimension dim
Returns: a list of tensors
Note: if the size is not evenly divisible, the last tensor is smaller than the others

  • input: the tensor to split
  • chunks: the number of chunks
  • dim: the dimension along which to split
a = torch.ones((2, 7))  # dim 1 has length 7, not divisible by 3
print(a)
list_of_tensors = torch.chunk(a, dim=1, chunks=3)  # split into 3 chunks

for idx, t in enumerate(list_of_tensors):
    print("tensor {}: {}, shape is {}".format(idx + 1, t, t.shape))

(Output: the chunks have shapes torch.Size([2, 3]), torch.Size([2, 3]) and torch.Size([2, 1]))

1.1.4 torch.split()

torch.split(tensor, split_size_or_sections, dim=0)

Function: splits a tensor along dimension dim
Returns: a list of tensors

  • tensor: the tensor to split
  • split_size_or_sections: an int gives the length of every chunk; a list gives the length of each chunk explicitly
  • dim: the dimension along which to split
t = torch.ones((2, 5))

# With an int, every chunk has length 2 (the last may be shorter):
# list_of_tensors = torch.split(t, 2, dim=1)
# for idx, t in enumerate(list_of_tensors):
#     print("tensor {}: {}, shape is {}".format(idx + 1, t, t.shape))

# With a list, the chunk lengths are given explicitly (they must sum to 5):
list_of_tensors = torch.split(t, [2, 1, 2], dim=1)
for idx, t in enumerate(list_of_tensors):
    print("tensor {}: {}, shape is {}".format(idx + 1, t, t.shape))

(Output: the chunks have shapes torch.Size([2, 2]), torch.Size([2, 1]) and torch.Size([2, 2]))

1.2 Tensor Indexing

1.2.1 torch.index_select()

torch.index_select(input, dim, index, out=None)

Function: in dimension dim, selects data according to index
Returns: a tensor whose entries along dim are gathered according to index

  • input: the tensor to index
  • dim: the dimension along which to index
  • index: the indices of the data to select (an integer tensor)
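The original post has no code for this one, so here is a minimal sketch: selecting rows 0 and 2 of a 3x4 tensor (the tensor values and variable names are illustrative, not from the post).

```python
import torch

# Pick rows 0 and 2 of a 3x4 tensor along dim 0.
t = torch.arange(12).reshape(3, 4)            # rows: [0..3], [4..7], [8..11]
idx = torch.tensor([0, 2], dtype=torch.long)  # index must be an integer tensor

t_select = torch.index_select(t, dim=0, index=idx)
print(t_select)        # rows 0 and 2 of t
print(t_select.shape)  # torch.Size([2, 4])
```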

1.2.2 torch.masked_select()

torch.masked_select(input, mask, out=None)

Function: selects the elements where mask is True
Returns: a one-dimensional tensor

  • input: the tensor to index
  • mask: a boolean tensor with the same shape as input
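A minimal sketch (values are illustrative): keep the elements greater than or equal to a threshold. Note the result is always 1-D, regardless of the input shape.

```python
import torch

# Keep elements >= 3; masked_select flattens the result to 1-D.
t = torch.tensor([[1, 4], [3, 2]])
mask = t.ge(3)                        # boolean tensor, same shape as t
selected = torch.masked_select(t, mask)
print(selected)  # tensor([4, 3]), in row-major order
```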

1.3 Tensor Transformations

1.3.1 torch.reshape()

torch.reshape(input, shape)

Function: changes the shape of a tensor
Caution: when the tensor is contiguous in memory, the new tensor shares its data memory with input

  • input: the tensor to reshape
  • shape: the new shape
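A minimal sketch (not from the original post) showing both points above: -1 lets one dimension be inferred, and for a contiguous tensor the result shares memory with the input.

```python
import torch

t = torch.arange(8)                       # shape (8,), contiguous
t_reshape = torch.reshape(t, (-1, 2, 2))  # -1 is inferred as 2 -> (2, 2, 2)
print(t_reshape.shape)

# Because t is contiguous, t_reshape shares its data memory:
t[0] = 100
print(t_reshape[0, 0, 0])  # also 100
```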

1.3.2 torch.transpose()

torch.transpose(input, dim0, dim1)

Function: swaps two dimensions of a tensor

  • input: the tensor to transform
  • dim0: the first dimension to swap
  • dim1: the second dimension to swap
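A minimal sketch (shapes are illustrative): swapping the last two dimensions of an image-like (C, H, W) tensor.

```python
import torch

# Swap dims 1 and 2: (C, H, W) -> (C, W, H).
t = torch.rand((3, 28, 32))
t_transpose = torch.transpose(t, dim0=1, dim1=2)
print(t_transpose.shape)  # torch.Size([3, 32, 28])
```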

1.3.3 torch.t()

torch.t(input)

Function: transposes a 2-dimensional tensor; equivalent to torch.transpose(input, 0, 1)
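A quick sketch of the equivalence (values are illustrative):

```python
import torch

# torch.t on a 2-D tensor is the matrix transpose.
m = torch.arange(6).reshape(2, 3)
print(torch.t(m).shape)                            # torch.Size([3, 2])
print(torch.equal(torch.t(m), m.transpose(0, 1)))  # True
```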

1.3.4 torch.squeeze()

torch.squeeze(input, dim=None, out=None)

Function: removes dimensions (axes) of length 1

  • dim: if None, all axes of length 1 are removed; if a dimension is specified, it is removed only when its length is 1
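A minimal sketch (shape chosen for illustration) of all three cases of the dim argument:

```python
import torch

t = torch.rand((1, 2, 3, 1))
t_sq = torch.squeeze(t)           # all length-1 axes removed
t_sq0 = torch.squeeze(t, dim=0)   # axis 0 has length 1, so it is removed
t_sq1 = torch.squeeze(t, dim=1)   # axis 1 has length 2, so shape is unchanged
print(t_sq.shape)   # torch.Size([2, 3])
print(t_sq0.shape)  # torch.Size([2, 3, 1])
print(t_sq1.shape)  # torch.Size([1, 2, 3, 1])
```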

1.3.5 torch.unsqueeze()

torch.unsqueeze(input, dim, out=None)

Function: inserts a dimension of length 1 at position dim

  • dim: the position of the new dimension
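A minimal sketch (values are illustrative), e.g. adding a batch axis in front of a vector:

```python
import torch

t = torch.tensor([1, 2, 3])         # shape (3,)
t_un0 = torch.unsqueeze(t, dim=0)   # new axis in front: (1, 3)
t_un1 = torch.unsqueeze(t, dim=1)   # new axis after:    (3, 1)
print(t_un0.shape)
print(t_un1.shape)
```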

2. Tensor Math

3. Linear Regression


Origin blog.csdn.net/qq_36825778/article/details/104076859