Common tensor joining methods in PyTorch: stack and cat (reposted)

Stack

torch.stack(): official explanation, detailed walkthrough, and examples — 模糊包's blog on CSDN
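
As a quick, self-contained illustration of what torch.stack does (a minimal sketch, not taken from the linked post): it stacks tensors of identical shape along a new dimension, so the result has one more dimension than the inputs.

import torch

a = torch.rand(8, 96)
b = torch.rand(8, 96)

# stack inserts a new dimension: two (8, 96) tensors become one (2, 8, 96) tensor
s0 = torch.stack((a, b), dim=0)
print(s0.shape)  # torch.Size([2, 8, 96])

# dim=1 places the new dimension in the middle instead
s1 = torch.stack((a, b), dim=1)
print(s1.shape)  # torch.Size([8, 2, 96])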

Cat (concatenate)

torch.cat(): official explanation, detailed walkthrough, and examples — 模糊包's blog on CSDN
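
For contrast, a minimal sketch of torch.cat (also not taken from the linked post): it joins tensors along an existing dimension, so the number of dimensions stays the same.

import torch

a = torch.rand(8, 96)
b = torch.rand(8, 96)

# cat joins along an existing dimension; all other dimensions must match
c0 = torch.cat((a, b), dim=0)
print(c0.shape)  # torch.Size([16, 96])

c1 = torch.cat((a, b), dim=1)
print(c1.shape)  # torch.Size([8, 192])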


Code (joining 2-D slices back into a 3-D tensor; adapted from the Stack link above)

import torch
import torch_dct as dct

batch_size = 8
channel_num = 7
sequence_length = 96

x = torch.rand(batch_size, channel_num, sequence_length)

scaled_channels = []
for i in range(channel_num):             # i indexes the channel; apply the DCT to each channel separately
    a = dct.dct(x[:, i, :])              # DCT of the i-th channel, shape (batch_size, sequence_length)
    print("a-shape:", a.shape)
    b = x[:, i, :] * a                   # scale the i-th channel element-wise by its DCT
    scaled_channels.append(b)
    print("b-shape:", b.shape)

# merge the per-channel results back along a new channel dimension
# (equivalent to torch.stack((scaled_channels[0], ..., scaled_channels[6]), dim=1))
c = torch.stack(scaled_channels, dim=1)
c.shape

a-shape: torch.Size([8, 96])
b-shape: torch.Size([8, 96])
a-shape: torch.Size([8, 96])
b-shape: torch.Size([8, 96])
a-shape: torch.Size([8, 96])
b-shape: torch.Size([8, 96])
a-shape: torch.Size([8, 96])
b-shape: torch.Size([8, 96])
a-shape: torch.Size([8, 96])
b-shape: torch.Size([8, 96])
a-shape: torch.Size([8, 96])
b-shape: torch.Size([8, 96])
a-shape: torch.Size([8, 96])
b-shape: torch.Size([8, 96])
torch.Size([8, 7, 96])
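
The same (8, 7, 96) result can also be assembled with torch.cat, provided each 2-D slice is first given back a channel dimension with unsqueeze. A hedged sketch, assuming the scaled_channels list and the tensor c built in the code above:

# assumption: scaled_channels is the list of seven (8, 96) tensors from the loop above
c_cat = torch.cat([t.unsqueeze(1) for t in scaled_channels], dim=1)
print(c_cat.shape)            # torch.Size([8, 7, 96])
print(torch.equal(c_cat, c))  # True: stack(dim=1) matches cat of unsqueeze(1) slices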


Reposted from blog.csdn.net/weixin_43332715/article/details/127651445