[Python-torch] Summing the elements of one tensor grouped by the labels in another


1. Requirement

  • A is a label tensor, B is a value tensor.

A very simple requirement: each element of tensor A is the label for the element at the same position in tensor B, and the goal is to divide every element of B by the sum of all elements of B that share its label.

  • This pattern is actually needed in many places; the most common use is in graph attention networks, where it is used to compute the attention coefficients.
  • I searched in many places online but could not find a solution.
  • Finally found one. A quick plain-PyTorch sketch of the idea follows this list; the torch_scatter solution is in section 3.
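To make the requirement concrete, here is a minimal sketch in plain PyTorch that does the same grouped sum and division using index_add_ (the tensors are just illustrative, not from the original post):

import torch

A = torch.tensor([1, 2, 0, 4, 2])                         # labels
B = torch.tensor([2, 4, 6, 8, 10], dtype=torch.float32)   # values

# sum the values of B that share the same label in A
num_labels = int(A.max()) + 1
sums = torch.zeros(num_labels).index_add_(0, A, B)        # tensor([ 6.,  2., 14.,  0.,  8.])

# divide each element of B by the sum of its label group
C = B / sums[A]
print(C)  # the two elements with label 2 become 4/14 and 10/14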

2. Install the required package: torch_scatter
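torch_scatter is a separate package from PyTorch (the pip package is usually named torch-scatter). A typical install looks like the line below; depending on your PyTorch/CUDA version you may need to use the prebuilt wheels from the PyTorch Geometric wheel index instead.

pip install torch-scatter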

3. Solution code

import torch
from torch_scatter import scatter_sum

# A holds the label of each element, B holds the values
A = torch.tensor([1, 2, 0, 4, 2])

B = torch.tensor([2, 4, 6, 8, 10], dtype=torch.float32)

# Sum the elements of B grouped by the labels in A
sum_result = scatter_sum(B, A, dim=0)

print(sum_result)

# Divide each element of B by the sum of the elements under its label
C = B / sum_result[A]

print(C)
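Running the snippet above should print tensor([ 6.,  2., 14.,  0.,  8.]) for sum_result and roughly tensor([1.0000, 0.2857, 1.0000, 1.0000, 0.7143]) for C.

The same scatter pattern is what the graph-attention use case from section 1 needs: exponentiate the edge scores, sum them per destination node, and divide. The sketch below is only an illustration with made-up scores and edge indices, not code from the original post.

import torch
from torch_scatter import scatter_sum, scatter_max

e = torch.tensor([0.5, 1.2, -0.3, 2.0, 0.1])  # raw attention score of each edge (made up)
dst = torch.tensor([0, 0, 1, 1, 1])           # destination node of each edge (made up)

# numerically stable softmax over the incoming edges of each node:
# subtract the per-node max, exponentiate, then normalize by the per-node sum
e_max, _ = scatter_max(e, dst, dim=0)
exp_e = (e - e_max[dst]).exp()
alpha = exp_e / scatter_sum(exp_e, dst, dim=0)[dst]
print(alpha)  # attention coefficients; they sum to 1 within each destination node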

Origin: blog.csdn.net/qq_51392112/article/details/132101735