PyTorch softmax computation and usage

Computing and using softmax:

import torch
import torch.nn.functional as F

x1 = torch.tensor([[1., 2., 3., 4.],
                   [1., 3., 4., 5.],
                   [3., 4., 5., 6.]])
y1 = F.softmax(x1, dim=0)  # softmax over each column (along dim 0)
y2 = F.softmax(x1, dim=1)  # softmax over each row (along dim 1)

The results are:
y1:
tensor([[0.1065, 0.0900, 0.0900, 0.0900],
        [0.1065, 0.2447, 0.2447, 0.2447],
        [0.7870, 0.6652, 0.6652, 0.6652]])
y2:
tensor([[0.0321, 0.0871, 0.2369, 0.6439],
        [0.0120, 0.0889, 0.2418, 0.6572],
        [0.0321, 0.0871, 0.2369, 0.6439]])
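A quick way to confirm what dim means is to check which axis the probabilities sum to 1 along. A minimal sketch, reusing the same x1 as above:

import torch
import torch.nn.functional as F

x1 = torch.tensor([[1., 2., 3., 4.],
                   [1., 3., 4., 5.],
                   [3., 4., 5., 6.]])

# dim=0: every column sums to 1
print(F.softmax(x1, dim=0).sum(dim=0))  # tensor([1., 1., 1., 1.])
# dim=1: every row sums to 1
print(F.softmax(x1, dim=1).sum(dim=1))  # tensor([1., 1., 1.])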

The computation PyTorch performs is:

$$\text{Softmax}(x_i) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}$$
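As a sanity check, here is a minimal sketch (not PyTorch's actual implementation) that applies this formula directly and compares the result with F.softmax. Subtracting the row-wise maximum before exponentiating is a common trick to avoid overflow; it does not change the result because softmax is invariant to shifting all inputs by a constant.

import torch
import torch.nn.functional as F

x1 = torch.tensor([[1., 2., 3., 4.],
                   [1., 3., 4., 5.],
                   [3., 4., 5., 6.]])

# exp(x_i) / sum_j exp(x_j), applied along dim=1 (each row)
shifted = x1 - x1.max(dim=1, keepdim=True).values  # shift for numerical stability
manual = shifted.exp() / shifted.exp().sum(dim=1, keepdim=True)

print(torch.allclose(manual, F.softmax(x1, dim=1)))  # True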



Reposted from blog.csdn.net/weixin_37532614/article/details/104636996