python softmax

After softmax, the value at a given index along the class dimension is directly the predicted score for that class.

    import torch
    import numpy as np
    import torch.nn.functional as torch_F

    data = np.array([[[0.5, -0.5], [-0.05, -0.05]]])
    t_data = torch.from_numpy(data.astype(np.float32))

    scores = torch_F.softmax(t_data, dim=-1)  # normalize over the last dimension
    print(scores)
    # take the class-1 column as the predicted scores
    scores = scores.squeeze(0).data.cpu().numpy()[:, 1]
    print(scores)
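
The same result can be reproduced in plain NumPy (a sketch mirroring the tensor above, minus the batch dimension), which makes the `[:, 1]` indexing easier to follow:

```python
import numpy as np

# same values as the torch example above, without the leading batch dim
data = np.array([[0.5, -0.5], [-0.05, -0.05]])
e = np.exp(data)
probs = e / e.sum(axis=-1, keepdims=True)  # softmax over the last axis
print(probs[:, 1])         # the class-1 column, matching scores[:, 1] above
print(probs.sum(axis=-1))  # each row sums to 1
```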

The softmax formula is softmax(z_i) = exp(z_i) / Σ_j exp(z_j). Larger inputs get higher scores, but the mapping is not proportional. Because exp() maps every input to a positive number, values of opposite sign such as 0.5 and -0.5 cannot cancel each other in the sum, as they would under plain addition or multiplication.
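
A quick sketch of that point: the raw values 0.5 and -0.5 cancel under plain summation, but after exp() both become positive, so each keeps a distinct share of the total:

```python
import numpy as np

z = np.array([0.5, -0.5])
print(z.sum())      # 0.0 -- the raw signs cancel under plain addition
e = np.exp(z)       # exp maps both values to positive numbers
print(e / e.sum())  # no cancellation; the two shares sum to 1
```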

The two sets of inputs below produce different outputs, even though their values have the same ratio:

    import numpy as np

    z = np.array([1.0, 2.0])
    print(np.exp(z) / sum(np.exp(z)))  # ≈ [0.269, 0.731]

    z = np.array([0.1, 0.2])
    print(np.exp(z) / sum(np.exp(z)))  # ≈ [0.475, 0.525]
    import math

    z = [1.0, 2.0, 3.0, 4.0, 1.0, 2.0, 3.0]
    z_exp = [math.exp(i) for i in z]
    print(z_exp)
    # [2.718281828459045, 7.38905609893065, 20.085536923187668,
    #  54.598150033144236, 2.718281828459045, 7.38905609893065,
    #  20.085536923187668]
    sum_z_exp = sum(z_exp)
    print(sum_z_exp)  # 114.98389973429897
    softmax = [round(i / sum_z_exp, 3) for i in z_exp]
    print(softmax)  # [0.024, 0.064, 0.175, 0.475, 0.024, 0.064, 0.175]

The same computation with NumPy:

    import numpy as np

    z = np.array([1.0, 2.0, 3.0, 4.0, 1.0, 2.0, 3.0])
    print(np.exp(z) / sum(np.exp(z)))
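
One caveat worth noting (not covered in the original post): np.exp overflows for large inputs, so a common refinement is to subtract the maximum before exponentiating, which leaves the result mathematically unchanged:

```python
import numpy as np

def stable_softmax(z):
    # subtracting max(z) shifts every exponent into a safe range
    # without changing the ratios, hence the same output
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([1000.0, 1001.0])  # naive np.exp(1000.0) would overflow to inf
print(stable_softmax(z))        # ≈ [0.269, 0.731]
```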


Reposted from blog.csdn.net/jacke121/article/details/104666276