Cross-Entropy Loss (CrossEntropyLoss)

For semantic segmentation, the network outputs a tensor of shape [B, C, H, W] for multi-class classification, while the label has shape [B, H, W].
Example with three classes: output [1, 3, 3, 3], label [1, 3, 3].
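As a quick shape illustration (a minimal sketch with random tensors; the names logits and target are only illustrative):

import torch

# hypothetical 3-class segmentation logits: [B, C, H, W] = [1, 3, 3, 3]
logits = torch.randn(1, 3, 3, 3)
# label map of class indices in {0, 1, 2}: [B, H, W] = [1, 3, 3]
target = torch.randint(0, 3, (1, 3, 3))

loss = torch.nn.CrossEntropyLoss()(logits, target)
print(loss)  # a scalar, because the default reduction is 'mean'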

Verification

import torch

# logits: [B, C, H, W] = [1, 3, 3, 3]
output = torch.tensor([[[[1, 1, 0],
                         [2, 0, 1],
                         [2, 1, 0]],
                        [[2, 0, 2],
                         [0, 0, 1],
                         [1, 1, 2]],
                        [[1, 1, 0],
                         [0, 0, 1],
                         [2, 0, 2]]]
                       ]).float()
# class-index labels: [B, H, W] = [1, 3, 3]
label = torch.tensor([[[1, 1, 0],
                       [1, 0, 0],
                       [2, 2, 2]]]).long()

# reduction='none' keeps one loss value per pixel
CrossEntropyLoss = torch.nn.CrossEntropyLoss(reduction='none')
loss = CrossEntropyLoss(output, label)
print(loss)
# result:
# tensor([[[0.5514, 1.8620, 2.2395],
#          [2.2395, 1.0986, 1.0986],
#          [0.8620, 1.8620, 0.7586]]])
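To spot-check one of these values by hand (a sketch; pixel (0, 0) is picked arbitrarily): the logits across the three channels at that position are [1, 2, 1] and the target class is 1, so the per-pixel loss is -log(softmax([1, 2, 1])[1]).

import torch

logits_00 = torch.tensor([1., 2., 1.])  # output[0, :, 0, 0]
target_00 = 1                           # label[0, 0, 0]
manual = -torch.log_softmax(logits_00, dim=0)[target_00]
print(manual)  # tensor(0.5514), matching loss[0, 0, 0] above

Running the same example again with reduction='mean' averages these nine per-pixel values into a single scalar: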
import torch

output = torch.tensor([[[[1, 1, 0],
                         [2, 0, 1],
                         [2, 1, 0]],
                        [[2, 0, 2],
                         [0, 0, 1],
                         [1, 1, 2]],
                        [[1, 1, 0],
                         [0, 0, 1],
                         [2, 0, 2]]]
                       ]).float()
label = torch.tensor([[[1, 1, 0],
                       [1, 0, 0],
                       [2, 2, 2]]]).long()

# reduction='mean' averages the per-pixel losses into a single scalar
CrossEntropyLoss = torch.nn.CrossEntropyLoss(reduction='mean')
loss = CrossEntropyLoss(output, label)
print(loss)
# result:
# tensor(1.3969)
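As a cross-check (a sketch that simply reuses the per-pixel values printed by the reduction='none' run above), averaging the nine values reproduces the 'mean' result:

import torch

per_pixel = torch.tensor([[0.5514, 1.8620, 2.2395],
                          [2.2395, 1.0986, 1.0986],
                          [0.8620, 1.8620, 0.7586]])
print(per_pixel.mean())  # tensor(1.3969)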

Reposted from blog.csdn.net/wagnbo/article/details/131374551