Copyright notice: reproduction is prohibited without the author's permission! https://blog.csdn.net/u011681952/article/details/86157597
The Softmax Layer converts the raw scores of a classification network into probabilities; it usually follows the last fully connected layer.

In a CNN classifier, the final fully connected layer already produces one score per class, but those scores are unbounded: large or small, positive or negative. They are hard to interpret at a glance, and more importantly they are not on the same scale as the ground truth, so they cannot be compared with it directly during training. The Softmax Layer therefore normalizes the scores into a probability distribution that sums to 1. The result is easy to read, and the same normalization is what SoftmaxWithLossLayer uses to compute the loss value.
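The normalization itself is just the softmax function: exponentiate each score and divide by the sum of the exponentials. A minimal NumPy sketch (the fully connected scores here are made-up example values, not from any real network):

```python
import numpy as np

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    # Subtract the max for numerical stability; this does not change the result.
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

# Example fully connected outputs: mixed magnitudes and signs.
fc = np.array([3.2, -1.5, 0.4])
prob = softmax(fc)
print(prob)        # every entry lies in (0, 1)
print(prob.sum())  # sums to 1.0
```

Note that softmax is order-preserving: the class with the highest raw score also gets the highest probability, so the predicted class is unchanged.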
First, let's look at SoftmaxParameter:
// Message that stores parameters used by SoftmaxLayer, SoftmaxWithLossLayer
message SoftmaxParameter {
  enum Engine {
    DEFAULT = 0;
    CAFFE = 1;
    CUDNN = 2;
  }
  optional Engine engine = 1 [default = DEFAULT];

  // The axis along which to perform the softmax -- may be negative to index
  // from the end (e.g., -1 for the last axis).
  // Any other axes will be evaluated as independent softmaxes.
  optional int32 axis = 2 [default = 1];
}
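The `axis` parameter selects which dimension the softmax runs over; every other dimension is treated as an independent softmax. With the default `axis = 1` on an N×C×H×W blob, each sample (and, for spatial outputs, each position) gets its own softmax over the C channels. A NumPy sketch of that behavior (the blob shape is chosen arbitrarily for illustration):

```python
import numpy as np

def softmax_along(x, axis=1):
    """Independent softmaxes along `axis`, leaving other axes untouched."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# A blob shaped N x C x H x W = 2 x 3 x 4 x 4.
blob = np.random.randn(2, 3, 4, 4)
out = softmax_along(blob, axis=1)

# Summing over the channel axis gives 1 at every (n, h, w) position.
print(np.allclose(out.sum(axis=1), 1.0))  # True
```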
A Softmax Layer is written in a prototxt like this:
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc"
  top: "prob"
}
A SoftmaxWithLossLayer is written in a prototxt like this:
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "fc"
  bottom: "label"
  top: "loss"
}
Note that SoftmaxWithLoss applies the softmax internally, so its first bottom should be the raw fully connected output, not a blob that has already passed through a Softmax layer.
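Conceptually, SoftmaxWithLoss is a softmax followed by the multinomial logistic (cross-entropy) loss: the negative log of the probability assigned to the ground-truth class, averaged over the batch. A NumPy sketch of that computation (the scores and labels are illustrative, not Caffe's actual code):

```python
import numpy as np

def softmax_with_loss(fc, labels):
    """Average -log(softmax(fc)[label]) over the batch."""
    # Row-wise softmax over the class dimension.
    e = np.exp(fc - fc.max(axis=1, keepdims=True))
    prob = e / e.sum(axis=1, keepdims=True)
    # Probability assigned to each sample's ground-truth class.
    correct = prob[np.arange(len(labels)), labels]
    return -np.log(correct).mean()

# Two samples, three classes; labels are made-up ground truth.
fc = np.array([[3.2, -1.5, 0.4],
               [0.1, 2.0, -0.3]])
labels = np.array([0, 1])
print(softmax_with_loss(fc, labels))  # small, since the correct class scores highest
```

The loss shrinks toward 0 as the network puts more probability mass on the correct class, which is exactly the behavior gradient descent exploits during training.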