[Reading Notes 1] [2017] MATLAB and Deep Learning: Multiclass Classification (3)

Assume that the neural network produced the output shown in Figure 4-11 when given the input data.

Figure 4-11. Output when using a sigmoid function

As the sigmoid function concerns only its own output, each node's output is independent of the outputs of the other nodes.

The first output node appears to indicate Class 1 with 100 percent probability.

Does the data belong to Class 1, then?

Not so fast.

The other output nodes also indicate 100 percent probability of being in Class 2 and Class 3.

Therefore, adequate interpretation of the output from a multiclass classification neural network requires consideration of the relative magnitudes of all the node outputs.

In this example, the probability of the data belonging to each class is actually 1/3.
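As a minimal sketch of this point (illustrative Python rather than the book's MATLAB), normalizing the three identical sigmoid outputs of Figure 4-11 by their sum makes the equal relative magnitudes explicit:

```python
# The three sigmoid output nodes of Figure 4-11 each report a value of 1.
outputs = [1.0, 1.0, 1.0]

# Judged by relative magnitude, each class is equally likely:
# dividing each output by the total yields probability 1/3 per class.
total = sum(outputs)
probs = [v / total for v in outputs]
print(probs)
```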

The softmax function provides the correct values.

The softmax function maintains the sum of the output values at one and also limits each individual output to the range 0-1.

As it accounts for the relative magnitudes of all the outputs, the softmax function is a suitable choice for multiclass classification neural networks.
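A minimal softmax sketch (illustrative Python rather than the book's MATLAB; the input values are hypothetical weighted sums, not taken from the text) showing both properties at once, every output lies in 0-1 and the outputs sum to one:

```python
import math

def softmax(v):
    """Exponentiate each value, then normalize so the outputs sum to 1."""
    m = max(v)                                # subtract max for numerical stability
    exps = [math.exp(x - m) for x in v]
    s = sum(exps)
    return [e / s for e in exps]

weighted_sums = [2.0, 1.0, 0.1]               # hypothetical pre-activation values
y = softmax(weighted_sums)
print(y)           # each value is between 0 and 1
print(sum(y))      # the values sum to 1
```

Because the normalization divides by the sum over all nodes, each output depends on every weighted sum, which is exactly the "relative magnitude" behavior the text calls for.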

Finally, the learning rule should be determined.

Like the binary classification network, the multiclass classification neural network usually employs cross entropy-driven learning rules.

This is due to the fast learning and simplicity that the cross entropy function provides.

Long story short, the learning rule of the multiclass classification neural network is identical to that of the binary classification neural network of the previous section.

Although the two networks employ different activation functions (the sigmoid for binary and the softmax for multiclass classification), the derivation of the learning rule leads to the same result.
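A one-line sketch of why the derivations coincide (notation assumed here, following the book's earlier chapters: $d_i$ is the correct output, $y_i$ the network output, $v_i$ the weighted sum, and $e_i$ the error): with the cross entropy cost, the delta at an output node reduces in both cases to

$$\delta_i = -\frac{\partial E}{\partial v_i} = d_i - y_i = e_i,$$

whether $y$ comes from the sigmoid with $E = -\left[\, d \ln y + (1-d)\ln(1-y) \,\right]$ or from the softmax with $E = -\sum_i d_i \ln y_i$. In both cases the activation function's derivative cancels against the cross entropy's, leaving only the error.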

Well, the less we have to remember, the better.

The training process of the multiclass classification neural network is summarized in the following steps.

(Translated from Matlab Deep Learning by Phil Kim.)



Reposted from blog.csdn.net/weixin_42825609/article/details/83573333