Activation Functions

https://en.wikipedia.org/wiki/Rectifier_(neural_networks)

In artificial neural networks, the rectifier is an activation function defined as follows.

rectified linear unit (ReLU):

f(x) = max(0, x)
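
A minimal sketch of the rectifier in NumPy (the function name relu is mine, not from a particular library):

```python
import numpy as np

def relu(x):
    # Rectified linear unit: element-wise max(0, x).
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]
```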

A smooth approximation to the rectifier is the softplus function:

f(x) = ln(1 + e^x)

Its derivative is:

f'(x) = e^x / (1 + e^x) = 1 / (1 + e^(-x))

which is the logistic function.
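
A short numerical check (NumPy assumed; the function names softplus and logistic are mine) that the softplus derivative matches the logistic function, using central differences:

```python
import numpy as np

def softplus(x):
    # Smooth approximation of ReLU: ln(1 + e^x).
    return np.log1p(np.exp(x))

def logistic(x):
    # 1 / (1 + e^(-x)), the claimed derivative of softplus.
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-5.0, 5.0, 11)
h = 1e-6
numeric = (softplus(x + h) - softplus(x - h)) / (2 * h)  # central difference
print(np.allclose(numeric, logistic(x), atol=1e-5))  # True
```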

Noisy ReLUs: rectified linear units can be extended to include Gaussian noise, f(x) = max(0, x + Y) with Y ~ N(0, σ(x)). They have been applied with good results in restricted Boltzmann machines.
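
An illustrative sketch under the definition above (NumPy assumed; using a fixed noise scale sigma instead of a data-dependent σ(x) is my simplification):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def noisy_relu(x, sigma=0.1):
    # max(0, x + Y) with Y ~ N(0, sigma^2); a fixed sigma is an
    # illustrative stand-in for a data-dependent sigma(x).
    noise = rng.normal(0.0, sigma, size=np.shape(x))
    return np.maximum(0.0, x + noise)

print(noisy_relu(np.array([-1.0, 0.0, 2.0])))
```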

Leaky ReLUs: Leaky ReLUs allow a small, positive gradient when the unit is not active:

f(x) = x if x > 0, otherwise 0.01x
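
A sketch of the leaky variant (NumPy assumed; the function name is mine):

```python
import numpy as np

def leaky_relu(x, slope=0.01):
    # x for x > 0, otherwise a small fixed slope times x, so the
    # unit keeps a small gradient even when it is not active.
    return np.where(x > 0, x, slope * x)

print(leaky_relu(np.array([-3.0, 0.5])))  # [-0.03  0.5 ]
```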

Parametric ReLUs: making the coefficient 0.01 a learnable parameter a yields the parametric form of ReLUs:

f(x) = x if x > 0, otherwise ax

If a <= 1, this is equivalent to:

f(x) = max(x, ax)
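
A sketch of the parametric form (NumPy assumed; here a is an ordinary argument, whereas in training it would be a learned parameter), which also checks the max(x, ax) identity for a <= 1:

```python
import numpy as np

def prelu(x, a):
    # x for x > 0, otherwise a * x; in practice a is learned.
    return np.where(x > 0, x, a * x)

x = np.array([-2.0, 1.0])
a = 0.25
# For a <= 1 the piecewise form coincides with max(x, a * x):
print(np.allclose(prelu(x, a), np.maximum(x, a * x)))  # True
```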

ELUs: Exponential linear units try to make the mean activations closer to zero, which speeds up learning. It has been shown that ELUs can obtain higher classification accuracy than ReLUs:

f(x) = x if x > 0, otherwise a(e^x - 1)

where a >= 0 is a hyperparameter to be tuned.
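
A sketch of the exponential linear unit (NumPy assumed; the name elu and the default a = 1.0 are my choices):

```python
import numpy as np

def elu(x, a=1.0):
    # x for x > 0, otherwise a * (e^x - 1); negative outputs pull
    # mean activations toward zero. expm1 computes e^x - 1 stably.
    return np.where(x > 0, x, a * np.expm1(x))

print(elu(np.array([-2.0, 0.0, 3.0])))  # [-0.8647  0.      3.    ]
```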

Reposted from blog.csdn.net/qq_27009517/article/details/80469295