The relative ordering of batch normalization, ReLU, dropout, etc.


Reference: the Stack Overflow question "Ordering of batch normalization and dropout in TensorFlow?"

In the paper Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, the authors write that "we would like to ensure that for any parameter values, the network always produces activations with the desired distribution" — that is, normalization should shape the distribution of the inputs that the activation function receives.
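Concretely, the BN transform from the paper normalizes each pre-activation $x = Wu + b$ over a mini-batch $\mathcal{B}$ and then applies a learned affine map, so that the nonlinearity always receives inputs with a controlled distribution:

$$\hat{x}_i = \frac{x_i - \mu_{\mathcal{B}}}{\sqrt{\sigma_{\mathcal{B}}^2 + \epsilon}}, \qquad y_i = \gamma \hat{x}_i + \beta$$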

Accordingly, the Batch Normalization layer is inserted immediately after a Conv or fully connected layer and before the activation layer (e.g., ReLU), while dropout should be placed after the activation layer.
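This ordering (Conv → BN → ReLU → Dropout) can be written directly in Keras. A minimal sketch, assuming TensorFlow 2.x; the filter count, input shape, and 0.25 dropout rate are illustrative choices, not values from the original post:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Conv -> BatchNorm -> ReLU -> Dropout, in the order recommended above.
model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),   # illustrative input shape
    layers.Conv2D(32, 3, padding="same",
                  use_bias=False),     # bias is redundant: BN's beta replaces it
    layers.BatchNormalization(),       # normalize the pre-activations
    layers.Activation("relu"),         # activation sees the desired distribution
    layers.Dropout(0.25),              # dropout after the activation
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.summary()
```

Setting `use_bias=False` on the Conv layer is a common companion choice here: BN subtracts the batch mean and adds its own learned β, so a separate convolution bias would be absorbed anyway.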


