The difference between batch normalization and dropout

Dropout is primarily a regularization technique. It injects noise into a neural network, forcing the network to learn to generalize well enough to cope with that noise. (This is a big oversimplification; dropout is about much more than just robustness to noise.)
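
To illustrate the noise-injection view, here is a minimal NumPy sketch of inverted dropout (the function name and arguments are just for this example, not from the original post): during training each activation is zeroed with probability p and the survivors are rescaled so the expected output matches the input; at test time the layer does nothing.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training and rescale the survivors by 1/(1-p) so E[output] == input."""
    if not training or p == 0.0:
        return x  # test time: the layer is an identity, no noise is injected
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p      # keep each unit with probability 1-p
    return x * mask / (1.0 - p)          # rescale the surviving activations
```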


Batch normalization is mostly a technique for improving optimization.


As a side effect, batch normalization happens to introduce some noise into the network, so it can regularize the model a little bit.
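
To see where that noise comes from, here is a rough NumPy sketch of the training-time computation (gamma and beta are the usual learnable scale and shift; the exact formulation here is an assumption for illustration): each feature is standardized with the statistics of the current mini-batch, so a given example's output depends on which other examples happened to land in its batch.

```python
import numpy as np

def batch_norm_train(x, gamma, beta, eps=1e-5):
    """Training-time batch norm for a (batch, features) array: standardize
    each feature with the mini-batch mean/variance, then scale and shift."""
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # standardized activations
    return gamma * x_hat + beta              # learnable scale and shift
```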


When you have a large dataset, it’s important to optimize well, and not as important to regularize well, so batch normalization is more important for large datasets. You can, of course, use batch normalization and dropout at the same time.
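
For concreteness, a hypothetical PyTorch classifier combining the two might look like the sketch below (layer sizes and placements are arbitrary, not from the original post): BatchNorm sits after a linear layer to ease optimization, and Dropout sits before the output layer to regularize. The train()/eval() switch matters for both layers.

```python
import torch.nn as nn

# Hypothetical layer sizes, chosen only for illustration.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),  # normalizes activations with mini-batch statistics
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly zeroes activations during training
    nn.Linear(256, 10),
)

model.train()  # BatchNorm uses batch statistics, Dropout is active
model.eval()   # BatchNorm uses running statistics, Dropout is a no-op
```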


Reprinted from blog.csdn.net/ningyanggege/article/details/80771079