Initialization

Key takeaways:

- Different initializations lead to different results.
- Random initialization is used to break symmetry and make sure different hidden units can learn different things.
- Don't initialize to values that are too large.
- He initialization works well for networks with ReLU activations (see the sketch after this list).
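A minimal NumPy sketch of He initialization, where each weight matrix is drawn from a zero-mean Gaussian scaled by sqrt(2 / fan_in). The function name, layer sizes, and seed below are illustrative assumptions, not taken from the original post.

```python
import numpy as np

def initialize_parameters_he(layer_dims, seed=3):
    """He initialization: W ~ N(0, 2 / n_prev) for each layer.

    layer_dims: list of layer sizes, e.g. [n_x, n_h1, ..., n_y].
    Returns a dict with weight matrices W1..WL and bias vectors b1..bL.
    """
    rng = np.random.default_rng(seed)
    parameters = {}
    L = len(layer_dims) - 1  # number of weight layers (input layer excluded)
    for l in range(1, L + 1):
        # Scaling by sqrt(2 / fan_in) keeps the variance of activations roughly
        # constant across layers when the activation function is ReLU.
        parameters["W" + str(l)] = rng.standard_normal(
            (layer_dims[l], layer_dims[l - 1])
        ) * np.sqrt(2.0 / layer_dims[l - 1])
        # Biases can start at zero; symmetry is already broken by the random W.
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

# Example (hypothetical sizes): 2 inputs, two hidden layers of 4 units, 1 output.
params = initialize_parameters_he([2, 4, 4, 1])
print(params["W1"].shape)  # (4, 2)
```

Random but small weights break symmetry so that hidden units receive different gradients, while the He scaling avoids the exploding activations that overly large initial values would cause.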


Reposted from blog.csdn.net/honk2012/article/details/80313245