List of default values in deep learning frameworks

deeplearning-default-value

Functions

| Object name in Chainer | Parameter name in Chainer | Chainer | PyTorch | TensorFlow |
| --- | --- | --- | --- | --- |
| dropout | ratio | 0.5 (drop ratio) | 0.5 (drop ratio) | Required (keep ratio) |
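Note the semantic difference: Chainer and PyTorch take the probability of *dropping* a unit, while classic TensorFlow dropout took a *keep* probability. A minimal plain-Python sketch (not any framework's actual implementation) of inverted dropout using the drop-ratio convention:

```python
import random

def dropout(x, ratio=0.5, train=True):
    """Inverted dropout with the drop-ratio convention (Chainer/PyTorch style).

    Each element is zeroed with probability `ratio`; surviving elements are
    scaled by 1 / (1 - ratio) so the expected value stays unchanged.
    """
    if not train or ratio == 0.0:
        return list(x)
    scale = 1.0 / (1.0 - ratio)
    return [0.0 if random.random() < ratio else v * scale for v in x]

# At inference time the input passes through untouched.
print(dropout([1.0, 2.0, 3.0], train=False))  # → [1.0, 2.0, 3.0]
```

A keep-ratio API (old `tf.nn.dropout`) would instead drop with probability `1 - keep_prob`, which is why passing the same 0.5 happens to behave identically but other values do not.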

Links

| Object name in Chainer | Parameter name in Chainer | Chainer | PyTorch | TensorFlow |
| --- | --- | --- | --- | --- |
| Link | initializer | LeCunNormal | Each layer has its own default initializer. | GlorotUniform |

Initializers

  • PyTorch defines its initializers inside each layer implementation, so it is omitted from this table.
| Object name in Chainer | Parameter name in Chainer | Chainer | TensorFlow |
| --- | --- | --- | --- |
| GlorotUniform | scale | 1.0 | No args (1.0) |
| LeCunNormal | scale | 1.0 | No args (1.0) |
| Normal | scale | 0.05 | 1.0 or 0.05 |
| Uniform | scale | 0.05 | min, max = (0.0, 1.0) or (-0.05, 0.05) |
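In both tables the `scale` parameter multiplies the bound or standard deviation of the underlying distribution. A rough sketch of the two scaled initializers, following the usual Glorot/LeCun formulas (assumed here from the standard definitions, not copied from either library's source):

```python
import math
import random

def glorot_uniform(fan_in, fan_out, scale=1.0):
    # Sample from U(-limit, limit) with limit = scale * sqrt(6 / (fan_in + fan_out)).
    limit = scale * math.sqrt(6.0 / (fan_in + fan_out))
    return random.uniform(-limit, limit)

def lecun_normal(fan_in, scale=1.0):
    # Sample from N(0, s^2) with s = scale * sqrt(1 / fan_in).
    s = scale * math.sqrt(1.0 / fan_in)
    return random.gauss(0.0, s)

# A weight for a 256 -> 128 layer stays inside the Glorot bound.
w = glorot_uniform(256, 128)
assert abs(w) <= math.sqrt(6.0 / (256 + 128))
```

With the default `scale=1.0` these reduce to the textbook Glorot uniform and LeCun normal distributions, which is why "No args (1.0)" in TensorFlow behaves the same.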

Optimizers

| Object name in Chainer | Parameter name in Chainer | Chainer | PyTorch | TensorFlow |
| --- | --- | --- | --- | --- |
| SGD | lr | 0.01 | Required | 0.01 |
| Adam | alpha | 0.001 | 0.001 | 0.001 |
|  | beta1 | 0.9 | 0.9 | 0.9 |
|  | beta2 | 0.999 | 0.999 | 0.999 |
|  | eps | 1e-08 | 1e-08 | 1e-08 |
|  | eta | 1.0 | No args | No args |
|  | weight_decay | 0.0 | 0.0 | No args |
|  | amsgrad | False | False | No args |
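The three frameworks agree on Adam's core defaults (`alpha=0.001`, `beta1=0.9`, `beta2=0.999`, `eps=1e-08`); Chainer's extra `eta` is simply a multiplier on the step size, and `eta=1.0` leaves it unchanged. A scalar sketch of one bias-corrected Adam update using those defaults:

```python
def adam_step(param, grad, state, alpha=0.001, beta1=0.9,
              beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter.

    `state` carries the step counter and the running first/second
    moment estimates between calls.
    """
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad
    m_hat = state["m"] / (1 - beta1 ** state["t"])   # bias correction
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return param - alpha * m_hat / (v_hat ** 0.5 + eps)

state = {"t": 0, "m": 0.0, "v": 0.0}
# On the first step the bias-corrected moments cancel, so the parameter
# moves by roughly alpha in the direction opposite the gradient.
p = adam_step(1.0, 0.5, state)
```

This is a sketch of the update rule only; the real optimizers also handle weight decay, parameter groups, and (with `amsgrad=True`) a running maximum of `v_hat`.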

Normalization methods

| Object name in Chainer | Parameter name in Chainer | Chainer | PyTorch | TensorFlow |
| --- | --- | --- | --- | --- |
| BatchNormalization | eps | 2e-05 | 1e-05 | 1e-03 |
| LayerNormalization | eps | 1e-06 | 1e-05 | No args (1e-12) |
| GroupNormalization | eps | 1e-05 | 1e-05 | 1e-06 |
|  | groups | Required | Required | 32 |
| local_response_normalization | n | 5 | Required | 5 |
|  | k | 2 | 1 | 1 |
|  | alpha | 1e-04 | 1e-04 | 1 |
|  | beta | 0.75 | 0.75 | 0.5 |

Rules

Kinds of symbols

  • None: the default value is None.
  • x: not implemented.
  • No args: a hard-coded value (magic number); it cannot be specified by the user.
  • Required: no default value; the value must be specified.

Reposted from: https://github.com/keisuke-umezawa/deeplearning-default-value

Reposted from blog.csdn.net/shanglianlm/article/details/85083351