TensorFlow Study Notes (2): tf.nn.dropout vs. tf.layers.dropout

A quick glance through tensorflow/python/layers/core.py and tensorflow/python/ops/nn_ops.py reveals that tf.layers.dropout is a wrapper around tf.nn.dropout.

In practice you want to use the dropout() function in tensorflow.contrib.layers (or the equivalent tf.layers.dropout), not the one in tensorflow.nn.

The only differences between the two functions are:

  1. tf.nn.dropout takes the parameter keep_prob, "the probability that each element is kept",
     while tf.layers.dropout takes the parameter rate, "the dropout rate".
     Thus keep_prob = 1 - rate.
  2. tf.layers.dropout has a training parameter: "Whether to return the output in training mode (apply dropout) or in inference mode (return the input untouched)." In other words, tf.layers.dropout turns into a no-op when not training, which is what you want, while tf.nn.dropout always applies dropout; see the sketch after this list.
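
A minimal TF 1.x sketch of the two calls to make the differences concrete (the placeholder shapes and the is_training feed are illustrative assumptions, not part of the original post):

    import tensorflow as tf  # TF 1.x API, as discussed above

    x = tf.placeholder(tf.float32, shape=[None, 128])  # illustrative input
    is_training = tf.placeholder(tf.bool)               # feed True when training, False at inference

    # tf.nn.dropout: keep_prob is the probability of *keeping* each element,
    # and dropout is always applied; you must bypass it yourself at inference.
    y_nn = tf.nn.dropout(x, keep_prob=0.5)

    # tf.layers.dropout: rate is the probability of *dropping* each element
    # (rate = 1 - keep_prob), and training=False makes it return x untouched.
    y_layers = tf.layers.dropout(x, rate=0.5, training=is_training)

With keep_prob=0.5 and rate=0.5 the two ops drop elements with the same probability during training; the difference shows up at inference, where feeding is_training=False turns y_layers into a pass-through while y_nn keeps dropping elements.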



Reposted from blog.csdn.net/zwqjoy/article/details/79282957