Disclaimer: This is an original blog post, licensed under the CC 4.0 BY-SA agreement. If you reproduce it, please attach a link to the original source and this statement.
Original link: https://blog.csdn.net/computerme/article/details/80836060
Batch normalization (BN) is widely used in today's CNNs. In TensorFlow, the BN op is exposed through `tf.layers.batch_normalization()`. This op hides BN's internal parameters (the moving mean, moving variance, gamma, and beta) rather than requiring them to be declared explicitly, so some care is needed to use BN correctly both during training and at test/deployment time.
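To make the hidden state concrete, here is a minimal sketch in plain Python (no TensorFlow) of the four variables a BN layer keeps internally: gamma and beta are trainable, while the moving mean and moving variance are maintained by a running average during training. The function and variable names are illustrative, not TensorFlow's API.

```python
import math

def batch_norm_train(batch, state, momentum=0.99, eps=1e-3):
    """Normalize with the current batch's statistics and update
    the moving averages (what TF's update_ops do behind the scenes)."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    # These two assignments correspond to BN's update ops in TensorFlow.
    state["moving_mean"] = momentum * state["moving_mean"] + (1 - momentum) * mean
    state["moving_variance"] = momentum * state["moving_variance"] + (1 - momentum) * var
    return [state["gamma"] * (x - mean) / math.sqrt(var + eps) + state["beta"]
            for x in batch]

# gamma/beta are learned by the optimizer; the moving stats are not.
state = {"gamma": 1.0, "beta": 0.0, "moving_mean": 0.0, "moving_variance": 1.0}
out = batch_norm_train([1.0, 2.0, 3.0], state)
```

If the moving averages are never updated (or never saved), inference falls back to stale statistics, which is exactly the failure mode the following sections guard against.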
### Using BN correctly during training
When training, call `tf.layers.batch_normalization(x, training=is_training, name=scope)` with the `training` parameter set to `True`. In addition, BN's update ops must run after every training step so that the moving statistics are refreshed; this is done by adding a control dependency on the `tf.GraphKeys.UPDATE_OPS` collection:
```python
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    # ensure train_op executes only after update_ops have run
    train_op = optimizer.minimize(loss)
```
### Saving a model with BN correctly
When saving the model, it is not enough to save only `trainable_variables`, because BN's moving mean and moving variance do not belong to `trainable_variables`. For convenience, use `tf.global_variables()` instead. Usage is as follows:
```python
saver = tf.train.Saver(var_list=tf.global_variables())
save_path = saver.save(sess, 'here_is_your_personal_model_path')
```
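To see why saving only the trainable set loses data, here is a small illustration with hypothetical variable names resembling those a graph with one conv layer and one BN layer might contain (the names and scopes are made up for the example):

```python
# name -> whether the variable is trainable
all_vars = {
    "conv1/kernel": True,
    "bn1/gamma": True,            # trainable BN scale
    "bn1/beta": True,             # trainable BN shift
    "bn1/moving_mean": False,     # NOT trainable: skipped by trainable_variables
    "bn1/moving_variance": False, # NOT trainable: skipped by trainable_variables
}

trainable = [name for name, is_trainable in all_vars.items() if is_trainable]
global_vars = list(all_vars)  # analogous to tf.global_variables()
missing = sorted(set(global_vars) - set(trainable))
```

A checkpoint written from `trainable` alone would restore gamma and beta but leave the moving statistics at their initial values, silently degrading inference accuracy.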
### Loading a model with BN correctly
Loading is similar to saving: the variables being restored must also cover the global variables. For example:
```python
saver = tf.train.Saver()
# or equivalently:
saver = tf.train.Saver(tf.global_variables())
saver.restore(sess, 'here_is_your_personal_model_path')
```
PS: at inference time you still call `tf.layers.batch_normalization(x, training=is_training, name=scope)`, but here `training` must be set to `False`, so that BN uses the stored moving statistics instead of the current batch's statistics.
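A minimal sketch (plain Python, illustrative names) of what `training=False` means: normalization uses the restored moving mean and variance, so the output for a given input is deterministic and independent of whatever else is in the batch.

```python
import math

def batch_norm_infer(x, gamma, beta, moving_mean, moving_var, eps=1e-3):
    # With training=False, BN normalizes with the stored moving statistics,
    # not with the statistics of the current batch.
    return gamma * (x - moving_mean) / math.sqrt(moving_var + eps) + beta

# An input equal to the moving mean maps to beta (here 0.0).
y = batch_norm_infer(2.0, gamma=1.0, beta=0.0, moving_mean=2.0, moving_var=1.0)
```

This is why a checkpoint missing the moving statistics (see the saving section) produces wrong inference results even though gamma and beta were restored correctly.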