Summary
tf.name_scope and tf.variable_scope in TensorFlow both deal with managing variables; intuitively, they assign variables to different scopes: the name scope (name_scope) and the variable scope (variable_scope). Their corresponding variable creation/retrieval functions are tf.Variable() and tf.get_variable(), and the difference in how these two functions are used also determines the different roles of tf.name_scope() and tf.variable_scope().
Main uses:
- Sharing variables: achieved by combining tf.variable_scope with tf.get_variable().
- Organizing variable names for easier management: both scopes can do this, though in different ways.
- Grouping variables when visualizing the computation graph with TensorBoard: both scopes can do this.
Each is described in detail below.
tf.name_scope() and tf.Variable()
tf.name_scope is mainly used together with tf.Variable() to make parameter naming easier to manage.
tf.Variable() checks on every call whether the requested variable name already exists; if it does, the name is automatically modified and a new variable is created, so tf.Variable() cannot be used to share variables.
Note that tf.name_scope does not affect the names of variables created with tf.get_variable(); also, when sharing is not enabled, calling tf.get_variable() twice with the same name raises an error.
import tensorflow as tf

with tf.name_scope('name_scope_x'):
    var1 = tf.get_variable(name='var1', shape=[1], dtype=tf.float32)
    var3 = tf.Variable(name='var2', initial_value=[2], dtype=tf.float32)
    var4 = tf.Variable(name='var2', initial_value=[2], dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(var1.name, sess.run(var1))
    print(var3.name, sess.run(var3))
    print(var4.name, sess.run(var4))

# Output:
# var1:0 [-0.30036557]          note that the specified 'name_scope_x' prefix is absent
# name_scope_x/var2:0 [ 2.]
# name_scope_x/var2_1:0 [ 2.]   the name was automatically changed to 'var2_1' to avoid clashing with 'var2'
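The asymmetry above also works the other way around: while tf.name_scope leaves tf.get_variable() names untouched, it does prefix the names of ordinary ops such as tf.constant. A minimal sketch (the names 'ops_demo', 'a', and 'v' are made up for illustration; the tf.compat.v1 module is used so the snippet also runs under TensorFlow 2.x, where plain `import tensorflow as tf` no longer provides get_variable):

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

with tf.name_scope('ops_demo'):
    a = tf.constant(1.0, name='a')                         # ordinary op: name IS prefixed
    v = tf.get_variable('v', shape=[], dtype=tf.float32)   # get_variable: name is NOT prefixed

print(a.name)  # ops_demo/a:0
print(v.name)  # v:0
```

This is why tf.name_scope is handy for grouping ops in the TensorBoard graph but useless for controlling tf.get_variable() names.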
tf.variable_scope() and tf.get_variable()
tf.variable_scope() is mainly used together with tf.get_variable(); variable sharing is enabled by setting the reuse parameter on the scope.
import tensorflow as tf

with tf.variable_scope('v_scope') as scope1:
    Weights1 = tf.get_variable('Weights', shape=[2, 3])
    bias1 = tf.get_variable('bias', shape=[3])

# Now share the variables defined above
# note: the variables in the scope below must already exist before reuse=True can be set; otherwise an error is raised
with tf.variable_scope('v_scope', reuse=True) as scope2:
    Weights2 = tf.get_variable('Weights')

print(Weights1.name)
print(Weights2.name)

# Both names refer to the same object in memory:
# v_scope/Weights:0
# v_scope/Weights:0
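Besides opening a second scope with reuse=True, TensorFlow 1.x also lets you flip an already-open scope into reuse mode with scope.reuse_variables(). A small sketch (the scope name 'share_demo' is made up for illustration; tf.compat.v1 is used so the snippet also runs under TensorFlow 2.x):

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

with tf.variable_scope('share_demo') as scope:
    w1 = tf.get_variable('w', shape=[2, 3])
    scope.reuse_variables()       # from here on, get_variable() reuses instead of creating
    w2 = tf.get_variable('w')     # returns the variable created above

print(w1 is w2)                   # True: both references point at the same object
print(w2.name)                    # share_demo/w:0
```

Passing reuse=tf.AUTO_REUSE to tf.variable_scope is another option: it creates the variable on first use and reuses it afterwards.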
tf.variable_scope can also be combined with tf.Variable(), but variables created with tf.Variable() cannot be shared:
import tensorflow as tf

# note how bias1 is defined here
with tf.variable_scope('v_scope') as scope1:
    Weights1 = tf.get_variable('Weights', shape=[2, 3])
    bias1 = tf.Variable([0.52], name='bias')

print(Weights1.name)
print(bias1.name)

# Now try to share the variables defined above
# note: the get_variable() variables in the scope below must already exist before reuse=True can be set; otherwise an error is raised
with tf.variable_scope('v_scope', reuse=True) as scope2:
    Weights2 = tf.get_variable('Weights')
    bias2 = tf.get_variable('bias', [1])

print(Weights2.name)
print(bias2.name)

# This raises an error:
# ValueError: Variable v_scope/bias does not exist, or was not created with tf.get_variable()
Postscript
While modifying a model I ran into a variable-initialization problem in TensorFlow, which I also record here:
I wanted to create a variable holding randomly generated data inside a helper function and then use it in the main function. Even though I ran the initializer in the session before using the variable, TensorFlow still reported an uninitialized-variable error. It turned out I was simply calling things in the wrong order; see the code below:
import tensorflow as tf

def function():
    with tf.variable_scope('test'):
        v = tf.get_variable("v", [1], initializer=tf.constant_initializer(1.0))
    return v

if __name__ == '__main__':
    with tf.Session() as sess:
        v = function()
        tf.global_variables_initializer().run()
        print(sess.run(v))
        # print(sess.run(function()))
        # running the function's return value directly does not initialize it ...
        # that one line puzzled me for a whole afternoon
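The root cause is ordering: tf.global_variables_initializer() builds an op that initializes only the variables existing at the moment it is created, so a variable created afterwards stays uninitialized. A sketch of the failure and the fix (the helper name make_variable and the scope 'demo' are made up for illustration; tf.compat.v1 is used so the snippet also runs under TensorFlow 2.x):

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

def make_variable():
    # hypothetical helper, analogous to function() above
    with tf.variable_scope('demo'):
        return tf.get_variable('v', [1], initializer=tf.constant_initializer(1.0))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # no variables exist yet: initializes nothing
    v = make_variable()                          # the variable is created only now
    # sess.run(v) at this point raises FailedPreconditionError (uninitialized value demo/v)
    sess.run(tf.global_variables_initializer())  # a fresh init op now sees the variable
    print(sess.run(v))                           # [1.]
```

The lesson is to finish building the graph (all function calls that create variables) before constructing and running the initializer.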