Getting Variables & Printing Weights in TensorFlow


When working with TensorFlow, we often need to read the value of a particular variable, for example to print the weights of a given layer. Normally we can do this directly through the variable's name attribute. However, when we use a third-party library to build the network's layers, there is a situation where we never define those variables ourselves, because the library creates them automatically. For example, when using TensorFlow's slim library:

def resnet_stack(images, output_shape, hparams, scope=None):
  """Create a resnet style transfer block.

  Args:
    images: [batch-size, height, width, channels] image tensor to feed as input
    output_shape: output image shape in form [height, width, channels]
    hparams: hparams objects
    scope: Variable scope

  Returns:
    Images after processing with resnet blocks.
  """
  end_points = {}
  if hparams.noise_channel:
    # separate the noise for visualization
    end_points['noise'] = images[:, :, :, -1]
  assert images.shape.as_list()[1:3] == output_shape[0:2]

  with tf.variable_scope(scope, 'resnet_style_transfer', [images]):
    with slim.arg_scope(
        [slim.conv2d],
        normalizer_fn=slim.batch_norm,
        kernel_size=[hparams.generator_kernel_size] * 2,
        stride=1):
      net = slim.conv2d(
          images,
          hparams.resnet_filters,
          normalizer_fn=None,
          activation_fn=tf.nn.relu)
      for block in range(hparams.resnet_blocks):
        net = resnet_block(net, hparams)
        end_points['resnet_block_{}'.format(block)] = net

      net = slim.conv2d(
          net,
          output_shape[-1],
          kernel_size=[1, 1],
          normalizer_fn=None,
          activation_fn=tf.nn.tanh,
          scope='conv_out')
      end_points['transferred_images'] = net
    return net, end_points

How can we get the weights of the first convolutional layer here?

During training, TensorFlow keeps all trainable variables in tf.trainable_variables(). So we can print tf.trainable_variables() to find the name of that convolutional layer's weight variable (or work the name out yourself from the variable scopes), and then fetch the tensor with tf.get_default_graph().get_tensor_by_name. Here is a simple example:

import tensorflow as tf
with tf.variable_scope("generate"):
    with tf.variable_scope("resnet_stack"):
        # For simplicity, no third-party library is used here.
        bias = tf.Variable(0.0, name="bias")
        weight = tf.Variable(0.0, name="weight")

for tv in tf.trainable_variables():
    print(tv.name)

b = tf.get_default_graph().get_tensor_by_name("generate/resnet_stack/bias:0")
w = tf.get_default_graph().get_tensor_by_name("generate/resnet_stack/weight:0")

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    print(sess.run(b))
    print(sess.run(w))

The output is as follows:

generate/resnet_stack/bias:0
generate/resnet_stack/weight:0
0.0
0.0

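Coming back to the slim example at the top, the same trick applies: slim assigns the variable names automatically (by default the first slim.conv2d in a scope creates Conv/weights and Conv/biases), so printing tf.trainable_variables() first is still the safest way to discover the exact name. Below is a minimal, self-contained sketch assuming TensorFlow 1.x with tf.contrib.slim; the scope name 'resnet_style_transfer' and the 'Conv/weights' suffix are assumptions based on slim's default naming, so confirm them against what tf.trainable_variables() actually prints:

import tensorflow as tf
import tensorflow.contrib.slim as slim

# Build a single slim conv layer inside the same default scope name that
# resnet_stack uses; slim chooses the variable names ('Conv/weights', 'Conv/biases').
images = tf.placeholder(tf.float32, [None, 32, 32, 3])
with tf.variable_scope('resnet_style_transfer'):
    net = slim.conv2d(images, 64, [3, 3], activation_fn=tf.nn.relu)

# Step 1: list the trainable variables to see the names slim generated.
for tv in tf.trainable_variables():
    print(tv.name)  # e.g. resnet_style_transfer/Conv/weights:0

# Step 2: fetch the first convolution's kernel by the name printed above.
w = tf.get_default_graph().get_tensor_by_name('resnet_style_transfer/Conv/weights:0')

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    print(sess.run(w).shape)  # (3, 3, 3, 64): kernel height, width, in channels, out channels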
Reprinted from blog.csdn.net/cassiepython/article/details/79179044