TensorFlow API _ 5 (tensorflow.summary)

In TensorFlow, visualization is done through summary ops and TensorBoard working together.

Basic usage

First, let's be clear: a summary is an op.

Output the network structure

with tf.Session() as sess:
  writer = tf.summary.FileWriter(your_dir, sess.graph)

Run tensorboard --logdir your_dir on the command line, then open 127.0.1.1:6006 in your browser (note: since tf 1.1.0 the TensorBoard address has changed to 0.0.0.0:6006), and you will see your network structure diagram in TensorBoard.

Visualize parameters

# ops
loss = ...
tf.summary.scalar("loss", loss)
merged_summary = tf.summary.merge_all()

init = tf.global_variables_initializer()
with tf.Session() as sess:
  writer = tf.summary.FileWriter(your_dir, sess.graph)
  sess.run(init)
  for i in range(100):
    _, summary = sess.run([train_op, merged_summary], feed_dict)
    writer.add_summary(summary, i)

Now open TensorBoard, and under EVENTS you can see how loss changes with i. If nothing shows up, try adding writer.flush() at the end of the code; the reason is explained later.

Function introduction

  • tf.summary.merge_all: merges all previously defined summary ops

  • FileWriter: creates a file writer that writes summary data to disk

  • tf.summary.scalar(summary_tags, Tensor/variable, collections=None): generates a scalar summary

  • tf.summary.image(tag, tensor, max_images=3, collections=None, name=None): tensor must be 4-dimensional with shape [batch_size, height, width, channels]; max_images caps how many images are generated (3 by default). I find this summary very useful for visualizing convolution kernels. The images generated come from tensor[-max_images:, height, width, channels]; also note that the image summary you see in TensorBoard is always the one from the last global step.

  • tf.summary.histogram(tag, values, collections=None, name=None): values is a tensor of arbitrary shape; generates a histogram summary

  • tf.summary.audio(tag, tensor, sample_rate, max_outputs=3, collections=None, name=None)

A note on the collections parameter: it is a list. If collections is not specified, the summary is added to tf.GraphKeys.SUMMARIES; if it is specified, the summary is placed in the given collections.
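A minimal sketch of this behavior (assuming the TF1-style API, imported through tf.compat.v1 so it also runs on TF2; the collection name my_summaries is just an example):

```python
import tensorflow.compat.v1 as tf  # assumption: TF1-style API via compat

tf.disable_v2_behavior()

x = tf.constant(1.0)

# No collections argument: the summary goes into tf.GraphKeys.SUMMARIES,
# so tf.summary.merge_all() will pick it up.
tf.summary.scalar("default_loss", x)

# Explicit collections: the summary goes only into the listed collections.
tf.summary.scalar("extra_loss", x, collections=["my_summaries"])

default_merged = tf.summary.merge_all()  # sees only "default_loss"

# A custom collection must be merged explicitly.
custom_merged = tf.summary.merge(tf.get_collection("my_summaries"))

print(len(tf.get_collection(tf.GraphKeys.SUMMARIES)))  # default collection
print(len(tf.get_collection("my_summaries")))          # custom collection
```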

 

FileWriter

Note: add_summary only stores event data in the FileWriter object's cache; actually writing the data to disk is controlled by the FileWriter object. This is governed by the constructor parameters below!

tf.summary.FileWriter.__init__(logdir, graph=None, max_queue=10, flush_secs=120, graph_def=None)
# Creates a FileWriter and an event file.
# max_queue: the maximum number of events cached before writing data to disk
# flush_secs: how often, in seconds, to write data to disk and empty the cache
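A hedged sketch of how these parameters are used (assuming the TF1-style API via tf.compat.v1; a temporary directory stands in for your_dir):

```python
import os
import tempfile

import tensorflow.compat.v1 as tf  # assumption: TF1-style API via compat

tf.disable_v2_behavior()

logdir = tempfile.mkdtemp()  # stand-in for your_dir

loss = tf.constant(0.5)
tf.summary.scalar("loss", loss)
merged = tf.summary.merge_all()

with tf.Session() as sess:
    # max_queue: events cached before a write; flush_secs: periodic flush interval
    writer = tf.summary.FileWriter(logdir, sess.graph, max_queue=10, flush_secs=120)
    summary = sess.run(merged)
    writer.add_summary(summary, global_step=0)
    writer.flush()  # force cached events to disk now, instead of waiting up to flush_secs
    writer.close()

print(os.listdir(logdir))  # an events.out.tfevents.* file should appear
```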

Notice

  1. If you call writer.add_summary(summary, global_step) without the global_step argument, the scalar summary will be drawn as a straight line.

  2. Any summary op on the computation graph will be captured by merge_all; there is no need to worry about variable lifetimes!

  3. If the summary data is not saved to disk after a run, try adding file_writer.flush().

 

Tips

If you want summaries organized by layer, remember to wrap the summary in a name_scope:

with tf.name_scope("summary_gradients"):
    tf.summary.histogram("name", gradients)

This way, when displayed in TensorBoard, there will be a first-level directory named summary_gradients.
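As a runnable sketch (assuming the TF1-style API via tf.compat.v1; the tensor and names are illustrative): the name scope becomes the prefix of the summary op's name and tag, which is what TensorBoard uses as the first-level directory.

```python
import tensorflow.compat.v1 as tf  # assumption: TF1-style API via compat

tf.disable_v2_behavior()

gradients = tf.constant([0.1, -0.2, 0.3])  # stand-in for real gradients

with tf.name_scope("summary_gradients"):
    summ = tf.summary.histogram("grad_hist", gradients)

# The scope prefixes the summary op's name, so TensorBoard
# groups it under a summary_gradients directory.
print(summ.op.name)
```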

 
