In TensorFlow, visualization is done with `summary` ops and TensorBoard working together.
Basic usage
First, be clear: a `summary` is an op.

To output the network structure:

```python
with tf.Session() as sess:
    writer = tf.summary.FileWriter(your_dir, sess.graph)
```
Run `tensorboard --logdir your_dir` on the command line, then open `127.0.1.1:6006` in your browser (note: since tf 1.1.0 TensorBoard binds to `0.0.0.0:6006`), and you can see your network structure diagram in TensorBoard.
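A minimal end-to-end sketch of the step above (assumption: written against the `tf.compat.v1` API so it also runs on TF 2.x installs; the tiny graph and the temp logdir are stand-ins):

```python
import os
import tempfile

import tensorflow.compat.v1 as tf  # assumption: TF with the v1 compat API
tf.disable_v2_behavior()

# Build a tiny graph so there is something to visualize.
x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
w = tf.Variable(tf.ones([3, 1]), name="w")
y = tf.matmul(x, w, name="y")

logdir = tempfile.mkdtemp()
with tf.Session() as sess:
    # Passing sess.graph writes the GraphDef into the event file,
    # which TensorBoard renders under the GRAPHS tab.
    writer = tf.summary.FileWriter(logdir, sess.graph)
    writer.close()

# An event file now exists in logdir; point TensorBoard at it:
#   tensorboard --logdir <logdir>
print(sorted(os.listdir(logdir)))
```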
Visualize parameters
```python
# ops
loss = ...
tf.summary.scalar("loss", loss)
merged_summary = tf.summary.merge_all()
init = tf.global_variables_initializer()

with tf.Session() as sess:
    writer = tf.summary.FileWriter(your_dir, sess.graph)
    sess.run(init)
    for i in range(100):
        _, summary = sess.run([train_op, merged_summary], feed_dict)
        writer.add_summary(summary, i)
```
Now open TensorBoard: under the EVENTS tab you can see how `loss` changes with `i`. If nothing shows up, try adding `writer.flush()` at the end of the code; the reason is explained later.
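A self-contained version of the loop above, with the elided `loss = ...` and `feed_dict` filled in by a toy regression problem (assumption: uses `tf.compat.v1` so it also runs on TF 2.x installs; the data and model are stand-ins):

```python
import os
import tempfile

import numpy as np
import tensorflow.compat.v1 as tf  # assumption: TF with the v1 compat API
tf.disable_v2_behavior()

# Toy regression problem standing in for the "loss = ..." above.
x = tf.placeholder(tf.float32, [None, 1])
t = tf.placeholder(tf.float32, [None, 1])
w = tf.Variable(tf.zeros([1, 1]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) - t))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

tf.summary.scalar("loss", loss)
merged_summary = tf.summary.merge_all()
init = tf.global_variables_initializer()

logdir = tempfile.mkdtemp()
data_x = np.random.rand(32, 1).astype(np.float32)
data_t = 2.0 * data_x  # target: t = 2x

with tf.Session() as sess:
    writer = tf.summary.FileWriter(logdir, sess.graph)
    sess.run(init)
    for i in range(100):
        _, summary = sess.run([train_op, merged_summary],
                              feed_dict={x: data_x, t: data_t})
        # The second argument becomes the step on the x-axis in TensorBoard.
        writer.add_summary(summary, i)
    writer.flush()  # force the cached events onto disk
```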
Function introduction
- `tf.summary.merge_all`: combines all previously defined summary ops into one op.
- `FileWriter`: creates a file writer that writes summary data to a file on disk.
- `tf.summary.scalar(summary_tags, Tensor/variable, collections=None)`: summary for a scalar.
- `tf.summary.image(tag, tensor, max_images=3, collections=None, name=None)`: the tensor must be 4-dimensional, with shape `[batch_size, height, width, channels]`; `max_images` caps how many images are generated (at most 3 by default). I think this summary is very useful for visualizing convolution kernels. The generated images are the last slices, `[-max_images:, height, width, channels]`; another point is that the image summary you see in TensorBoard is always the one from the last `global step`.
- `tf.summary.histogram(tag, values, collections=None, name=None)`: `values` can be a tensor of arbitrary shape; generates a histogram summary.
- `tf.summary.audio(tag, tensor, sample_rate, max_outputs=3, collections=None, name=None)`

About the `collections` parameter: it is a list. If `collections` is not specified, the summary is added to `tf.GraphKeys.SUMMARIES`; if specified, the summary is placed in the given `collections`.
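To illustrate the `collections` parameter, here is a sketch (the collection name `"val_summaries"` is mine, not from the API; assumes the `tf.compat.v1` API) that puts one summary in a custom collection and merges only that collection:

```python
import tensorflow.compat.v1 as tf  # assumption: TF with the v1 compat API
tf.disable_v2_behavior()

a = tf.constant(1.0)
b = tf.constant(2.0)

# Goes to the default collection, tf.GraphKeys.SUMMARIES.
tf.summary.scalar("a", a)
# Goes only to the custom collection "val_summaries" (hypothetical name).
tf.summary.scalar("b", b, collections=["val_summaries"])

# merge_all() only sees the default collection ...
default_merged = tf.summary.merge_all()
# ... so the custom collection has to be merged explicitly.
val_merged = tf.summary.merge(tf.get_collection("val_summaries"))

print(len(tf.get_collection(tf.GraphKeys.SUMMARIES)))  # -> 1
print(len(tf.get_collection("val_summaries")))         # -> 1
```

Running `default_merged` in a session would produce only the `"a"` summary, `val_merged` only the `"b"` summary.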
FileWriter
Note: `add_summary` only stores the `event data` in the `FileWriter` object's cache. Writing the data to `disk` is controlled by the `FileWriter` object itself, through the constructor arguments below!!!
```python
tf.summary.FileWriter.__init__(logdir, graph=None, max_queue=10, flush_secs=120, graph_def=None)
# Creates a FileWriter and an event file.
# max_queue: the maximum number of events cached before writing data to disk
# flush_secs: how often (in seconds) to write data to disk and empty the cache
```
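A short sketch of how `max_queue`/`flush_secs` interact with manual flushing (assumption: `tf.compat.v1` API; the temp logdir and the hand-built `tf.Summary` proto are illustrative):

```python
import os
import tempfile

import tensorflow.compat.v1 as tf  # assumption: TF with the v1 compat API
tf.disable_v2_behavior()

logdir = tempfile.mkdtemp()
# Small queue, long flush interval: events sit in the writer's cache
# until 120 s pass, the queue fills, or we flush by hand.
writer = tf.summary.FileWriter(logdir, max_queue=10, flush_secs=120)

# A summary can also be built directly as a proto, without running ops.
summary = tf.Summary(value=[tf.Summary.Value(tag="x", simple_value=1.0)])
writer.add_summary(summary, global_step=0)

writer.flush()  # force the cached event onto disk now
writer.close()

print(sorted(os.listdir(logdir)))
```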
Notice
- If no `global_step` argument is passed to `writer.add_summary(summary, global_step)`, the `scalar_summary` turns into a straight line.
- As long as a `summary op` is on the computation graph, `merge_all` will capture it; there is no need to worry about variable lifetimes!
- If you run once and the `summary` data is not saved to `disk`, try `file_writer.flush()` next.
Tips
If you want summaries organized by layer, remember to add a `name_scope` around the `summary`:

```python
with tf.name_scope("summary_gradients"):
    tf.summary.histogram("name", gradients)
```

This way, when displayed in TensorBoard, there will be a first-level directory `summary_gradients`.
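A runnable sketch of this tip (assumption: `tf.compat.v1` API; `gradients` here is a random stand-in tensor, not real gradients):

```python
import tensorflow.compat.v1 as tf  # assumption: TF with the v1 compat API
tf.disable_v2_behavior()

gradients = tf.random_normal([100])  # stand-in for real gradient values

with tf.name_scope("summary_gradients"):
    hist = tf.summary.histogram("name", gradients)

# The summary op's name is prefixed with the scope, which is what
# TensorBoard uses to group it under a "summary_gradients" directory.
print(hist.op.name)
```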