A TensorFlow graph contains a set of Operation objects, which are the computation nodes, and Tensor objects, which represent the data flowing between operations.
TensorFlow creates a default graph automatically; it can be retrieved with tf.get_default_graph():
Code:
import tensorflow as tf
import numpy as np
c=tf.constant(value=1)
assert c.graph is tf.get_default_graph()  # c lives in the default graph
print(c.graph)
print(tf.get_default_graph())
Result:
To use a custom graph, enter the context manager returned by Graph.as_default():
Code:
import tensorflow as tf
import numpy as np
c=tf.constant(value=1)
assert c.graph is tf.get_default_graph()  # c lives in the default graph
print(c.graph)
print(tf.get_default_graph())
g=tf.Graph()
print("g:",g)
with g.as_default():
    # inside this block, g is the default graph
    d = tf.constant(value=2)
    print(d.graph)
    # print(g)
g2=tf.Graph()
print("g2:",g2)
g2.as_default()  # no effect: the returned context manager is never entered
e=tf.constant(value=15)
print(e.graph)
Result:
Analysis: In the first example there is exactly one graph, the default graph that c is created in. In the second example, g is created first and d is declared inside the `with g.as_default():` block, so g is the default graph while d is built and d.graph is g. The third case shows the pitfall: `g2.as_default()` is called but never entered with `with`, so it does not take effect, and e is still placed in the original default graph rather than in g2. Three Graph objects exist in total, but operations were only added to two of them.
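The fix for the third case is simply to enter the context manager. The sketch below is written against the TF 1.x graph API via tensorflow.compat.v1 so that it also runs on TensorFlow 2.x; it shows e actually landing in g2, and how a Session can be bound to a specific graph to evaluate its tensors:

```python
# Minimal sketch, assuming the TF 1.x graph API.
# tensorflow.compat.v1 exposes this API on TensorFlow 2.x installs as well;
# on a real 1.x install, plain `import tensorflow as tf` works identically.
import tensorflow.compat.v1 as tf

g2 = tf.Graph()
with g2.as_default():          # entering the context makes g2 the default
    e = tf.constant(value=15)

assert e.graph is g2           # e now belongs to g2, not the old default

# A Session can be bound to a specific graph to run its operations.
with tf.Session(graph=g2) as sess:
    print(sess.run(e))         # prints 15
```

Without the `with` statement, `g2.as_default()` merely constructs the context manager object; only `__enter__` (which `with` calls) pushes g2 onto the default-graph stack.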