Data Flow Graphs
(1) TF separates the definition of computations from their execution. Concretely, this happens in two phases:
- assemble a graph;
- use a session to execute the operations in the graph.
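The two-phase pattern can be sketched as follows. This is written against the TF 1.x graph API that these notes use; the `tf.compat.v1` fallback is an assumption added so the same code also runs under TF 2.x:

```python
# TF 1.x graph API; the compat.v1 shim is an assumption for TF 2.x installs.
try:
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()
except (ImportError, AttributeError):
    import tensorflow as tf

# Phase 1: define the computation -- nothing is evaluated yet.
a = tf.add(3, 5)
print(a)  # a Tensor handle, not the number 8

# Phase 2: execute the graph inside a session.
with tf.Session() as sess:
    print(sess.run(a))  # the addition actually runs here and yields 8
```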
(2) What is a Tensor?
An n-dimensional array
0-d tensor: scalar (number)
1-d tensor: vector
2-d tensor: matrix
and so on
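The ranks above can be checked from the static shapes of a few constants (a minimal sketch in the same TF 1.x API; the `tf.compat.v1` fallback is an assumption for TF 2.x):

```python
# TF 1.x graph API; the compat.v1 shim is an assumption for TF 2.x installs.
try:
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()
except (ImportError, AttributeError):
    import tensorflow as tf

scalar = tf.constant(3)                 # 0-d tensor, shape ()
vector = tf.constant([1, 2, 3])         # 1-d tensor, shape (3,)
matrix = tf.constant([[1, 2], [3, 4]])  # 2-d tensor, shape (2, 2)

print(scalar.shape, vector.shape, matrix.shape)
```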
(3) Executing the data flow graph
Parsing and visualizing the data flow graph
import tensorflow as tf
a = tf.add(3, 5)
TensorBoard visualization gives the following graph:
Why x, y?
TF automatically names the nodes when you don’t explicitly name them.
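Those auto-generated names can be avoided by naming nodes explicitly via the `name` argument (a minimal sketch; the `tf.compat.v1` fallback is an assumption for TF 2.x):

```python
# TF 1.x graph API; the compat.v1 shim is an assumption for TF 2.x installs.
try:
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()
except (ImportError, AttributeError):
    import tensorflow as tf

x = tf.constant(3, name='x')
y = tf.constant(5, name='y')
a = tf.add(x, y, name='add')

# The op now shows up in TensorBoard as 'add' rather than an
# auto-generated name like 'Add'.
print(a.op.name)
```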
Nodes: operators, variables, and constants
Edges: tensors
The data flow graph runs only the parts that are actually needed.
1 Code
import tensorflow as tf
x = 2
y = 3
add_op = tf.add(x, y)
mul_op = tf.multiply(x, y)
useless = tf.multiply(x, add_op)
pow_op = tf.pow(add_op, mul_op)
with tf.Session() as sess:
    z = sess.run(pow_op)
2 TensorBoard visualization
3 Subgraphs
(1) To save computation, each run of the graph executes only the subgraph that the op in sess.run(op) depends on. In the code above we only want the value of pow_op, and pow_op does not depend on useless, so the session never computes useless.
(2) A graph can also be split into several subgraphs, which enables parallel / distributed computation.
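This pruning can be observed directly: fetching only pow_op never touches useless, while passing a list of fetches to sess.run makes the session evaluate it too (same TF 1.x API as the notes; the `tf.compat.v1` fallback is an assumption for TF 2.x):

```python
# TF 1.x graph API; the compat.v1 shim is an assumption for TF 2.x installs.
try:
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()
except (ImportError, AttributeError):
    import tensorflow as tf

x = 2
y = 3
add_op = tf.add(x, y)                 # 5
mul_op = tf.multiply(x, y)            # 6
useless = tf.multiply(x, add_op)      # 10, but not needed for pow_op
pow_op = tf.pow(add_op, mul_op)       # 5 ** 6

with tf.Session() as sess:
    # Only the subgraph feeding pow_op runs; useless is skipped.
    z = sess.run(pow_op)
    # Fetching a list forces useless to be evaluated as well.
    z, not_useless = sess.run([pow_op, useless])
```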
4 You can in fact create multiple graphs at the same time, but it is cumbersome; it is an advanced feature, so we will come back to it when needed.
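For reference, a second graph is created with tf.Graph() and made current via as_default() (a minimal sketch of the mechanism only; the `tf.compat.v1` fallback is an assumption for TF 2.x):

```python
# TF 1.x graph API; the compat.v1 shim is an assumption for TF 2.x installs.
try:
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()
except (ImportError, AttributeError):
    import tensorflow as tf

g = tf.Graph()            # a second graph, separate from the default one
with g.as_default():
    b = tf.constant(5)    # this op lives in g, not in the default graph

# Ops added outside the `with` block still go to the default graph.
print(b.graph is g)                       # True
print(b.graph is tf.get_default_graph())  # False

# A session must be bound to the graph it should execute.
with tf.Session(graph=g) as sess:
    print(sess.run(b))
```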
5 Why graphs in TF:
- Save computation. Only run subgraphs that lead to the values you want to fetch.
- Break computation into small, differentiable pieces to facilitate auto-differentiation
- Facilitate distributed computation, spread the work across multiple CPUs, GPUs, TPUs, or other devices
- Many common machine learning models are taught and visualized as directed graphs
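The auto-differentiation point can be illustrated with tf.gradients, which derives a gradient op by walking the graph backwards (a minimal sketch; the `tf.compat.v1` fallback is an assumption for TF 2.x):

```python
# TF 1.x graph API; the compat.v1 shim is an assumption for TF 2.x installs.
try:
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()
except (ImportError, AttributeError):
    import tensorflow as tf

x = tf.constant(3.0)
y = x * x                      # y = x^2, recorded as graph nodes

# tf.gradients traverses the graph from y back to x and
# builds new ops that compute dy/dx = 2x.
grad = tf.gradients(y, x)[0]

with tf.Session() as sess:
    print(sess.run(grad))      # 2 * 3.0 = 6.0
```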