Stanford CS20 Study Notes (1)

Data Flow Graphs

(1) TF separates the definition of computations from their execution. Concretely, this happens in two phases:

  1. Assemble a graph;
  2. Use a session to execute the operations in the graph.
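
A minimal sketch of the two phases (TF 1.x, as used in CS20); note that phase 1 computes nothing:

import tensorflow as tf

# Phase 1: assemble the graph -- this only defines the computation
a = tf.add(3, 5)

# Phase 2: execute the graph inside a session
with tf.Session() as sess:
    print(sess.run(a))  # 8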

(2) What is a tensor?

An n-dimensional array
0-d tensor: scalar (number)
1-d tensor: vector
2-d tensor: matrix
and so on
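
As a quick illustration (a sketch in TF 1.x), tensors of each rank can be built with tf.constant:

import tensorflow as tf

t0 = tf.constant(3)                  # 0-d tensor: scalar
t1 = tf.constant([1, 2, 3])          # 1-d tensor: vector
t2 = tf.constant([[1, 2], [3, 4]])   # 2-d tensor: matrix
print(t0.shape, t1.shape, t2.shape)  # (), (3,), (2, 2)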

(3) Executing the data flow graph

Parsing and visualizing the data flow graph

import tensorflow as tf
a = tf.add(3, 5)  # builds an Add node in the default graph; nothing runs yet

Visualizing with TensorBoard gives the following figure:

(Figure: TensorBoard visualization of the graph, with auto-named input nodes x and y)
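
For reference, a hedged sketch of how such a figure is produced in TF 1.x: write out the graph with tf.summary.FileWriter (the './graphs' directory is an arbitrary choice), then point TensorBoard at it.

import tensorflow as tf

a = tf.add(3, 5)
# Serialize the default graph so TensorBoard can render it
writer = tf.summary.FileWriter('./graphs', tf.get_default_graph())
writer.close()
# Then run: tensorboard --logdir=./graphs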

Why x, y?
TF automatically names the nodes when you don’t explicitly name them.

Nodes: operators, variables, and constants
Edges: tensors
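
A small sketch of naming nodes explicitly (the name values below are illustrative):

import tensorflow as tf

# Constants and operators are nodes; the tensors flowing between them are the edges
x = tf.constant(3, name='x')
y = tf.constant(5, name='y')
a = tf.add(x, y, name='add')  # omit name=... and TF auto-generates node names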

A data flow graph runs only the parts that are needed.

1 Code

import tensorflow as tf

x = 2
y = 3
add_op = tf.add(x, y)             # 2 + 3 = 5
mul_op = tf.multiply(x, y)        # 2 * 3 = 6
useless = tf.multiply(x, add_op)  # not needed for pow_op
pow_op = tf.pow(add_op, mul_op)   # 5 ** 6
with tf.Session() as sess:
    z = sess.run(pow_op)          # 15625; 'useless' is never executed

2 TensorBoard visualization

(Figure: TensorBoard rendering of the graph above)

3 Subgraphs
(1) To save computation, each run of the graph executes only the subgraph that the op passed to sess.run(op) depends on. In the code above we only want the value of pow_op, and pow_op does not depend on useless, so the session never computes useless.
(2) A graph can also be split into multiple subgraphs that run in parallel or across devices, as sketched below.
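
A minimal sketch of placing subgraphs on different devices with tf.device (this assumes a GPU is present; allow_soft_placement falls back to the CPU otherwise):

import tensorflow as tf

# Pin pieces of the graph to devices; each device executes its own subgraph
with tf.device('/cpu:0'):
    a = tf.constant([1.0, 2.0], name='a')
with tf.device('/gpu:0'):
    b = tf.constant([3.0, 4.0], name='b')
    c = tf.multiply(a, b, name='c')

config = tf.ConfigProto(log_device_placement=True, allow_soft_placement=True)
with tf.Session(config=config) as sess:
    print(sess.run(c))  # [3. 8.]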

4 You can in fact create several graphs at once, but doing so is cumbersome; it is an advanced feature that we will return to when needed.
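
Still, a minimal sketch of building and running a non-default graph:

import tensorflow as tf

g = tf.Graph()             # a second graph, separate from the default graph
with g.as_default():
    a = tf.add(3, 5)       # this node lives in g, not in the default graph

with tf.Session(graph=g) as sess:  # the session must be bound to g explicitly
    print(sess.run(a))     # 8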

5 Why graphs in TF:

  1. Save computation. Only run subgraphs that lead to the values you want to fetch.
  2. Break computation into small, differentiable pieces to facilitate auto-differentiation (see the sketch after this list)
  3. Facilitate distributed computation, spread the work across multiple CPUs, GPUs, TPUs, or other devices
  4. Many common machine learning models are taught and visualized as directed graphs
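
On point 2, a hedged sketch: because the computation is a graph of small differentiable pieces, TF 1.x can derive gradients by walking the graph with tf.gradients.

import tensorflow as tf

x = tf.Variable(2.0)
y = x * x                  # y = x^2, built from differentiable pieces
grad = tf.gradients(y, x)  # dy/dx = 2x, obtained by traversing the graph

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))  # [4.0]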


Reposted from blog.csdn.net/qq_29007291/article/details/81128290