TensorFlow Overview

1 What is TensorFlow

TensorFlow is an open-source software library for numerical computation using data flow graphs. It was first released in November 2015.

2 Why choose TensorFlow

2.1 Flexibility and stability

It was originally developed by Google as a foundational framework for both academic research in machine learning and product development.

2.2 Popularity

Compared with existing machine learning frameworks such as Keras, Caffe, PyTorch, Torch, and Theano, TensorFlow has the most stars and repositories on GitHub. Many companies, including Google, OpenAI, Intel, and NVIDIA, use TensorFlow. Its online documentation is also the most comprehensive.

3 Understand Graphs and Sessions

3.1 Data flow graphs

First, we define a graph, and then execute the operations in the graph through a session (the newer eager mode no longer requires this explicit two-step definition), as shown in the following figure:
[figure: graph definition followed by session execution]

3.2 What is a tensor

Simply put, a tensor is an n-dimensional array. In TensorFlow, the different dimensionalities correspond to the following:

  • 0-dimensional tensor: a scalar (a single number)
  • 1-dimensional tensor: a vector
  • 2-dimensional tensor: a matrix
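The rank-to-object correspondence above can be sketched with NumPy arrays, which TensorFlow tensors mirror (a sketch for illustration, not TensorFlow's own API):

```python
import numpy as np

scalar = np.array(5)             # 0-dimensional tensor: a single number
vector = np.array([1, 2, 3])     # 1-dimensional tensor: a vector
matrix = np.array([[1, 2],
                   [3, 4]])      # 2-dimensional tensor: a matrix

# ndim gives the rank (number of dimensions) of each tensor
print(scalar.ndim, vector.ndim, matrix.ndim)  # 0 1 2
```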

import tensorflow as tf
a = tf.add(3, 5)  # builds a graph node; nothing is computed yet

To perform the simple addition above, we can use TensorBoard to visualize the graph:
[figure: TensorBoard graph of the add operation]
When you do not give nodes explicit names, TensorFlow assigns names automatically, so the figure above corresponds to x = 3, y = 5.
There are three kinds of nodes in the figure:

  • Operation operators
  • Variables
  • Constants

Edges represent only one kind of thing:

  • Tensors

The final form of the graph is as follows:
[figure: complete data flow graph]
This is why the library is named TensorFlow:

  • tensors are data
  • tensorflow = tensor + flow = data + flow

Next, how do we get the value of a computed above?

Create a session; the session will compute all the nodes in the graph that node a depends on, as shown in the following figure:
[figure: subgraph evaluated for node a]
The code is as follows:

import tensorflow as tf
a = tf.add(3,5)
with tf.Session() as sess:
	print(sess.run(a))
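For contrast, the eager mode mentioned earlier, which is the default in TensorFlow 2.x, executes operations immediately with no explicit session. A minimal sketch, assuming TensorFlow 2.x is installed:

```python
import tensorflow as tf  # assumes TensorFlow 2.x, where eager execution is on by default

a = tf.add(3, 5)  # executes immediately and returns a concrete tensor
print(int(a))     # 8
```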

3.3 tf.Session()

A session object encapsulates the environment in which operation objects are executed and tensor objects are evaluated. The session also allocates memory for the current variable values. Of the operations defined in the graph, only those that the fetched value depends on are executed when we call run; operations that are not involved are not computed by the session. As shown in the figure below:
[figure: graph containing pow_op and the unfetched useless node]
The code is as follows:

import tensorflow as tf

x = 2
y = 3
add_op = tf.add(x,y)
mul_op = tf.multiply(x, y)
useless = tf.multiply(x, add_op)
pow_op = tf.pow(add_op, mul_op)
with tf.Session() as sess:
	z = sess.run(pow_op)

If we only fetch the value of pow_op, the session will not compute the value of useless.
If you also need the value of useless, simply fetch a list of tensors by changing the last line:

z, not_useless = sess.run([pow_op, useless])
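Putting the pieces together, here is a complete, runnable version of the example. It uses tf.compat.v1 so that the 1.x-style graph/session code also runs under TensorFlow 2.x, which is an assumption on my part since the original targets TF 1.x:

```python
import tensorflow as tf

tf1 = tf.compat.v1            # TF 1.x-style graph/session API inside TF 2.x
tf1.disable_eager_execution()

x, y = 2, 3
add_op = tf1.add(x, y)             # 5
mul_op = tf1.multiply(x, y)        # 6
useless = tf1.multiply(x, add_op)  # computed only because it is fetched below
pow_op = tf1.pow(add_op, mul_op)   # 5 ** 6

with tf1.Session() as sess:
    z, not_useless = sess.run([pow_op, useless])
print(z, not_useless)  # 15625 10
```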

3.4 Why choose graphs

  • Saves computation: only the operations needed for the fetched values are executed; irrelevant operations are skipped
  • Flexible automatic differentiation: the computation is broken into small, separate pieces, which makes automatic differentiation easier to perform
  • Distributed computing: work can be spread across multiple CPUs, GPUs, or TPUs
  • Natural model structure: many common machine learning models are naturally expressed as directed graphs
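The automatic-differentiation point can be illustrated with tf.gradients, which walks the graph backwards through those small pieces. A sketch, again using tf.compat.v1 so it also runs under TensorFlow 2.x:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

x = tf1.placeholder(tf.float32)
y = x * x + 3.0 * x            # y = x^2 + 3x
dy_dx = tf.gradients(y, x)[0]  # dy/dx = 2x + 3, derived from the graph structure

with tf1.Session() as sess:
    grad = sess.run(dy_dx, feed_dict={x: 2.0})
print(grad)  # 7.0
```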


Origin blog.csdn.net/BGoodHabit/article/details/108723842