The tf.reduce_sum() function

reduce_sum() is TensorFlow's built-in operation for summing a tensor. Its parameters:

input_tensor: required; the tensor to be summed.

axis: the dimension(s) to reduce along; if None, all dimensions are reduced and a single scalar is returned.

keep_dims: whether to retain the reduced dimensions (with length 1) after summing, instead of dropping them.

name: an optional name for the operation, which may appear in the graph.

reduction_indices: deprecated; replaced by the axis parameter.

x = tf.constant([[1, 1, 1], [1, 1, 1]])
tf.reduce_sum(x, 0)  # sum along axis 0: [1, 1, 1] + [1, 1, 1] = [2, 2, 2]
tf.reduce_sum(x, 1)  # sum along axis 1: [1 + 1 + 1, 1 + 1 + 1] = [3, 3]
tf.reduce_sum(x, 1, keep_dims=True)  # sum along axis 1 without reducing the dimension: [[3], [3]]
tf.reduce_sum(x, [0, 1])  # sum over axes 0 and 1: 6
tf.reduce_sum(x)  # x has only rank 2, so this also gives 6
The above is the explanation of this function in the official TensorFlow documentation.
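The documented results can be checked with NumPy, whose np.sum follows the same axis and keepdims semantics as tf.reduce_sum (note that NumPy uses the keyword keepdims rather than the older keep_dims). This is a minimal sketch, not the TensorFlow API itself:

```python
import numpy as np

x = np.array([[1, 1, 1],
              [1, 1, 1]])

np.sum(x, axis=0)                 # sum along axis 0 -> [2, 2, 2]
np.sum(x, axis=1)                 # sum along axis 1 -> [3, 3]
np.sum(x, axis=1, keepdims=True)  # keep the reduced dimension -> [[3], [3]]
np.sum(x, axis=(0, 1))            # reduce over both axes -> 6
np.sum(x)                         # no axis: sum everything -> 6
```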

In practice, reduce_sum() is best understood in terms of dimensions (a view of the data that feels much like Matlab's).


When calling reduce_sum(arg1, arg2), arg1 is the data to be summed, and arg2 takes the value 0 or 1; it is usually passed as reduction_indices=[0] or reduction_indices=[1]. As the figure shows, when arg2 = 0 the matrix is summed vertically, producing one value per column of the original matrix; similarly, when arg2 = 1 the matrix is summed horizontally, producing one value per row; when arg2 is omitted, all elements of the matrix are summed.
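A non-square matrix makes the vertical-versus-horizontal distinction concrete. The sketch below uses NumPy's np.sum, which shares the axis convention described above; the matrix values are hypothetical, chosen only for illustration:

```python
import numpy as np

# A 2x3 matrix: 2 rows, 3 columns (hypothetical values)
m = np.array([[1, 2, 3],
              [4, 5, 6]])

col_sums = np.sum(m, axis=0)  # vertical sum: one value per column -> [5, 7, 9]
row_sums = np.sum(m, axis=1)  # horizontal sum: one value per row -> [6, 15]
total    = np.sum(m)          # no axis: all elements summed -> 21
```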

Seen this way, the reduce_ prefix in the function name is easy to understand: reduce here means "reducing the dimensionality of the matrix", and the part after the underscore names the way the reduction is performed. So reduce_sum() reduces the matrix's dimensionality by summing. Other functions carry the same reduce_ prefix; for example, reduce_mean() averages along a dimension, and so on.
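The same axis convention carries over to reduce_mean(). Again illustrated with NumPy's equivalent np.mean (tf.reduce_mean behaves the same way; the values here are hypothetical):

```python
import numpy as np

m = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

col_means = np.mean(m, axis=0)  # mean of each column -> [2.5, 3.5, 4.5]
row_means = np.mean(m, axis=1)  # mean of each row -> [2.0, 5.0]
```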
---------------------
Author: GeorgeAI
Source: CSDN
Original: https://blog.csdn.net/georgeai/article/details/81030811
Disclaimer: This is an original article by the blogger; please attach a link to the original post when reposting!


Origin www.cnblogs.com/jfdwd/p/11184163.html