TensorFlow notes: a summary of functions for beginners

1. tf.Variable()

tf.Variable(initial_value, trainable=True, collections=None, validate_shape=True, name=None)

initial_value: any type convertible to a Tensor; the initial value of the variable
trainable: bool; if True, the variable is added to GraphKeys.TRAINABLE_VARIABLES so an Optimizer can update it
collections: list; the graph collections the variable belongs to, default [GraphKeys.GLOBAL_VARIABLES]
validate_shape: bool; if False, no type or shape checking is performed
name: string; the name of the variable; if not specified, the system assigns a unique name automatically

tf.Variable() creates a variable node in the computation graph.

Appendix: functions commonly used to generate the initial value for tf.Variable():

tf.truncated_normal(shape, mean, stddev): shape gives the dimensions of the generated tensor, mean is the mean, and stddev is the standard deviation. This function draws values from a normal distribution with the given mean and standard deviation, but the distribution is truncated: any value that falls more than two standard deviations from the mean is discarded and redrawn. Unlike the values produced by an ordinary normal distribution, the values returned by this function are therefore guaranteed to lie within two standard deviations of the mean.
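The truncation rule can be sketched in plain Python, with no TensorFlow needed (the function name and 1-D simplification here are illustrative, not the real implementation, which fills a tensor of arbitrary shape):

```python
import random

def truncated_normal(n, mean=0.0, stddev=1.0):
    """Draw n samples from a normal distribution, redrawing any sample
    that lies more than two standard deviations from the mean (the same
    rule tf.truncated_normal applies)."""
    samples = []
    while len(samples) < n:
        x = random.gauss(mean, stddev)
        if abs(x - mean) <= 2 * stddev:  # keep only values within 2*stddev
            samples.append(x)
    return samples

values = truncated_normal(1000, mean=0.0, stddev=0.5)
# Every sample lies within two standard deviations of the mean.
print(all(abs(v) <= 1.0 for v in values))
```

With tf.random_normal, by contrast, a small fraction of samples would fall outside that range.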

tf.random_normal(shape, mean, stddev): outputs random values drawn from a normal distribution.

tf.constant(value, dtype=None, shape=None, name='Const'): creates a constant tensor whose entries are filled from the given value; shape can be used to specify its shape. value can be a number or a list. If it is a number, every element of the constant tensor is set to that number. If it is a list, len(value) must be less than or equal to the number of elements implied by shape; the values in the list are stored one by one, and if there are not enough of them, the remaining elements are all filled with the last value of the list.
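The fill rule described above can be mimicked in plain Python (a sketch of the semantics only; the helper name is made up for illustration):

```python
def constant_fill(value, num_elements):
    """Mimic how tf.constant fills a tensor with num_elements entries:
    a scalar is broadcast to every position; a list is copied element by
    element, and the last list value pads any remaining positions."""
    if not isinstance(value, list):
        return [value] * num_elements
    if len(value) > num_elements:
        raise ValueError("len(value) must not exceed the number of elements")
    return value + [value[-1]] * (num_elements - len(value))

print(constant_fill(7, 4))       # [7, 7, 7, 7]
print(constant_fill([1, 2], 5))  # [1, 2, 2, 2, 2]
```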


2. Activation function

The role of an activation function is to introduce nonlinearity into the neural network so that it can solve more complex problems. An activation function is applied elementwise and does not change the dimensions of its input.
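As a plain-Python sketch (no TensorFlow), here are two of the activations listed below applied elementwise; note that the output has the same length as the input:

```python
import math

def relu(xs):
    """Elementwise max(0, x), as computed by tf.nn.relu."""
    return [max(0.0, x) for x in xs]

def sigmoid(xs):
    """Elementwise 1 / (1 + e^-x), as computed by tf.nn.sigmoid."""
    return [1.0 / (1.0 + math.exp(-x)) for x in xs]

xs = [-2.0, -0.5, 0.0, 1.5]
print(relu(xs))                     # [0.0, 0.0, 0.0, 1.5]
print(len(sigmoid(xs)) == len(xs))  # True: the dimension is unchanged
```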

tf.nn.relu()

tf.nn.sigmoid()

tf.nn.tanh()

tf.nn.elu()

tf.nn.bias_add()

tf.nn.crelu()

tf.nn.relu6()

tf.nn.softplus()

tf.nn.softsign()

tf.nn.dropout()

tf.nn.relu_layer(x, weights, biases,name=None)
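tf.nn.relu_layer combines a matrix multiply, a bias add, and a ReLU in a single call: relu(matmul(x, weights) + biases). A rough pure-Python equivalent (a sketch of the semantics, using nested lists instead of tensors):

```python
def relu_layer(x, weights, biases):
    """Sketch of tf.nn.relu_layer: relu(x @ weights + biases).
    x: list of rows (batch, in_dim); weights: (in_dim, out_dim);
    biases: (out_dim)."""
    out = []
    for row in x:
        # Matrix multiply one row by weights, then add the bias vector.
        projected = [
            sum(row[i] * weights[i][j] for i in range(len(row))) + biases[j]
            for j in range(len(biases))
        ]
        # Apply ReLU elementwise.
        out.append([max(0.0, v) for v in projected])
    return out

x = [[1.0, -1.0]]
weights = [[2.0, 0.0], [0.0, 3.0]]
biases = [0.5, 0.5]
print(relu_layer(x, weights, biases))  # [[2.5, 0.0]]
```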

3. Exponential Decay Learning Rate

learning_rate = tf.train.exponential_decay(LEARNING_RATE_BASE, global_step, LEARNING_RATE_STEP, LEARNING_RATE_DECAY, staircase=True)

Learning rate formula: learning_rate = LEARNING_RATE_BASE * LEARNING_RATE_DECAY ^ (global_step / LEARNING_RATE_STEP). With staircase=True, the exponent global_step / LEARNING_RATE_STEP is rounded down to an integer, so the learning rate decays in discrete steps rather than smoothly.
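The formula, including the staircase behavior, can be checked in plain Python (a sketch of the computation tf.train.exponential_decay performs, not TensorFlow itself):

```python
def exponential_decay(base_rate, global_step, decay_steps, decay_rate,
                      staircase=False):
    """Sketch of tf.train.exponential_decay:
    base_rate * decay_rate ** (global_step / decay_steps).
    With staircase=True the exponent is truncated to an integer, so the
    learning rate drops once every decay_steps steps."""
    exponent = global_step / decay_steps
    if staircase:
        exponent = global_step // decay_steps  # integer division
    return base_rate * decay_rate ** exponent

# Smooth vs. staircase decay at step 150 of an every-100-steps schedule:
print(exponential_decay(0.1, 150, 100, 0.96))                  # ~0.094
print(exponential_decay(0.1, 150, 100, 0.96, staircase=True))  # 0.096
```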
