Neural network parameters are the weights w on the connections between neurons. They are stored in variables and are usually initialized randomly. A parameter is created by assigning tf.Variable to w, with the generation method written inside the parentheses. The functions commonly used to generate random numbers/arrays in neural networks are:
tf.random_normal() generates normally distributed random numbers
tf.truncated_normal() generates normally distributed random numbers with points farther than two standard deviations from the mean removed and redrawn
tf.random_uniform() generates uniformly distributed random numbers
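A minimal sketch of what these three generators produce, using NumPy as a stand-in for TensorFlow (the shapes and distribution parameters mirror the tf calls; the resampling loop imitates how truncated_normal discards values beyond two standard deviations):

```python
import numpy as np

rng = np.random.default_rng(seed=1)  # fixed seed so runs are reproducible

# Analogue of tf.random_normal([2, 3], mean=0.0, stddev=1.0)
normal = rng.normal(loc=0.0, scale=1.0, size=(2, 3))

# Analogue of tf.truncated_normal: values farther than 2 standard
# deviations from the mean are discarded and redrawn
truncated = rng.normal(loc=0.0, scale=1.0, size=(2, 3))
while np.any(np.abs(truncated) > 2.0):
    bad = np.abs(truncated) > 2.0
    truncated[bad] = rng.normal(loc=0.0, scale=1.0, size=bad.sum())

# Analogue of tf.random_uniform([2, 3], minval=0, maxval=1)
uniform = rng.uniform(low=0.0, high=1.0, size=(2, 3))

print(normal.shape, truncated.shape, uniform.shape)
```

With the same seed, every run produces the same arrays, which is why the seed argument appears in the weight initializers below.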
| Function | Description |
| --- | --- |
| tf.zeros | generates an array of all 0s |
| tf.ones | generates an array of all 1s |
| tf.fill | generates an array filled with a specified value |
| tf.constant | generates an array from directly given values |
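These four can be illustrated with their NumPy analogues (NumPy is used here only so the resulting arrays are easy to inspect; the tf calls take the same shape and value arguments):

```python
import numpy as np

zeros = np.zeros((2, 3))           # analogue of tf.zeros([2, 3]): all 0s
ones = np.ones((2, 3))             # analogue of tf.ones([2, 3]): all 1s
filled = np.full((2, 3), 9.0)      # analogue of tf.fill([2, 3], 9): all 9s
given = np.array([1.0, 2.0, 3.0])  # analogue of tf.constant([1, 2, 3])

print(zeros, ones, filled, given, sep="\n")
```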
```python
# Copyright (c) 2018, Student at School of Software, Northeastern University
# All rights reserved
# File name: test.py
# Author: Kong Yun
# Problem description: Using TensorFlow to implement the forward propagation process
# coding:utf-8
# Two-layer simple neural network (fully connected)
import tensorflow as tf

# Define the input and the parameters
x = tf.constant([[0.7, 0.5]])
w1 = tf.Variable(tf.random_normal([2, 3], stddev=1, seed=1))
w2 = tf.Variable(tf.random_normal([3, 1], stddev=1, seed=1))

# Define the forward propagation process: multiply each layer's input by the
# weights w on its connections, so the output y is computed by matrix multiplication
a = tf.matmul(x, w1)
y = tf.matmul(a, w2)

# Compute the result within a session
with tf.Session() as sess:
    init_op = tf.global_variables_initializer()  # initialize all variables
    sess.run(init_op)
    print("the result of y is:\n", sess.run(y))
```
The results are as follows:
Note: This implements only the forward propagation process of the neural network; given the input x and the initialized weights, the network computes the value of the output y.
In addition: ① If the random seed is removed, the random numbers generated on each run will differ. ② If there are no special requirements, the standard deviation, mean, and random seed can all be omitted.
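The forward pass above is just two matrix multiplications and can be checked by hand with NumPy. The weight values below are made up for illustration (the real w1 and w2 come from tf.random_normal with seed=1); what matters is the shape chain (1, 2) x (2, 3) -> (1, 3), then (1, 3) x (3, 1) -> (1, 1):

```python
import numpy as np

x = np.array([[0.7, 0.5]])        # input, shape (1, 2)

# Hypothetical fixed weights standing in for the randomly initialized w1, w2
w1 = np.array([[1.0, 0.5, -1.0],
               [0.5, 1.0, 2.0]])  # shape (2, 3)
w2 = np.array([[1.0],
               [-1.0],
               [0.5]])            # shape (3, 1)

a = x @ w1                        # hidden layer, shape (1, 3)
y = a @ w2                        # output, shape (1, 1)

print(y)  # a single scalar wrapped in a (1, 1) array
```

With these hypothetical weights, a = [[0.95, 0.85, 0.3]] and y = [[0.25]]; with the seeded random weights in the TensorFlow code the numeric value differs, but the shapes are identical.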