Classroom questions
- What do W1 and W2 mean in a simple 2-layer neural network?
Answer: W1 can still be regarded as a set of templates, just as in the linear classifier we studied earlier; the first layer computes the scores of these templates. W2 then combines those template scores to form new, more complex templates.
For example, in the example above, suppose we have templates for both a left-facing horse and a right-facing horse. An image of a left-facing horse will score higher on one template than on the other, and W2 can combine the two scores so that either orientation still produces a high score for the horse class.
Note: template scores with low values may be suppressed here (e.g. clamped to zero), which is also how nonlinearity is introduced, as shown in the figure.
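The template-combination idea above can be sketched in NumPy. Everything here is a toy illustration with made-up sizes and weights: W1 holds two hypothetical "horse templates" as rows, the ReLU suppresses low template scores, and W2 merges the two scores into a single horse score.

```python
import numpy as np

np.random.seed(0)

# Toy setup (all sizes are illustrative assumptions):
# x is a flattened image, W1 holds two templates as rows,
# e.g. a left-facing horse and a right-facing horse.
x = np.random.randn(12)        # toy 12-pixel input image
W1 = np.random.randn(2, 12)    # two templates

# Template scores; ReLU clamps low scores to zero,
# which is where the nonlinearity comes from.
h = np.maximum(0, W1 @ x)

# W2 combines the two template scores into one "horse" score,
# so either orientation can drive the class score up.
W2 = np.array([[1.0, 1.0]])
horse_score = W2 @ h
print(horse_score.shape)  # (1,)
```

The key point is that without the `max(0, ...)` in between, W2 @ W1 would collapse back into a single linear template.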
1. Introduction to Neural Networks
Unlike a plain linear function, a neural network can be seen as a complex function built by hierarchically stacking a set of simple nonlinear mapping functions. For example, the figure below shows a simple linear function alongside simple 2-layer and 3-layer neural networks.
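The three function classes in the figure can be sketched as follows; all layer sizes here are illustrative assumptions (a flattened 32x32x3 image as input, 10 output scores), and ReLU is used as the nonlinearity between layers.

```python
import numpy as np

np.random.seed(0)
x = np.random.randn(3072)   # e.g. a flattened 32x32x3 image (assumed size)

# Linear function: f = W x
W = np.random.randn(10, 3072)
f_linear = W @ x            # 10 class scores

# 2-layer network: f = W2 max(0, W1 x)
W1 = np.random.randn(100, 3072)
W2 = np.random.randn(10, 100)
f_2layer = W2 @ np.maximum(0, W1 @ x)

# 3-layer network: f = W3 max(0, W2 max(0, W1 x))
V1 = np.random.randn(100, 3072)
V2 = np.random.randn(50, 100)
V3 = np.random.randn(10, 50)
f_3layer = V3 @ np.maximum(0, V2 @ np.maximum(0, V1 @ x))

print(f_linear.shape, f_2layer.shape, f_3layer.shape)
```

Each version still maps the input to 10 scores; the stacked versions can represent far more complex decision functions because of the nonlinearity between the matrix multiplies.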
Single neuron analogy
The dendrites of a neuron receive pulse signals, the cell body processes them, and the result is transmitted to the next neuron through the axon. This is similar to how a single function gate in our computational graph receives inputs, performs a computation, and applies an activation function, though a biological neuron is far more complicated.
For example, some common activation functions are as follows:
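As a reference, here are minimal NumPy definitions of the activation functions most often listed in this context (sigmoid, tanh, ReLU, and leaky ReLU; which ones appear in the figure is an assumption):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real value into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real value into (-1, 1), zero-centered.
    return np.tanh(x)

def relu(x):
    # Clamps negative values to zero.
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small gradient through for x < 0.
    return np.where(x > 0, x, alpha * x)

z = np.array([-2.0, 0.0, 2.0])
print(relu(z))        # [0. 0. 2.]
print(sigmoid(0.0))   # 0.5
```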
Vectorized calculation of neurons
As mentioned above, a "neural unit" receives many inputs, computes on them, and produces an output. Therefore, in general, we treat the inputs of a neuron as a vector for computation. For example, a simple neural network with two hidden layers and its forward propagation process are illustrated below.
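The forward propagation just described can be sketched as a few matrix-vector products. This is a minimal example, not the exact network from the figure: the layer sizes and the choice of sigmoid as the activation are assumptions.

```python
import numpy as np

def sigmoid(z):
    # Elementwise squashing activation.
    return 1.0 / (1.0 + np.exp(-z))

np.random.seed(1)

# Assumed sizes: 3 inputs, two hidden layers of 4 units, 1 output.
x = np.random.randn(3, 1)
W1, b1 = np.random.randn(4, 3), np.random.randn(4, 1)
W2, b2 = np.random.randn(4, 4), np.random.randn(4, 1)
W3, b3 = np.random.randn(1, 4), np.random.randn(1, 1)

# Forward propagation: each layer is a linear map plus activation.
h1 = sigmoid(W1 @ x + b1)   # first hidden layer
h2 = sigmoid(W2 @ h1 + b2)  # second hidden layer
out = W3 @ h2 + b3          # output layer (no activation here)
print(out.shape)  # (1, 1)
```

Treating each layer's inputs and outputs as vectors is what lets the whole layer be computed with one matrix multiply instead of looping over individual neurons.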