Machine Learning Share: Deriving the Back-Propagation Algorithm

Backpropagation (English: Backpropagation, abbreviated BP) is short for "error back-propagation". Used together with an optimization method such as gradient descent, it is a common way to train artificial neural networks. The method computes the gradient of the loss function with respect to every weight in the network; that gradient is then fed to the optimization method, which uses it to update the weights so as to minimize the loss function.
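In symbols, assuming plain gradient descent with learning rate η (the update rule used throughout the derivation below), each weight w moves against its gradient:

```latex
w \leftarrow w - \eta \, \frac{\partial E_{total}}{\partial w}
```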

Many students learning deep neural networks do not understand the details of back-propagation. A foreign tech blog once gave a very clear derivation by example. We have translated it and provide the relevant code; interested students can check it out.

Relevant code

Original Address

Suppose you have a network like this: an input layer, a hidden layer, and an output layer.

Now assign them initial values, as shown below:
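The figure with the concrete numbers did not survive in this copy. As a stand-in, the sketch below assumes the widely used 2-2-2 example network from the original English post: two inputs, two hidden neurons, two outputs, and one bias per layer. All concrete numbers here are assumptions taken from that example, not from this page.

```python
# Assumed initial values, following the standard 2-2-2 example network
# (two inputs, two hidden units, two outputs; numbers are assumptions).
i1, i2 = 0.05, 0.10                        # inputs
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30    # input -> hidden weights
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55    # hidden -> output weights
b1, b2 = 0.35, 0.60                        # hidden-layer and output-layer biases
target_o1, target_o2 = 0.01, 0.99          # desired outputs
```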

Forward propagation

1. Input layer ----> hidden layer:
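The equation images are missing here; under the assumptions above, the hidden-layer forward pass is a weighted sum followed by the logistic (sigmoid) activation. A minimal sketch, continuing the values defined above:

```python
import math

def sigmoid(x):
    """Logistic activation used throughout this example."""
    return 1.0 / (1.0 + math.exp(-x))

# Hidden layer: weighted sum of the inputs plus bias, then sigmoid.
net_h1 = w1 * i1 + w2 * i2 + b1    # = 0.3775 with the assumed numbers
net_h2 = w3 * i1 + w4 * i2 + b1
out_h1 = sigmoid(net_h1)           # ≈ 0.593269
out_h2 = sigmoid(net_h2)           # ≈ 0.596884
```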

2. Hidden layer ----> output layer:
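The output layer repeats the same pattern on the hidden activations (again a sketch under the same assumptions):

```python
# Output layer: weighted sum of the hidden activations plus bias, then sigmoid.
net_o1 = w5 * out_h1 + w6 * out_h2 + b2
net_o2 = w7 * out_h1 + w8 * out_h2 + b2
out_o1 = sigmoid(net_o1)           # ≈ 0.751365
out_o2 = sigmoid(net_o2)           # ≈ 0.772928
```

This completes the forward pass. The outputs are still far from the targets 0.01 and 0.99, which is exactly what back-propagation will correct.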

Back-propagation process

Now we can run the back-propagation calculation.

1. Calculate the total error:
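The error used in this example is the squared error summed over both outputs, E_total = Σ ½(target − output)². Continuing the sketch:

```python
# Squared error for each output, and the total.
E_o1 = 0.5 * (target_o1 - out_o1) ** 2     # ≈ 0.274811
E_o2 = 0.5 * (target_o2 - out_o2) ** 2     # ≈ 0.023560
E_total = E_o1 + E_o2                      # ≈ 0.298371
```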

2. Hidden layer ----> output layer weight update:
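Take w5 as an example. The formula images are gone, but the derivative being formed here follows from the chain rule:

```latex
\frac{\partial E_{total}}{\partial w_5}
  = \frac{\partial E_{total}}{\partial out_{o1}}
  \cdot \frac{\partial out_{o1}}{\partial net_{o1}}
  \cdot \frac{\partial net_{o1}}{\partial w_5}
```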


The figure in the original post shows more intuitively how the error is propagated backwards.

We then compute the value of each factor from the numbers above, and finally multiply the three together.

Looking back at the chain-rule formula, we notice that the first two factors recur for every weight feeding o1; they are commonly combined into a single output error term, written δ_o1.
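Numerically, continuing the sketch (the learning rate η = 0.5 is an assumption from the standard example):

```python
# The three factors of the chain rule for w5:
dE_dout_o1 = -(target_o1 - out_o1)         # ∂E_total/∂out_o1   ≈ 0.741365
dout_o1_dnet_o1 = out_o1 * (1 - out_o1)    # sigmoid derivative ≈ 0.186816
dnet_o1_dw5 = out_h1                       # ∂net_o1/∂w5        ≈ 0.593269

delta_o1 = dE_dout_o1 * dout_o1_dnet_o1    # combined error term ≈ 0.138499
dE_dw5 = delta_o1 * dnet_o1_dw5            # ≈ 0.082167

eta = 0.5                                  # learning rate (assumed)
w5_new = w5 - eta * dE_dw5                 # ≈ 0.358916
```

w6, w7, and w8 are updated in exactly the same way.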

3. Input layer ----> hidden layer weight update:

The method is much the same as before, except that out_h1 contributes to both outputs and therefore receives error from both E_o1 and E_o2, so both must be taken into account: we compute ∂E_o1/∂out_h1 and, in the same way, ∂E_o2/∂out_h1.

Adding the two together gives the total error reaching h1.
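In symbols, the sum being formed is:

```latex
\frac{\partial E_{total}}{\partial out_{h1}}
  = \frac{\partial E_{o1}}{\partial out_{h1}}
  + \frac{\partial E_{o2}}{\partial out_{h1}}
```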

Finally, we multiply the three factors together to obtain ∂E_total/∂w1 and update w1.
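Continuing the numeric sketch for w1 (note the gradients use the original weights w5 and w7; all updates are applied only after every gradient has been computed):

```python
# Error arriving at hidden unit h1 from both outputs.
delta_o2 = -(target_o2 - out_o2) * out_o2 * (1 - out_o2)   # ≈ -0.038098
dE_dout_h1 = delta_o1 * w5 + delta_o2 * w7                 # ≈  0.036350

# Chain through h1's activation and down to w1.
dout_h1_dnet_h1 = out_h1 * (1 - out_h1)                    # ≈ 0.241301
dnet_h1_dw1 = i1                                           # = 0.05
dE_dw1 = dE_dout_h1 * dout_h1_dnet_h1 * dnet_h1_dw1        # ≈ 0.000439

w1_new = w1 - eta * dE_dw1                                 # ≈ 0.149781
```

w2, w3, and w4 follow the same pattern.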


This completes one round of error back-propagation. We then recompute with the updated weights and keep iterating until the error is small enough.
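To make the iteration concrete, here is a self-contained sketch that packages one forward and backward pass and loops it (a sketch under the same assumed numbers; see the linked complete code for the original version):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(w, i1, i2, b1, b2, t1, t2, eta=0.5):
    """One forward + backward pass over the 2-2-2 example network.
    w is [w1..w8]; returns (updated weights, total error)."""
    w1, w2, w3, w4, w5, w6, w7, w8 = w
    # Forward pass.
    h1 = sigmoid(w1 * i1 + w2 * i2 + b1)
    h2 = sigmoid(w3 * i1 + w4 * i2 + b1)
    o1 = sigmoid(w5 * h1 + w6 * h2 + b2)
    o2 = sigmoid(w7 * h1 + w8 * h2 + b2)
    E = 0.5 * (t1 - o1) ** 2 + 0.5 * (t2 - o2) ** 2
    # Output-layer error terms (delta = ∂E/∂net).
    d1 = (o1 - t1) * o1 * (1 - o1)
    d2 = (o2 - t2) * o2 * (1 - o2)
    # Hidden-layer error terms: error from both outputs flows back.
    dh1 = (d1 * w5 + d2 * w7) * h1 * (1 - h1)
    dh2 = (d1 * w6 + d2 * w8) * h2 * (1 - h2)
    # Gradient-descent updates for all eight weights.
    new_w = [w1 - eta * dh1 * i1, w2 - eta * dh1 * i2,
             w3 - eta * dh2 * i1, w4 - eta * dh2 * i2,
             w5 - eta * d1 * h1,  w6 - eta * d1 * h2,
             w7 - eta * d2 * h1,  w8 - eta * d2 * h2]
    return new_w, E

w = [0.15, 0.20, 0.25, 0.30, 0.40, 0.45, 0.50, 0.55]
for step in range(10000):
    w, E = train_step(w, 0.05, 0.10, 0.35, 0.60, 0.01, 0.99)
print(E)   # the total error shrinks toward zero as the iterations proceed
```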

View the complete code on PC

------------------------------------------- Mo (URL: http://momodel.cn ) is an artificial-intelligence modeling platform that supports Python and can help you quickly develop, train, and deploy AI applications.



Source: blog.csdn.net/weixin_44015907/article/details/89647380