Back-propagation: calculating the hidden-layer error

There are many derivations of back-propagation online, and with the chain rule they are simple to follow. Most of them, however, focus on the step from the output layer back to the hidden layer, where the error is simply the difference between the output O and the target T: E = O - T. The error of the intermediate (hidden) layers is usually glossed over.
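As a quick sketch of the output-layer step, with made-up output and target values (not taken from this post), the error is just an element-wise difference:

```python
# Hypothetical outputs O and training targets T for a
# network with two output nodes.
output = [0.75, 0.20]
target = [1.00, 0.00]

# Output-layer error, element-wise: E = O - T
error = [o - t for o, t in zip(output, target)]
print(error)  # [-0.25, 0.2]
```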
(Figure: a network fragment with hidden nodes h1, h2 fully connected to output nodes o1, o2; w_ij denotes the weight from hidden node i to output node j.)
The two output nodes give errors e_o1 and e_o2. Let the hidden-layer errors be e_h1 and e_h2. Then:
e_h1 = w11/(w11 + w21) * e_o1 + w12/(w12 + w22) * e_o2
e_h2 = w21/(w11 + w21) * e_o1 + w22/(w12 + w22) * e_o2
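As a minimal sketch of this proportional split, with hypothetical weights and output errors (none of these values come from the post):

```python
# Hypothetical weights w_ij (hidden node i -> output node j)
# and output-layer errors.
w11, w12 = 0.40, 0.50
w21, w22 = 0.60, 0.30
e_o1, e_o2 = 0.8, 0.5

# Each hidden node takes a share of every output error in
# proportion to the weight linking it to that output node.
e_h1 = w11 / (w11 + w21) * e_o1 + w12 / (w12 + w22) * e_o2
e_h2 = w21 / (w11 + w21) * e_o1 + w22 / (w12 + w22) * e_o2
print(e_h1, e_h2)
```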
In matrix form:
[e_h1]   [w11/(w11+w21)  w12/(w12+w22)] [e_o1]
[e_h2] = [w21/(w11+w21)  w22/(w12+w22)] [e_o2]
The key point when computing the hidden-layer error is that each hidden node receives a share of the final errors e_o1 and e_o2 in proportion to its weights. Dropping the normalizing denominators, the error can be written simply as:
[e_h1]   [w11  w12] [e_o1]
[e_h2] = [w21  w22] [e_o2]      i.e.  e_hidden = W^T * e_output
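The simplified matrix form can be sketched with NumPy; the weight values here are made up for illustration:

```python
import numpy as np

# Hypothetical forward-pass weights: o = W @ h, so W[j, i] is the
# weight from hidden node i to output node j.
W = np.array([[0.40, 0.60],
              [0.50, 0.30]])
e_o = np.array([0.8, 0.5])  # output-layer errors e_o1, e_o2

# With the normalizing denominators dropped, the hidden-layer error
# is just the transposed weight matrix times the output error.
e_h = W.T @ e_o
print(e_h)  # e_h1 = 0.4*0.8 + 0.5*0.5, e_h2 = 0.6*0.8 + 0.3*0.5
```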
If there is a problem with this understanding, I welcome your corrections.

References:
- http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html
- https://www.jianshu.com/p/964345dddb70

Origin blog.csdn.net/XM_no_homework/article/details/89886679