Backpropagation: computing the hidden-layer error
There are many derivations of backpropagation, and with the chain rule they are easy to follow. The derivation mainly covers the step from the output layer back to the hidden layer. The output-layer error is simple: it is just the difference between the output O and the target T, which can be written as E = O - T.
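As a small concrete sketch (the values are made up for illustration), the output-layer error is an element-wise difference:

```python
import numpy as np

# Hypothetical outputs and targets for a network with two output nodes
output = np.array([0.7, 0.3])   # O: actual outputs
target = np.array([1.0, 0.0])   # T: desired outputs

# Output-layer error: E = O - T
error_output = output - target
print(error_output)  # -> [-0.3  0.3]
```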
The error of the intermediate (hidden) layers, however, is usually glossed over.
Suppose the network has two output nodes, whose errors are eo1 and eo2, and let the hidden-layer errors be eh1 and eh2. Writing w_ij for the weight connecting hidden node i to output node j, each output error is divided among the hidden nodes in proportion to the connecting weights:

eh1 = w11/(w11+w21) * eo1 + w12/(w12+w22) * eo2
eh2 = w21/(w11+w21) * eo1 + w22/(w12+w22) * eo2
Expressed as a matrix:

[eh1]   [w11/(w11+w21)  w12/(w12+w22)]   [eo1]
[eh2] = [w21/(w11+w21)  w22/(w12+w22)] * [eo2]

When we calculate the hidden-layer error, the important point is that each hidden node takes a share of the final errors eo1 and eo2 in proportion to its weights, as the ratios above show. Dropping the normalizing denominators, the error can be written simply as:

e_hidden = W^T * e_output

where W is the hidden-to-output weight matrix used in the forward pass.
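A minimal NumPy sketch of both versions (the weight and error values are made up; w[i, j] denotes the weight from hidden node i to output node j):

```python
import numpy as np

# Hypothetical weights: w[i, j] connects hidden node i to output node j
w = np.array([[3.0, 1.0],
              [2.0, 4.0]])
e_out = np.array([0.8, 0.5])  # output-layer errors eo1, eo2

# Exact proportional split: each output error is divided among the
# hidden nodes in proportion to the connecting weight.
col_sums = w.sum(axis=0)            # w11+w21 and w12+w22
e_hidden_prop = (w / col_sums) @ e_out

# Simplified version: drop the normalizing denominators. The forward
# pass uses W = w.T (output x hidden), so this is e_hidden = W^T e_out.
e_hidden_simple = w @ e_out

print(e_hidden_prop)    # -> [0.58 0.72]
print(e_hidden_simple)  # -> [2.9 3.6]
```

The simplified form changes the scale of the errors but keeps their relative proportions driven by the weights, which is why it works in practice for gradient updates.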
If there is any problem with this understanding, I hope readers will point it out.
References:
http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html
https://www.jianshu.com/p/964345dddb70