From Perceptron to Neural Network

The perceptron algorithm can be described as a biologically inspired algorithm. As shown in the figure, a neuron receives signals passed over from the preceding layer; each incoming signal is given a fixed weight, and the neuron itself carries a certain threshold, or bias. The weighted signals are summed to produce a value n, which is then fed into an activation function to produce the output signal a. Although the bias belongs to the interior of the neuron, it is added together with the weighted input signals, so for convenience it can be folded in with the inputs. Since a neuron is just a weighted sum followed by an activation, the perceptron can be drawn in the simplified form shown in the figure; the corresponding signal-transformation equations are also shown there.
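The weighted sum and activation described above, with the bias folded into the inputs, can be sketched as follows (the sigmoid activation is an assumption for illustration; the original figure does not name a specific activation function):

```python
import math

def perceptron(inputs, weights):
    """Weighted sum of inputs followed by a sigmoid activation.

    The bias is folded into the inputs: inputs[0] is fixed at 1.0 and
    weights[0] acts as the bias, so n = sum(w_i * p_i) covers both the
    weighted inputs and the bias in a single sum.
    """
    n = sum(w * p for w, p in zip(weights, inputs))  # weighted sum n
    return 1.0 / (1.0 + math.exp(-n))                # activation a = f(n)

# Example: bias -0.5 (the first weight, paired with the constant input 1.0).
a = perceptron([1.0, 0.0, 1.0], [-0.5, 0.8, 0.3])
```

Here the weighted sum is n = -0.5 + 0.0 + 0.3 = -0.2, so the output a = f(-0.2) lies a little below 0.5.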

By combining perceptrons, a neural network can be constructed. As shown in the figure, an input layer (P), a hidden layer (H), and an output layer (a) make up a simple neural network.

Such a diagram generally omits the bias; if the bias is taken into account, the network should be drawn as shown on the right.

Computing the output of a neural network from its input is called forward propagation; computing, from the error between the ideal and actual output, the effect of each input on that output is called back-propagation. The calculation process can be organized neatly with a computation graph; for the network in the figure, the corresponding calculation is shown below.
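Forward propagation through the simple network above can be sketched as a chain of perceptron computations, one per layer (the layer sizes, weights, and sigmoid activation are illustrative assumptions, not values from the original figure):

```python
import math

def sigmoid(n):
    return 1.0 / (1.0 + math.exp(-n))

def forward(p, w_hidden, b_hidden, w_out, b_out):
    """Forward propagation through one hidden layer.

    Each hidden unit is a perceptron: a weighted sum of the inputs plus
    its bias, passed through the activation. The output unit then does
    the same over the hidden activations.
    """
    h = [sigmoid(sum(w * x for w, x in zip(ws, p)) + b)
         for ws, b in zip(w_hidden, b_hidden)]          # hidden layer H
    a = sigmoid(sum(w * x for w, x in zip(w_out, h)) + b_out)  # output a
    return h, a

# Two inputs, two hidden units, one output.
h, a = forward([0.5, -1.0],
               [[0.1, 0.4], [-0.3, 0.2]], [0.0, 0.1],
               [0.6, -0.5], 0.05)
```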

As can be seen, all the weights and biases are our input variables. By repeating the basic perceptron computation, we finally obtain the predicted output value.

So-called back-propagation computes how much each weight and bias contributes to the output error, so that the weights can be further adjusted to make the error smaller. Viewing back-propagation on the computation graph, a pattern can be seen in the figure.
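For a single sigmoid unit this adjustment can be sketched directly: the chain rule gives the gradient of the error with respect to each weight, and each weight moves a small step against its gradient. The squared-error loss and learning rate here are illustrative assumptions:

```python
import math

def sigmoid(n):
    return 1.0 / (1.0 + math.exp(-n))

def backprop_step(p, w, target, lr=0.5):
    """One backpropagation update for a single sigmoid unit.

    With error E = (a - target)^2 / 2, the chain rule gives
    dE/dw_i = (a - target) * a * (1 - a) * p_i,
    and each weight moves opposite its gradient.
    """
    n = sum(wi * pi for wi, pi in zip(w, p))   # forward: weighted sum
    a = sigmoid(n)                             # forward: activation
    delta = (a - target) * a * (1 - a)         # backward: error signal
    return [wi - lr * delta * pi for wi, pi in zip(w, p)]

# Repeated steps drive the output toward the target, shrinking the error.
w = [0.2, -0.1]
for _ in range(500):
    w = backprop_step([1.0, 0.5], w, target=0.8)
```

Each pass combines one forward propagation (to get a) with one backward propagation (to get delta and the weight updates), which is exactly the two-phase calculation the computation graph organizes.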

Reference: Understand the backpropagation method of neural networks in one article (BackPropagation)

Source: www.cnblogs.com/gshang/p/10988180.html