How to understand the backpropagation algorithm

Author: Zen and the Art of Computer Programming

1. Introduction

  The backpropagation algorithm is central to training neural networks. Its basic principle is to adjust the network's weights according to a loss function so that the model fits the training data better as training proceeds. The core idea is error backpropagation: starting from the last layer and working back to the first, compute the weight adjustment for each connection between layers, then apply these adjustments to every weight. This updates the parameters of the entire network and improves the final model.
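As a rough illustration of the idea above (not code from this article), the layer-by-layer weight update can be sketched for a tiny two-layer network with NumPy. The network sizes, learning rate, and data below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy data: 4 samples, 3 input features, 1 target value each
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Two-layer network with assumed sizes (3 -> 5 -> 1)
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1
lr = 0.1  # assumed learning rate

losses = []
for step in range(200):
    # Forward pass
    h = np.tanh(X @ W1)      # hidden-layer activations
    y_hat = h @ W2           # linear output layer
    losses.append(np.mean((y_hat - y) ** 2))

    # Backward pass: propagate the error from the last layer back to the first
    d_yhat = 2 * (y_hat - y) / len(X)     # dL/d(y_hat) for mean squared error
    dW2 = h.T @ d_yhat                    # gradient for the output-layer weights
    d_h = (d_yhat @ W2.T) * (1 - h ** 2)  # chain rule through tanh
    dW1 = X.T @ d_h                       # gradient for the first-layer weights

    # Apply the adjustments to every weight
    W2 -= lr * dW2
    W1 -= lr * dW1
```

Running this loop, the recorded loss should fall as the weights are repeatedly adjusted, which is exactly the "update the whole network's parameters" step described above.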

  In 1986, Rumelhart, Hinton and Williams jointly proposed the backpropagation algorithm. In 1988, Gersho et al. also worked on this algorithm, but their method was somewhat more complicated, so here we mainly discuss the original version: the backpropagation algorithm.

2. Explanation of basic concepts and terms

(1) Neural Network

  A neural network (NN) is a network structure composed of many interconnected simple neurons. Each neuron receives input signals, weights them, and produces an output signal through an activation function or other nonlinearity. A network built from several layers of such interacting neurons is called a "multilayer perceptron" or "deep neural network". The basic unit of a neural network is the neuron, which weights its input information to produce output information.
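A single neuron as described here (weighted sum plus activation) can be sketched in a few lines. The input, weight, and bias values below are made up for illustration:

```python
import numpy as np

def neuron(x, w, b):
    """A single neuron: weighted sum of inputs, passed through a sigmoid activation."""
    z = np.dot(w, x) + b             # weight each input signal and add a bias
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid nonlinearity squashes to (0, 1)

# Illustrative values (not from the article)
x = np.array([0.5, -1.0, 2.0])  # input signals
w = np.array([0.2, 0.4, -0.1])  # weights
b = 0.1                         # bias
out = neuron(x, w, b)
```

Stacking many such units, layer upon layer, is what produces the multilayer perceptron.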

(2) Input Layer

  The input layer holds the features of the sample to be predicted. It generally contains several nodes, one per input feature.
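For example, a sample's features map directly onto the input-layer nodes as a vector. The feature names and values here are hypothetical:

```python
import numpy as np

# A hypothetical sample with three features; each feature feeds one input-layer node
sample = {"height": 1.75, "weight": 68.0, "age": 30.0}

# The input layer is simply the feature vector: one node per feature
input_layer = np.array(list(sample.values()))
```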

Origin blog.csdn.net/universsky2015/article/details/132507504