A detailed tutorial on convolutional neural networks and their basic operations

What is the convolutional neural network algorithm?

Convolutional layers can be built in one-dimensional, two-dimensional, and fully convolutional forms.

Convolutional Neural Networks (CNNs) are a class of feedforward neural networks that perform convolution operations and have a deep structure. They are among the representative algorithms of deep learning.

Convolutional neural networks have representation learning capabilities and can perform shift-invariant classification of input information according to its hierarchical structure. For this reason, they are also known as Shift-Invariant Artificial Neural Networks (SIANN).
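The shift-invariance property stems from the fact that a convolution commutes with translation: shifting the input shifts the feature map by the same amount. A minimal NumPy sketch (the image, kernel, and shift amount are illustrative, and a naive "valid" cross-correlation stands in for the convolutional layer):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D cross-correlation, as used in CNN layers."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
kernel = rng.standard_normal((3, 3))

image = np.zeros((8, 8))
image[2, 2] = 1.0                          # a single bright pixel
shifted = np.roll(image, 1, axis=1)        # same pattern, one column to the right

out = conv2d_valid(image, kernel)
out_shifted = conv2d_valid(shifted, kernel)

# The feature map of the shifted input is the shifted feature map
# (ignoring the boundary column):
print(np.allclose(out_shifted[:, 1:], out[:, :-1]))  # True
```

Because the same detector fires wherever the pattern appears, a classifier built on such feature maps can recognize a pattern regardless of its position.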

Connectivity of convolutional neural networks: the connections between convolutional layers are called sparse connections. That is, in contrast to the full connections of a feedforward neural network, each neuron in a convolutional layer is connected to only some, not all, of the neurons in its adjacent layers.

Specifically, any pixel (neuron) in the feature map of layer l is a linear combination only of the pixels inside the receptive field defined by the convolution kernel in layer l-1.
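This can be made concrete with a small NumPy sketch (the 5x5 input and 3x3 kernel sizes are illustrative): each output pixel touches only the 9 pixels of its receptive field, never all 25 pixels of the previous layer.

```python
import numpy as np

rng = np.random.default_rng(1)
prev_layer = rng.standard_normal((5, 5))   # feature map of layer l-1
kernel = rng.standard_normal((3, 3))       # convolution kernel of layer l

# "Valid" output is (5-3+1) x (5-3+1) = 3x3. Every pixel (i, j) of
# layer l is a linear combination of the 3x3 receptive field
# prev_layer[i:i+3, j:j+3] -- 9 inputs, not 25.
feature_map = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        receptive_field = prev_layer[i:i + 3, j:j + 3]
        feature_map[i, j] = np.sum(receptive_field * kernel)

print(feature_map.shape)  # (3, 3)
```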

The sparse connections of a convolutional neural network have a regularization effect: they improve the stability and generalization ability of the network and help avoid overfitting. At the same time, sparse connections reduce the total number of weight parameters, which speeds up learning and reduces memory overhead during computation.

In a convolutional neural network, all pixels in the same channel of the feature map share a set of convolution kernel weight coefficients. This property is called weight sharing.

Weight sharing distinguishes convolutional neural networks from other neural networks that contain local connection structures. Although the latter uses sparse connections, the weights of different connections are different. Like sparse connections, weight sharing reduces the total number of parameters of the convolutional neural network and has a regularization effect.
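The parameter savings from sparse connections plus weight sharing can be quantified with simple arithmetic (the 28x28 input and 3x3 kernel below are illustrative choices): a single-channel convolutional layer needs only the shared kernel and a bias, while a fully connected layer producing the same output size needs a weight per input-output pair.

```python
# Parameter count: convolution vs. fully connected, mapping a 28x28
# input to a 26x26 single-channel feature map with a 3x3 kernel.
in_h, in_w = 28, 28
k = 3
out_h, out_w = in_h - k + 1, in_w - k + 1      # 26 x 26 "valid" output

conv_params = k * k + 1                          # shared kernel + one bias
fc_params = (in_h * in_w + 1) * (out_h * out_w)  # dense weights + biases

print(conv_params)  # 10
print(fc_params)    # 530660
```

Ten parameters versus roughly half a million: this gap is why convolutional layers train faster and overfit less than dense layers of comparable output size.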

From the perspective of a fully connected network, the sparse connections and weight sharing of a convolutional neural network can be regarded as two infinitely strong priors: all weight coefficients of a hidden-layer neuron outside its receptive field are identically zero (though the receptive field can move in space), and within one channel, all neurons share the same weight coefficients.
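The two priors can be exhibited directly by writing a convolution as a fully connected layer whose weight matrix is constrained accordingly. A 1-D sketch (the input length and kernel values are illustrative): each row of the dense matrix is zero outside one receptive field and carries the same shared kernel.

```python
import numpy as np

x = np.arange(6, dtype=float)          # input of length 6
w = np.array([1.0, -2.0, 0.5])         # shared 3-tap kernel

n_out = len(x) - len(w) + 1            # 4 outputs ("valid" convolution)
W = np.zeros((n_out, len(x)))          # dense weight matrix of the FC view
for i in range(n_out):
    W[i, i:i + 3] = w                  # prior 1: zeros outside receptive field
                                       # prior 2: every row reuses the same w

fc_out = W @ x                         # fully connected view
conv_out = np.array([np.dot(x[i:i + 3], w) for i in range(n_out)])  # conv view
print(np.allclose(fc_out, conv_out))   # True
```

Under these two constraints the dense layer computes exactly the convolution, which is why the priors are called "infinitely strong": they force most weights to zero and tie the rest together.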


Origin blog.csdn.net/aifamao2/article/details/127443890