Depth Articles - The Development History of Classic Deep Learning Network Models (1): Development History Graph, and LeNet-5 Structure and Features


Return to the catalog: The Development History of Classic Deep Learning Network Models

Next: Depth Articles - The Development History of Classic Deep Learning Network Models (2): AlexNet Structure and Features in Detail

 

Series Contents

Depth Articles - The Development History of Classic Deep Learning Network Models (1): Development History Graph, and LeNet-5 Structure and Features

Depth Articles - The Development History of Classic Deep Learning Network Models (2): AlexNet Structure and Features in Detail

Depth Articles - The Development History of Classic Deep Learning Network Models (3): ZF-Net Structure and Features in Detail

Depth Articles - The Development History of Classic Deep Learning Network Models (4): VGG16 Structure and Features in Detail

Depth Articles - The Development History of Classic Deep Learning Network Models (5): GoogleNet Structure and Features in Detail (including v1, v2, v3)

Depth Articles - The Development History of Classic Deep Learning Network Models (6): ResNet Structure and Features in Detail

Depth Articles - The Development History of Classic Deep Learning Network Models (7): Inception-ResNet Structure and Features in Detail

Depth Articles - The Development History of Classic Deep Learning Network Models (8): DenseNet Structure and Features in Detail

Depth Articles - The Development History of Classic Deep Learning Network Models (9): DarkNet Model Structures, Features, and Accuracy in Detail

 

This section presents a graph of the development history of the classic models and describes the structure and features of LeNet-5; the next section elaborates on the structure and features of AlexNet.

 

I. Development History Graph

 

II. Classic Networks

1. LeNet-5

 

Paper: Gradient-Based Learning Applied to Document Recognition

 

LeNet-5 is a convolutional neural network proposed by LeCun in 1998 to solve the visual task of handwritten digit recognition. It does not use padding (i.e., padding is set to "VALID"). Since this is a rather classic network that is basically no longer used as a deep neural network today, I will only talk about its properties below without going into details. Interested readers can look up the paper.

(1) Network description (input: 32 × 32):

   In fact, as can be seen from the structure above, the input is 32×32; a 28×28 input is also possible, but the first layer's padding would then need to be changed to SAME. Because LeNet-5 is a small network, the model itself is also very small: the ckpt model file is under 1 MB, and the frozen pb model file is smaller still. For mobile devices with limited memory, LeNet-5 can sometimes be a godsend.
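The effect of padding on layer sizes can be checked with a little arithmetic. The sketch below (a minimal illustration of my own, not code from the original post) computes the output side length of each LeNet-5 stage under VALID padding, showing why a 32×32 input works, and how SAME padding on the first layer would let a 28×28 input produce the same 28×28 feature map:

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Output side length of a conv/pool layer: floor((n + 2p - k) / s) + 1."""
    return (size + 2 * pad - kernel) // stride + 1

# LeNet-5 with VALID padding (no padding), starting from a 32x32 input:
s = 32
s = conv_out(s, kernel=5)            # C1: 5x5 conv -> 28
s = conv_out(s, kernel=2, stride=2)  # S2: 2x2 pool -> 14
s = conv_out(s, kernel=5)            # C3: 5x5 conv -> 10
s = conv_out(s, kernel=2, stride=2)  # S4: 2x2 pool -> 5
print(s)  # 5; then flattened into the fully connected part (120 -> 84 -> 10)

# With a 28x28 input, SAME padding on the first layer (pad=2 for a 5x5
# kernel) keeps the size at 28x28, matching C1's output above:
print(conv_out(28, kernel=5, pad=2))  # 28
```

This is why changing only the first layer's padding is enough: from C1 onward the two variants see identical feature-map sizes.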

    Network architecture:

  

 

   The training process is as follows:

    As can be seen from the chart, the training results at the time were still quite good: the test error is about 1%. In my earlier post, Depth Articles - CNN Convolutional Neural Networks (4): an MNIST handwritten digit demo project using a tf CNN, accuracy rose to 98%+ with only casual training; with some parameter tuning, or with early stopping and fine-tuning, pushing accuracy to 99% should not be a problem.
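As a rough illustration of the early-stopping idea mentioned above (a hypothetical helper of my own, not part of the original demo project): stop training once the validation loss has not improved for `patience` consecutive epochs.

```python
def should_stop(val_losses, patience=3, min_delta=0.0):
    """Return True when the last `patience` epochs show no improvement
    (by more than min_delta) over the best validation loss before them."""
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])
    recent_best = min(val_losses[-patience:])
    return recent_best >= best_before - min_delta

# Loss plateaus at 0.15: the last 3 epochs never beat the earlier best.
history = [0.90, 0.40, 0.20, 0.15, 0.16, 0.17, 0.15]
print(should_stop(history, patience=3))  # True
```

In practice, the model weights from the best epoch (not the last one) would be kept, which is also where fine-tuning would resume.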

 

(2) LeNet-5 features:

   ① Each convolutional layer consists of three parts: convolution, pooling, and a nonlinear activation function.

   ② Convolution is used to extract spatial features.

   ③ Downsampling (subsampling) uses average pooling layers.

   ④ The hyperbolic tangent (Tanh) or sigmoid is used as the activation function, with Softmax as the final classifier.

   ⑤ Sparse connections between layers reduce computational complexity.
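Features ③ and ④ can be made concrete with a tiny sketch (a toy illustration of my own, not LeNet-5's actual implementation): 2×2 average pooling halves each spatial dimension, tanh squashes activations into (-1, 1), and softmax turns the final scores into class probabilities.

```python
import math

def avg_pool_2x2(x):
    """2x2 average pooling with stride 2 on a 2D list of floats."""
    h, w = len(x), len(x[0])
    return [[(x[i][j] + x[i][j + 1] + x[i + 1][j] + x[i + 1][j + 1]) / 4.0
             for j in range(0, w, 2)]
            for i in range(0, h, 2)]

def softmax(z):
    """Numerically stable softmax, as used by the final classifier."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    total = sum(e)
    return [v / total for v in e]

fmap = [[1.0, 3.0, 0.0, 2.0],
        [5.0, 7.0, 4.0, 6.0],
        [1.0, 1.0, 2.0, 2.0],
        [1.0, 1.0, 2.0, 2.0]]
pooled = avg_pool_2x2(fmap)
print(pooled)  # [[4.0, 3.0], [1.0, 2.0]] -- a 4x4 map shrunk to 2x2
activated = [[math.tanh(v) for v in row] for row in pooled]
print(softmax([2.0, 1.0, 0.0]))  # probabilities summing to 1
```

Average pooling keeps the mean response of each 2×2 window, in contrast to the max pooling favored by later networks such as AlexNet.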

 

 


 



Origin blog.csdn.net/qq_38299170/article/details/104241446