In-Depth Series: Classic Deep Learning Network Models (7) Inception-ResNet: Structure and Features in Detail

 


Return to the catalog of classic deep learning network models

Previous: In-Depth Series: Classic Deep Learning Network Models (6) ResNet: Structure and Features in Detail

Next: In-Depth Series: Classic Deep Learning Network Models (8) DenseNet: Structure and Features in Detail

 

This section details the structure and characteristics of Inception-ResNet; the next section covers DenseNet.

 

Paper: Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning

 

II. Classic Networks

7. Inception-ResNet

(1) Inception-ResNet network architecture diagrams


(2) The Inception-v4, Inception-ResNet-v1, and Inception-ResNet-v2 blocks, and their network structures

Inception-v4, Inception-ResNet-v1, and Inception-ResNet-v2 have similar overall network structures, but the blocks that make up each network differ. Keeping this in mind makes the network structures easier to understand.
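The common pattern of an Inception-ResNet block can be sketched as follows. This is a minimal, hypothetical illustration in NumPy, not the paper's exact layer configuration: several parallel branches process the input, their outputs are concatenated along the channel axis, a 1x1 convolution projects the result back to the input depth, and the projection is added to the shortcut.

```python
import numpy as np

def conv1x1(x, w):
    # A 1x1 convolution is a per-pixel linear map across channels.
    # x: (H, W, C_in), w: (C_in, C_out) -> (H, W, C_out)
    return x @ w

def inception_resnet_block(x, branches, w_proj, scale=0.2):
    """Simplified Inception-ResNet block (illustrative sketch).

    branches: list of functions, each mapping x to a (H, W, C_b) tensor.
    Their outputs are concatenated along the channel axis, projected
    back to the input depth with a 1x1 conv, scaled, and added to the
    shortcut (the unmodified input).
    """
    mixed = np.concatenate([b(x) for b in branches], axis=-1)
    residual = conv1x1(mixed, w_proj)   # match the input channel count
    return x + scale * residual         # scaled residual addition
```

With zero projection weights the block reduces to the identity, which shows why residual blocks are easy to optimize at initialization.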

 

(3) Characteristics of Inception-ResNet:

   ① Inception-ResNet mainly comes in two versions: Inception-ResNet-v1 and Inception-ResNet-v2. Inception-ResNet-v1 is the ResNet-style transformation of Inception-v3, and Inception-ResNet-v2 is the ResNet-style transformation of Inception-v4.

   ② The authors found that when the number of filters exceeds 1000, the residual network becomes unstable and the network "dies" early in training: after tens of thousands of iterations, the parameters of the last few layers before average pooling all become zero. The remedy is either to lower the learning rate or to add extra batch normalization to these layers (and if BN alone does not help, to use the scaling described next). The authors also found that scaling down the residual branch before adding it to the shortcut makes training more stable. They therefore chose a scaling factor between 0.1 and 0.3, scaled the residual output by that factor, and then performed the addition. The scaling factor is not strictly necessary; it does not hurt the network, and it helps training proceed more stably.
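The scaled residual addition described above can be sketched in a few lines. This is a minimal illustration; the scale value 0.2 is just an example from the 0.1 to 0.3 range suggested in the paper.

```python
import numpy as np

def scaled_residual_add(shortcut, residual, scale=0.2):
    """Add a down-scaled residual branch to the shortcut.

    Scaling the residual by a factor in [0.1, 0.3] before the addition
    was found to stabilize training when the filter count is very large
    (above roughly 1000 filters).
    """
    return shortcut + scale * residual

# With scale=0.1 the residual contributes only 10% of its magnitude,
# so activations grow far more slowly across stacked blocks.
x = np.full(4, 1.0)
r = np.full(4, 1.0)
out = scaled_residual_add(x, r, scale=0.1)  # each element becomes 1.1
```

Because scaling is applied only at the merge point, it changes the magnitude of the update each block contributes without altering the block's internal layers.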

   ③ Comparing the networks

       The data in the paper show that Inception-v4 and Inception-ResNet-v2 differ little from each other, but both are considerably better than Inception-v3 and Inception-ResNet-v1.

      a. Comparing Inception-ResNet-v1 with Inception-v3: Inception-ResNet-v1 trains faster, but its final result is slightly worse than Inception-v3's.

      b. Comparing Inception-ResNet-v2 with Inception-v4: Inception-ResNet-v2 trains faster, has fewer layers, and achieves slightly better results than Inception-v4. So the overall winner is Inception-ResNet-v2.




Origin: blog.csdn.net/qq_38299170/article/details/104241843