Deep Learning Series - History of Classic Deep Learning Network Models (c): The Structure and Features of ZF-Net


Return to the catalog: History of Classic Deep Learning Network Models

Previous: Deep Learning Series - History of Classic Deep Learning Network Models (b): The Structure and Features of AlexNet

Next: Deep Learning Series - History of Classic Deep Learning Network Models (d): The Structure and Features of VGG16

 

In this section, we elaborate on the structure and features of ZF-Net; the next section covers the structure and features of VGG16.

 

Paper: Visualizing and Understanding Convolutional Networks

 

II. Classic Networks

3. ZF-Net

ZF-Net was the winner of the 2013 ImageNet classification task. Its network structure is not a major change from AlexNet; it mainly adjusts the hyperparameters, replacing the large convolution kernels with slightly smaller ones, yet its performance improves considerably over AlexNet.

(1) Network description: input (-1, 224, 224, 3)

 

   Network architecture:
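The spatial sizes of ZF-Net's feature maps can be traced layer by layer with the standard convolution output formula. The sketch below is illustrative: the padding values are assumptions chosen so that the computed sizes reproduce the ones reported in the paper's architecture figure (110 → 55 → 26 → 13 → 6); they are not stated explicitly in this article.

```python
# Trace ZF-Net feature-map sizes using the convolution output formula:
#   out = (in + 2*pad - kernel) // stride + 1
# Padding values are assumptions chosen to match the paper's figure.

def out_size(n, kernel, stride, pad=0):
    return (n + 2 * pad - kernel) // stride + 1

layers = [
    ("conv1 7x7/2", 7, 2, 1),
    ("pool1 3x3/2", 3, 2, 1),
    ("conv2 5x5/2", 5, 2, 0),
    ("pool2 3x3/2", 3, 2, 1),
    ("conv3 3x3/1", 3, 1, 1),
    ("conv4 3x3/1", 3, 1, 1),
    ("conv5 3x3/1", 3, 1, 1),
    ("pool3 3x3/2", 3, 2, 0),
]

n = 224  # input is 224 x 224 x 3
for name, k, s, p in layers:
    n = out_size(n, k, s, p)
    print(f"{name}: {n} x {n}")
# the final 6 x 6 x 256 map is flattened into the fully connected layers
```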

   Deconvolution and unpooling structure:

      ZF-Net uses deconvolution and unpooling to map the extracted features back to the input space and view them as images. These visualizations are then used as a reference for tuning the hyperparameters.
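A rough sketch of the unpooling idea: max pooling records the position of each maximum (the "switches"), and unpooling places each pooled value back at its recorded position, filling the rest with zeros. The function names below are illustrative, not from the paper.

```python
import numpy as np

def max_pool_with_indices(x, size=2):
    """2-D max pooling that also records each argmax position (the "switches")."""
    h, w = x.shape
    out = np.zeros((h // size, w // size))
    idx = {}
    for i in range(0, h, size):
        for j in range(0, w, size):
            window = x[i:i + size, j:j + size]
            r, c = np.unravel_index(np.argmax(window), window.shape)
            out[i // size, j // size] = window[r, c]
            idx[(i // size, j // size)] = (i + r, j + c)
    return out, idx

def max_unpool(pooled, idx, shape):
    """Approximate inverse of pooling: restore pooled values at their
    recorded positions, zeros everywhere else."""
    x = np.zeros(shape)
    for (pi, pj), (r, c) in idx.items():
        x[r, c] = pooled[pi, pj]
    return x

x = np.array([[1., 2, 0, 1],
              [3, 4, 1, 0],
              [0, 1, 5, 6],
              [1, 0, 7, 8]])
pooled, idx = max_pool_with_indices(x)
restored = max_unpool(pooled, idx, x.shape)
```

The restored map keeps the strong activations at their original locations, which is what makes the visualized feature maps spatially interpretable.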

 

(2) ZF-Net features:

   ①. ZF-Net is a fine-tuned version of AlexNet:

      a. AlexNet splits its computation into a sparse connection structure across two GPUs, while ZF-Net uses a dense connection structure on a single GPU.

      b. Changes to AlexNet's first layer: the filter size changes from 11 × 11 to 7 × 7, and the stride changes from 4 to 2.
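The effect of this first-layer change can be checked with the convolution output formula. A minimal sketch, assuming a 224 × 224 input and no padding: the smaller kernel with the smaller stride produces a much larger, denser feature map.

```python
# Output size of a convolution with no padding:
#   out = (in - kernel) // stride + 1
def feature_map_size(n, kernel, stride):
    return (n - kernel) // stride + 1

alexnet_first = feature_map_size(224, 11, 4)  # 11x11 kernel, stride 4
zfnet_first = feature_map_size(224, 7, 2)     # 7x7 kernel, stride 2
print(alexnet_first, zfnet_first)  # ZF-Net's first map is roughly 2x larger per side
```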

   ②. Compared with AlexNet, ZF-Net uses smaller convolution kernels and smaller strides in the front layers, which retains more features.

   ③. Deconvolution is used to visualize the feature maps. The feature maps show that the learned features form a hierarchical architecture: the front layers learn physical features such as contours, edges, colors, and textures, while the later layers learn abstract, class-dependent features. The higher-level features give better classification performance, while the lower-level features give better localization (bounding-box) performance.

   ④. By occluding parts of the input image, one can find the regions that are critical for the predicted category. The experiments also show that as depth increases, the network learns more discriminative features.

   ⑤. During training, the parameters of the lower layers converge quickly, while the higher layers need a longer training time to converge.
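The occlusion experiment in ④ can be sketched as follows: slide a gray patch over the image, re-score each occluded version, and the positions where the score drops the most mark the regions critical to the class. The scoring function below is a toy stand-in for a real classifier, used only to show the mechanics.

```python
import numpy as np

def occlusion_map(image, score_fn, patch=4, stride=4):
    """Slide a gray square over the image; the score at each position
    shows how much the classifier depends on that region."""
    h, w = image.shape
    heat = []
    for i in range(0, h - patch + 1, stride):
        row = []
        for j in range(0, w - patch + 1, stride):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = image.mean()  # gray patch
            row.append(score_fn(occluded))
        heat.append(row)
    return np.array(heat)

# Toy example: the "classifier" only looks at the top-left corner,
# so occluding that corner lowers the score the most.
image = np.zeros((16, 16))
image[:4, :4] = 1.0
score = lambda img: img[:4, :4].sum()
heat = occlusion_map(image, score, patch=4, stride=4)
```

The position with the lowest score in `heat` is the region the (toy) classifier depends on, mirroring how the paper localizes class-critical image parts.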

 

 

 


 



Origin blog.csdn.net/qq_38299170/article/details/104241797