Translation invariance vs. translation variance: a clearer explanation of the R-FCN paper


We believe the design above is unreasonable because of a conflict between the translation invariance needed for image classification and the translation variance needed for object detection. On one hand, image classification favors translation invariance: shifting an object within an image should not change its predicted class. A deep, fully convolutional network structure preserves this invariance as far as possible, as demonstrated by the strong performance of such networks on ImageNet classification.
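The invariance claim can be seen in a toy sketch (my own illustration, not code from the paper): a convolution's feature map shifts along with the object, and global average pooling then discards the position entirely, so the pooled "score" is the same wherever the object sits.

```python
import numpy as np

def conv2d_valid(img, kernel):
    """Naive 2-D valid cross-correlation (no padding, stride 1)."""
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
blob = rng.random((5, 5))      # the "object"
kernel = rng.random((3, 3))

# The same object pasted at two different positions in an empty image.
img_a = np.zeros((20, 20)); img_a[2:7, 2:7] = blob
img_b = np.zeros((20, 20)); img_b[10:15, 12:17] = blob

# The conv responses shift with the object, but global average pooling
# throws the position away: both images get the same pooled score.
gap_a = conv2d_valid(img_a, kernel).mean()
gap_b = conv2d_valid(img_b, kernel).mean()
print(np.isclose(gap_a, gap_b))
```

This is exactly what a classifier wants and exactly what a localizer does not.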
 
On the other hand, the localization part of the detection task needs representations that are, to some degree, translation-variant. For example, translating an object inside a candidate box should produce a meaningful response describing how well the object and the candidate box overlap. We hypothesize that in image classification networks, the deeper the convolutional layer, the less sensitive it is to translation. To resolve this dilemma (translation invariance for classification vs. translation variance for detection), the detection algorithm in the ResNet paper inserts the RoI pooling layer between convolutional layers. This region-specific operation breaks translation invariance: when evaluated over different regions, the post-RoI convolutional layers are no longer translation-invariant. However, this design sacrifices training and testing efficiency, because it introduces a large number of region-wise layers.
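R-FCN's answer is to keep everything fully convolutional and push the translation variance into position-sensitive score maps. A minimal single-class sketch of position-sensitive RoI pooling (simplified from the paper's description; shapes and names are my own):

```python
import numpy as np

def ps_roi_pool(score_maps, roi, k=3):
    """score_maps: (k*k, H, W), one map per bin of the k x k grid.
    roi: (x0, y0, x1, y1) in feature-map coordinates.
    Returns the k x k grid of pooled scores; its mean is the RoI score."""
    x0, y0, x1, y1 = roi
    bin_h, bin_w = (y1 - y0) / k, (x1 - x0) / k
    out = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            ys, ye = int(np.floor(y0 + i * bin_h)), int(np.ceil(y0 + (i + 1) * bin_h))
            xs, xe = int(np.floor(x0 + j * bin_w)), int(np.ceil(x0 + (j + 1) * bin_w))
            # The crucial wiring: bin (i, j) reads ONLY its dedicated map
            # i*k + j, so the response depends on WHERE the RoI sits over
            # each map -- this is what restores translation variance.
            out[i, j] = score_maps[i * k + j, ys:ye, xs:xe].mean()
    return out

# Sanity check: make map m constant with value m; each bin then recovers
# its own map index, confirming the bin-to-map wiring.
maps = np.stack([np.full((9, 9), float(m)) for m in range(9)])
print(ps_roi_pool(maps, (0, 0, 9, 9)))
```

Because this pooling has no learned parameters, all convolutions stay shared across RoIs.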
 
 

This is because many RoIs overlap heavily, so the region-wise layers repeat the same computation on the overlapping regions for every RoI.
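A toy count makes the duplication concrete (the RoI coordinates and map size are my own illustrative numbers, not measurements from the paper): per-RoI layers touch every cell once per RoI, while a shared fully convolutional network touches each cell once.

```python
import numpy as np

# Hypothetical, heavily overlapping RoIs (x0, y0, x1, y1) on a 20x20
# feature map -- the typical situation for proposals around one object.
rois = [(2, 2, 12, 12), (3, 3, 13, 13), (4, 2, 14, 12)]

counts = np.zeros((20, 20), dtype=int)
for x0, y0, x1, y1 in rois:
    counts[y0:y1, x0:x1] += 1      # cells processed by this RoI's own layers

regionwise = int(counts.sum())     # work done by per-RoI (region-wise) layers
shared = int((counts > 0).sum())   # work if features are computed once and shared
print(regionwise, shared)          # region-wise work is over 2x the shared work here
```

With hundreds of proposals per image, the gap grows accordingly, which is why R-FCN moves essentially all computation in front of RoI pooling.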


 
 
Reference: https://blog.csdn.net/qq_30622831/article/details/81455550


Origin www.cnblogs.com/lishikai/p/12424132.html