Convolutional Neural Network - Pooling Summary

The concept of pooling was popularized by the AlexNet network; before that it was usually called downsampling (subsampling).

What pooling does mechanically needs little explanation; the focus here is on its role, its variants, and its backpropagation.

 

The role of pooling

First, be clear about where pooling occurs: the pooling layer follows the feature map produced by applying the activation function to the convolution output, i.e. ReLU(wx + b).

1. Pooling can be understood as a form of dimensionality reduction.

2. Pooling mitigates the effect of local displacements or slight misalignments in the input, improving the robustness of the model.

3. Pooling shrinks the feature maps, reducing the parameters and computation of subsequent layers and improving training speed.

4. Pooling acts as a kind of attention to typical features (e.g. taking the max), which improves model accuracy and, to a certain extent, avoids overfitting.
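As a minimal sketch of point 1 (dimensionality reduction), the naive loop below applies 2x2 max pooling with stride 2 to a small feature map; the helper name `max_pool2d` and the example values are illustrative, not from the original post:

```python
import numpy as np

def max_pool2d(x, size=2, stride=2):
    """Naive max pooling over a 2-D feature map (no padding)."""
    h, w = x.shape
    out_h = (h - size) // stride + 1
    out_w = (w - size) // stride + 1
    out = np.empty((out_h, out_w), dtype=x.dtype)
    for i in range(out_h):
        for j in range(out_w):
            # Take the max over each size x size window.
            window = x[i * stride:i * stride + size, j * stride:j * stride + size]
            out[i, j] = window.max()
    return out

# A 4x4 feature map (e.g. the output of ReLU(wx + b)) shrinks to 2x2:
fmap = np.array([[1, 3, 2, 0],
                 [4, 6, 1, 2],
                 [0, 1, 5, 7],
                 [2, 2, 8, 3]])
print(max_pool2d(fmap))  # [[6 2] [2 8]]
```

Each output cell summarizes a 2x2 region, so the spatial resolution is halved while the strongest responses survive.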

 

Pooling methods

There are two methods, max_pooling and mean_pooling; the mechanics need little explanation.
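For concreteness, the two methods differ only in how a window is reduced to one value; the window values below are made up for illustration:

```python
import numpy as np

# One 2x2 pooling window from a feature map.
window = np.array([[1.0, 3.0],
                   [4.0, 6.0]])

print(window.max())   # max_pooling output: 6.0
print(window.mean())  # mean_pooling output: 3.5
```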

 

Padding

Padding in pooling differs from padding in convolution; the two modes are described separately below.

When padding is SAME, the border may be padded, but padding is not guaranteed: when the pooling window scans to the border and the remaining cells are fewer than the window size, the border is padded so that the remaining cells match the window size; if they already match, no padding is needed.

When padding is VALID, no padding is applied: when the window scans to the border and the remaining cells are fewer than the window size, those leftover cells are discarded.
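The two modes can be summarized by their output-size formulas. The sketch below follows TensorFlow's convention for SAME and VALID pooling (an assumption; other frameworks may differ), and `pool_output_size` is a hypothetical helper name:

```python
import math

def pool_output_size(n, size, stride, padding):
    """Output length of pooling along one dimension of length n.

    'same'  pads the border when the last window would run short;
    'valid' discards the leftover cells instead.
    """
    if padding == "same":
        return math.ceil(n / stride)
    if padding == "valid":
        return (n - size) // stride + 1
    raise ValueError(padding)

# A width-5 input with a 2x2 window, stride 2:
print(pool_output_size(5, 2, 2, "same"))   # 3 -> border padded to fit a last window
print(pool_output_size(5, 2, 2, "valid"))  # 2 -> leftover column discarded
```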

 

Why max_pooling is better than mean_pooling

max_pooling was also proposed in AlexNet; it performed significantly better than mean_pooling and has since become the preferred pooling method.

Advantages of max_pooling:

1. mean_pooling is a linear transformation (taking the mean is linear), while max_pooling is a non-linear transformation (taking the max is non-linear), giving the model stronger expressive power.

2. max_pooling is equivalent to activating fewer neurons, similar to dropout: it reduces the values passed downstream and also helps prevent overfitting.

3. Intuitively, max_pooling keeps only the typical features and discards the ordinary ones, which helps improve model accuracy.
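Point 1 above can be checked numerically: the mean commutes with addition, while the max does not. The vectors below are made up for illustration:

```python
import numpy as np

a = np.array([1.0, 5.0])
b = np.array([5.0, 1.0])

# mean is linear: mean(a + b) == mean(a) + mean(b) holds for any a, b.
assert np.mean(a + b) == np.mean(a) + np.mean(b)

# max is non-linear: here max(a + b) = 6 but max(a) + max(b) = 10.
print(np.max(a + b), np.max(a) + np.max(b))
```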

 

Overlapping pooling

Overlapping pooling was also proposed in AlexNet, and the idea is very simple:

In non-overlapping pooling, one pooling window has no intersection with the next, i.e. the window size equals the stride; in overlapping pooling, the window size is greater than the stride.

Practice shows that overlapping pooling combined with max_pooling prevents overfitting well and improves model performance.
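The size/stride relation above can be sketched with a one-dimensional output-length formula (the helper name and the 3/2 example, which matches AlexNet's reported pooling configuration, are illustrative):

```python
def pooled_length(n, size, stride):
    """Output length of valid pooling along a dimension of length n."""
    return (n - size) // stride + 1

# Non-overlapping: size == stride, adjacent windows share nothing.
print(pooled_length(6, size=2, stride=2))  # 3

# Overlapping (AlexNet): size 3 > stride 2, adjacent windows share
# size - stride = 1 cell per step.
print(pooled_length(6, size=3, stride=2))  # 2
```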

 

Backpropagation through pooling

In max_pooling, the backpropagated gradient factor is 1, routed entirely to the position that held the max; in mean_pooling, the gradient factor is 1/size, spread evenly over the window.
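As a minimal sketch of these two rules for a single 2x2 window (example values are made up; `upstream` is the gradient flowing into this pooled output):

```python
import numpy as np

window = np.array([[1.0, 3.0],
                   [4.0, 6.0]])
upstream = 1.0  # gradient arriving at this window's pooled output

# max_pooling: the full gradient (factor 1) goes only to the max position.
grad_max = np.zeros_like(window)
grad_max[np.unravel_index(window.argmax(), window.shape)] = upstream
print(grad_max)   # [[0. 0.] [0. 1.]]

# mean_pooling: the gradient is split evenly, 1/size to each element.
grad_mean = np.full_like(window, upstream / window.size)
print(grad_mean)  # [[0.25 0.25] [0.25 0.25]]
```

In both cases the total gradient over the window equals the upstream gradient; only the routing differs.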

 

 

 


Origin www.cnblogs.com/yanshw/p/12208549.html