Deep learning data preprocessing and standardization (Batch Normalization)

The most common data preprocessing methods are centering and standardization. Centering shifts the data so that it sits around the origin: simply subtract the mean along each feature dimension, so that every feature ends up with mean 0. Standardization goes one step further: after the data has zero mean, divide each feature by its standard deviation so that all features share the same scale, approximating a standard normal distribution. Alternatively, the data can be rescaled to the range -1 to 1 using the minimum and maximum values.
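The centering, standardization, and min-max rescaling described above can be sketched with NumPy on a small hypothetical feature matrix (the array values here are made up for illustration):

```python
import numpy as np

# Hypothetical data: 4 samples, 2 features on very different scales
x = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0],
              [4.0, 40.0]])

# Centering: subtract the per-feature mean, so each feature has mean 0
x_centered = x - x.mean(axis=0)

# Standardization: also divide by the per-feature standard deviation,
# so each feature is approximately standard normal (mean 0, std 1)
x_std = x_centered / x.std(axis=0)

# Alternative: min-max rescaling of each feature to the range [-1, 1]
x_minmax = 2 * (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0)) - 1
```

After this, both features contribute on a comparable scale regardless of their original units.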

Batch normalization was proposed in a 2015 paper (Ioffe and Szegedy). In a nutshell, it normalizes the output of each layer of the network so that it follows an approximately standard normal distribution. The input to the next layer is then also well-behaved, which makes the network easier to train and speeds up convergence.
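A minimal NumPy sketch of what a 2D batch normalization layer computes: per-channel mean and variance are taken over the batch and spatial dimensions, the activations are normalized, then a learnable scale (gamma) and shift (beta) are applied. The function name and shapes here are illustrative, not PyTorch's internal implementation:

```python
import numpy as np

def batch_norm_2d(x, gamma, beta, eps=1e-5):
    """Sketch of batch norm for x of shape (N, C, H, W), training mode."""
    # Statistics are computed per channel, over batch and spatial dims
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    # Normalize to roughly zero mean, unit variance per channel
    x_hat = (x - mean) / np.sqrt(var + eps)
    # gamma/beta are learnable scale and shift, one per channel
    return gamma.reshape(1, -1, 1, 1) * x_hat + beta.reshape(1, -1, 1, 1)

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=5.0, size=(8, 3, 4, 4))  # badly scaled input
y = batch_norm_2d(x, gamma=np.ones(3), beta=np.zeros(3))
```

With gamma = 1 and beta = 0, each output channel has mean about 0 and standard deviation about 1, no matter how the input was scaled.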

In PyTorch there is no need to implement batch normalization yourself; built-in layers are provided, such as nn.BatchNorm2d() for the two-dimensional (convolutional) case.

# Use batch normalization
import torch.nn as nn


class conv_bn_net(nn.Module):
    def __init__(self):
        super(conv_bn_net, self).__init__()
        self.stage1 = nn.Sequential(
            nn.Conv2d(1, 6, 3, padding=1),   # 1x28x28 -> 6x28x28 (assuming MNIST-style input)
            nn.BatchNorm2d(6),               # normalize over the 6 channels
            nn.ReLU(True),
            nn.MaxPool2d(2, 2),              # -> 6x14x14
            nn.Conv2d(6, 16, 5),             # -> 16x10x10
            nn.BatchNorm2d(16),
            nn.ReLU(True),
            nn.MaxPool2d(2, 2)               # -> 16x5x5, i.e. 400 features
        )

        self.classfy = nn.Linear(400, 10)

    def forward(self, x):
        x = self.stage1(x)
        x = x.view(x.shape[0], -1)  # flatten to (batch, 400)
        x = self.classfy(x)
        return x
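The same layer stack can be checked quickly with a dummy batch; this sketch assumes MNIST-style 1x28x28 input, which is what makes the flattened size come out to 400:

```python
import torch
import torch.nn as nn

# Same layers as conv_bn_net above, assuming 1x28x28 input (e.g. MNIST)
stage1 = nn.Sequential(
    nn.Conv2d(1, 6, 3, padding=1),   # 28x28 -> 28x28
    nn.BatchNorm2d(6),
    nn.ReLU(True),
    nn.MaxPool2d(2, 2),              # -> 14x14
    nn.Conv2d(6, 16, 5),             # -> 10x10
    nn.BatchNorm2d(16),
    nn.ReLU(True),
    nn.MaxPool2d(2, 2),              # -> 5x5, so 16 * 5 * 5 = 400
)
classify = nn.Linear(400, 10)

x = torch.randn(32, 1, 28, 28)       # dummy batch of 32 images
out = classify(stage1(x).view(x.shape[0], -1))
print(out.shape)                     # (32, 10): one score per class
```

Note that BatchNorm behaves differently at inference time: call model.eval() so it uses its accumulated running statistics instead of per-batch statistics.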



Origin blog.csdn.net/qq_32146369/article/details/104088897