Usage of BatchNorm2d in PyTorch

CLASS torch.nn.BatchNorm2d(num_features, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)

The basic principle: batch normalization keeps the data passed between layers in a reasonable (roughly zero-mean, unit-variance) distribution per channel, which accelerates training. For each channel it computes

    y = (x - E[x]) / sqrt(Var[x] + eps) * γ + β
The input is a four-dimensional tensor (N, C, H, W), where N is the batch size, C is the number of channels of the input image, and (H, W) is the size of the input image.
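
A minimal usage sketch (the input shape (16, 3, 32, 32) is chosen only for illustration): num_features must equal the channel dimension C, and the output has the same shape as the input.

```python
# A minimal sketch: num_features must equal C, the channel dimension of the input.
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(num_features=3)       # for inputs with C = 3 channels
x = torch.randn(16, 3, 32, 32)            # (N, C, H, W)
y = bn(x)                                 # output shape is unchanged: (16, 3, 32, 32)

# In training mode each channel is normalized with the batch statistics,
# so the per-channel mean is ~0 and the per-channel std is ~1 (gamma=1, beta=0 at init).
print(y.shape)
print(y.mean(dim=(0, 2, 3)))
print(y.std(dim=(0, 2, 3)))
```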

  • num_features - C, the number of channels of the input (e.g. 100 for an input of shape (N, 100, H, W))

  • eps - default 1e-5, a value added to the denominator for numerical stability (prevents division by zero)

  • momentum - default 0.1, used for the running estimates of the mean and variance, which are updated in training as
    running_stat = (1 - momentum) * running_stat + momentum * batch_stat
    (a short sketch demonstrating this update follows the list)

  • affine - default True; when True the layer has learnable affine parameters (i.e. γ and β)

  • track_running_stats - default True, indicates whether running estimates of the mean and variance are kept.
    track_running_stats = True means the layer tracks the statistics of all batches over the whole training process to obtain the running mean and variance, rather than relying only on the statistics of the current input batch. Conversely, with track_running_stats = False only the mean and variance of the current input batch are computed.



Origin blog.csdn.net/qq_45171138/article/details/104521379