Expected more than 1 value per channel when training, got input size torch.Size([1, 9, 1, 1])

ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 9, 1, 1])
Original code:

import torch
import torch.nn as nn
model = nn.Sequential(
            nn.Conv2d(3, 9, 1, 1, 0, bias=False),   # layer 0
            nn.BatchNorm2d(9),                       # layer 1
            nn.ReLU(inplace=True),                   # layer 2
            nn.AdaptiveAvgPool2d((1, 1)),            # layer 3: spatial size becomes 1*1
            nn.BatchNorm2d(9)                        # layer 4: fails here when batch size is 1
        )
        
x = torch.rand([1, 3, 224, 224])
for i in range(len(model)):
    x = model[i](x)
    print(i,x.shape)

Error message:

0 torch.Size([1, 9, 224, 224])
1 torch.Size([1, 9, 224, 224])
2 torch.Size([1, 9, 224, 224])
3 torch.Size([1, 9, 1, 1])
Traceback (most recent call last):
  File "/home/jim/.local/lib/python3.5/site-packages/torch/nn/functional.py", line 1693, in batch_norm
    raise ValueError('Expected more than 1 value per channel when training, got input size {}'.format(size))
ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 9, 1, 1])

The problem is the final nn.BatchNorm2d(9) layer: in training mode, BatchNorm needs more than one value per channel to compute batch statistics. After the adaptive average pooling the feature map is 1*1, so with a batch size of 1 there is exactly one value per channel and the layer raises the error; the number of values per channel (batch * height * width) must be greater than 1.
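
The failure can be reproduced in isolation, without the rest of the network. A minimal sketch (assuming only torch is installed): a BatchNorm2d layer in its default training mode rejects a [1, 9, 1, 1] input, but accepts the same input once the batch size is 2.

import torch
import torch.nn as nn

bn = nn.BatchNorm2d(9)                    # training mode by default
try:
    bn(torch.rand(1, 9, 1, 1))            # only 1 value per channel (N*H*W = 1)
except ValueError as e:
    print(e)                              # Expected more than 1 value per channel ...
print(bn(torch.rand(2, 9, 1, 1)).shape)   # N*H*W = 2 -> torch.Size([2, 9, 1, 1])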

Solution when testing the model
When running the model on a single image, call model.eval() first. model.eval() switches BatchNorm and Dropout layers to evaluation mode: BatchNorm then uses its stored running statistics instead of computing batch statistics, and Dropout is disabled. Without it, a batch size of 1 triggers the "more than 1 value per channel" error shown above.
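
For example, a minimal sketch that reuses the model defined above (the one with the adaptive pooling): after model.eval(), the single-image input passes through the final BatchNorm2d without error.

model.eval()                          # BatchNorm now uses running statistics
with torch.no_grad():
    x = torch.rand([1, 3, 224, 224])
    for i in range(len(model)):
        x = model[i](x)
        print(i, x.shape)             # layer 4 now outputs torch.Size([1, 9, 1, 1])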

Solution when training the model
Change x = torch.rand([1, 3, 224, 224]) to x = torch.rand([2, 3, 224, 224]).

This gives the normal output:

0 torch.Size([2, 9, 224, 224])
1 torch.Size([2, 9, 224, 224])
2 torch.Size([2, 9, 224, 224])
3 torch.Size([2, 9, 1, 1])
4 torch.Size([2, 9, 1, 1])

In addition, if the adaptive pooling is removed and the batch size is still 1:

import torch
import torch.nn as nn
model = nn.Sequential(
            nn.Conv2d(3, 9, 1, 1, 0, bias=False),
            nn.BatchNorm2d(9),
            nn.ReLU(inplace=True),
            nn.BatchNorm2d(9)
        )

x = torch.rand([1, 3, 224, 224])
for i in range(len(model)):
    x = model[i](x)
    print(i,x.shape)

no error is raised either:

0 torch.Size([1, 9, 224, 224])
1 torch.Size([1, 9, 224, 224])
2 torch.Size([1, 9, 224, 224])
3 torch.Size([1, 9, 224, 224])

The adaptive average pooling reduces the 224*224 feature map to 1*1. So when the batch size is 1, a 1*1 feature map cannot pass through nn.BatchNorm2d in training mode: each channel contributes only a single value (N*H*W = 1), which is not enough to compute batch statistics.
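
As a quick check of this rule, the sketch below (shapes taken from the examples in this post) passes each shape through a fresh BatchNorm2d in training mode and reports the number of values per channel, N*H*W, which must be greater than 1:

import torch
import torch.nn as nn

for shape in [(1, 9, 1, 1), (2, 9, 1, 1), (1, 9, 224, 224)]:
    n, c, h, w = shape
    bn = nn.BatchNorm2d(c)            # training mode by default
    try:
        bn(torch.rand(*shape))
        status = "OK"
    except ValueError:
        status = "ValueError"
    print(shape, "values per channel =", n * h * w, "->", status)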

Reposted from blog.csdn.net/weixin_41735859/article/details/105867491