torchsummary needs to know whether the model has been moved to CUDA: its device parameter must match the device the model actually lives on.

Reference: https://www.csdn.net/tags/OtDaMg0sMjg2OTEtYmxvZwO0O0OO0O0O.html

torchsummary.summary(model, input_size, batch_size=-1, device="cuda")
model: a PyTorch model; must inherit from nn.Module
input_size: the model's input size, in the shape (C, H, W)
batch_size: defaults to -1; the batch size shown in each layer's output shape
device: "cuda" or "cpu"
Note that device defaults to "cuda". If the model is on the CPU, you must change it; a mismatch produces the error below:
# RuntimeError: Input type (torch.cuda.FloatTensor) and weight type (torch.FloatTensor) should be the same
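The matching rule is simple enough to state as a one-liner. A minimal sketch (pure Python; `pick_device` is a hypothetical helper for illustration, not part of torchsummary):

```python
def pick_device(model_on_cuda: bool) -> str:
    """Return the device string to pass to torchsummary.summary().

    The rule: the device argument must name the device the model's
    weights actually live on.
    """
    return "cuda" if model_on_cuda else "cpu"


# model = Unet().cuda()  -> summary(..., device=pick_device(True))   # "cuda"
# model = Unet()         -> summary(..., device=pick_device(False))  # "cpu"
print(pick_device(False))  # cpu
```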

1. Neither the model nor the device argument uses cuda

from torchsummary import summary
from Unet import Unet

model = Unet()
summary(model=model, input_size=(3, 256, 256), batch_size=2, device="cpu")


# Result
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1           [2, 8, 256, 256]             224
              ReLU-2           [2, 8, 256, 256]               0
            Conv2d-3           [2, 8, 256, 256]             584
              ReLU-4           [2, 8, 256, 256]               0
         MaxPool2d-5           [2, 8, 128, 128]               0
            Conv2d-6          [2, 16, 128, 128]           1,168
              ReLU-7          [2, 16, 128, 128]               0
            Conv2d-8          [2, 16, 128, 128]           2,320
              ReLU-9          [2, 16, 128, 128]               0
        MaxPool2d-10            [2, 16, 64, 64]               0
           Conv2d-11            [2, 32, 64, 64]           4,640
             ReLU-12            [2, 32, 64, 64]               0
           Conv2d-13            [2, 32, 64, 64]           9,248
             ReLU-14            [2, 32, 64, 64]               0
        MaxPool2d-15            [2, 32, 32, 32]               0
           Conv2d-16            [2, 64, 32, 32]          18,496
             ReLU-17            [2, 64, 32, 32]               0
           Conv2d-18            [2, 64, 32, 32]          36,928
             ReLU-19            [2, 64, 32, 32]               0
        MaxPool2d-20            [2, 64, 16, 16]               0
           Conv2d-21           [2, 128, 16, 16]          73,856
             ReLU-22           [2, 128, 16, 16]               0
           Conv2d-23           [2, 128, 16, 16]         147,584
             ReLU-24           [2, 128, 16, 16]               0
        MaxPool2d-25             [2, 128, 8, 8]               0
           Conv2d-26             [2, 256, 8, 8]         295,168
             ReLU-27             [2, 256, 8, 8]               0
           Conv2d-28             [2, 256, 8, 8]         590,080
             ReLU-29             [2, 256, 8, 8]               0
        MaxPool2d-30             [2, 256, 4, 4]               0
           Conv2d-31             [2, 512, 4, 4]       1,180,160
             ReLU-32             [2, 512, 4, 4]               0
           Conv2d-33             [2, 512, 4, 4]       2,359,808
             ReLU-34             [2, 512, 4, 4]               0
        MaxPool2d-35             [2, 512, 2, 2]               0
           Conv2d-36            [2, 1024, 2, 2]       4,719,616
             ReLU-37            [2, 1024, 2, 2]               0
           Conv2d-38            [2, 1024, 2, 2]       9,438,208
             ReLU-39            [2, 1024, 2, 2]               0
  ConvTranspose2d-40             [2, 512, 4, 4]       8,388,608
             ReLU-41             [2, 512, 4, 4]               0
           Conv2d-42             [2, 512, 4, 4]       4,719,104
             ReLU-43             [2, 512, 4, 4]               0
           Conv2d-44             [2, 512, 4, 4]       2,359,808
             ReLU-45             [2, 512, 4, 4]               0
  ConvTranspose2d-46             [2, 256, 8, 8]       2,097,152
             ReLU-47             [2, 256, 8, 8]               0
           Conv2d-48             [2, 256, 8, 8]       1,179,904
             ReLU-49             [2, 256, 8, 8]               0
           Conv2d-50             [2, 256, 8, 8]         590,080
             ReLU-51             [2, 256, 8, 8]               0
  ConvTranspose2d-52           [2, 128, 16, 16]         524,288
             ReLU-53           [2, 128, 16, 16]               0
           Conv2d-54           [2, 128, 16, 16]         295,040
             ReLU-55           [2, 128, 16, 16]               0
           Conv2d-56           [2, 128, 16, 16]         147,584
             ReLU-57           [2, 128, 16, 16]               0
  ConvTranspose2d-58            [2, 64, 32, 32]         131,072
             ReLU-59            [2, 64, 32, 32]               0
           Conv2d-60            [2, 64, 32, 32]          73,792
             ReLU-61            [2, 64, 32, 32]               0
           Conv2d-62            [2, 64, 32, 32]          36,928
             ReLU-63            [2, 64, 32, 32]               0
  ConvTranspose2d-64            [2, 32, 64, 64]          32,768
             ReLU-65            [2, 32, 64, 64]               0
           Conv2d-66            [2, 32, 64, 64]          18,464
             ReLU-67            [2, 32, 64, 64]               0
           Conv2d-68            [2, 32, 64, 64]           9,248
             ReLU-69            [2, 32, 64, 64]               0
  ConvTranspose2d-70          [2, 16, 128, 128]           8,192
             ReLU-71          [2, 16, 128, 128]               0
           Conv2d-72          [2, 16, 128, 128]           4,624
             ReLU-73          [2, 16, 128, 128]               0
           Conv2d-74          [2, 16, 128, 128]           2,320
             ReLU-75          [2, 16, 128, 128]               0
  ConvTranspose2d-76           [2, 8, 256, 256]           2,048
             ReLU-77           [2, 8, 256, 256]               0
           Conv2d-78           [2, 8, 256, 256]           1,160
             ReLU-79           [2, 8, 256, 256]               0
           Conv2d-80           [2, 8, 256, 256]             584
             ReLU-81           [2, 8, 256, 256]               0
           Conv2d-82           [2, 1, 256, 256]              73
          Sigmoid-83           [2, 1, 256, 256]               0
================================================================
Total params: 39,500,929
Trainable params: 39,500,929
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 1.50
Forward/backward pass size (MB): 164.97
Params size (MB): 150.68
Estimated Total Size (MB): 317.15
----------------------------------------------------------------
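The per-layer parameter counts above can be checked by hand: a Conv2d layer has out_channels * in_channels * k * k weights plus out_channels bias terms. A quick check against the first few rows (pure Python, no torch needed; the ConvTranspose2d count matches a 4x4 kernel without bias, which is an inference from the numbers, not something the table states):

```python
def conv2d_params(in_ch: int, out_ch: int, k: int, bias: bool = True) -> int:
    """Parameter count of a Conv2d/ConvTranspose2d layer with a k x k kernel."""
    return out_ch * in_ch * k * k + (out_ch if bias else 0)


# Conv2d-1: 3 -> 8 channels, 3x3 kernel, with bias
assert conv2d_params(3, 8, 3) == 224
# Conv2d-3: 8 -> 8 channels, 3x3 kernel, with bias
assert conv2d_params(8, 8, 3) == 584
# ConvTranspose2d-40: 1024 -> 512 channels; 8,388,608 matches k=4, no bias
assert conv2d_params(1024, 512, 4, bias=False) == 8_388_608
print("all counts match the table")
```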

2. Both the model and the device argument use cuda

from torchsummary import summary
from Unet import Unet

model = Unet().cuda()
summary(model=model, input_size=(3, 256, 256), batch_size=2)  # device defaults to "cuda"


# Result
# (identical to the CPU run above: the device only affects where the forward
#  pass runs, not the reported shapes or parameter counts)

# Notes on the output
# The summary lists each layer's output shape, its parameter count, the total
# parameter count, and the estimated model size.
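The size figures at the bottom of the table can also be reproduced by hand: torchsummary assumes 4-byte float32 values, so the input takes batch * C * H * W * 4 bytes and the parameters take total_params * 4 bytes. A sketch of that arithmetic (not torchsummary's actual code):

```python
MB = 1024 ** 2  # torchsummary reports sizes in mebibytes

# Input size: batch_size=2, shape (3, 256, 256), float32 (4 bytes each)
input_mb = 2 * 3 * 256 * 256 * 4 / MB
assert round(input_mb, 2) == 1.50

# Params size: 39,500,929 parameters, float32
params_mb = 39_500_929 * 4 / MB
assert round(params_mb, 2) == 150.68

print(f"input: {input_mb:.2f} MB, params: {params_mb:.2f} MB")
```

Both values match the "Input size (MB)" and "Params size (MB)" lines of the summary.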

Reposted from: blog.csdn.net/LIWEI940638093/article/details/126722031