SENet attention mechanism - PyTorch implementation

Paper: Squeeze-and-Excitation Networks

Purpose of SENet:

Let the network pay more attention to the important channels of a feature map.

The method of SENet:

Insert the SEblock channel attention module into the network.

The structure of SEblock:

[Figure: structure of the SEblock]

Use of SEblock:

SEblock is a plug-and-play module that can in principle be attached to any feature layer. The original paper shows how it is used inside the Inception module and the residual module; a sketch of the residual usage follows the implementation below.
[Figure: SE-Inception module and SE-ResNet module]

import torch
import torch.nn as nn


class SEblock(nn.Module):  # Squeeze-and-Excitation block
    def __init__(self, channels, ratio=16):
        super(SEblock, self).__init__()
        hidden_channels = channels // ratio  # bottleneck width; the paper sets the reduction ratio to 16
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),  # squeeze: global average pooling to 1x1
            nn.Conv2d(channels, hidden_channels, 1, 1, 0),  # 1x1 conv in place of a linear layer
            nn.ReLU(),
            nn.Conv2d(hidden_channels, channels, 1, 1, 0),  # 1x1 conv in place of a linear layer
            nn.Sigmoid()  # squash the channel weights into (0, 1)
        )

    def forward(self, x):
        weights = self.attn(x)  # per-channel importance weights in (0, 1), the s_c of the paper
        return weights * x  # rescale the input feature map channel by channel
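
As a quick sanity check of the block above (a minimal sketch; the batch size, channel count, and spatial resolution are arbitrary), the output has the same shape as the input, only the channels are rescaled:

if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)  # arbitrary example: batch of 2, 64 channels, 32x32 feature map
    se = SEblock(channels=64, ratio=16)
    y = se(x)
    print(y.shape)  # torch.Size([2, 64, 32, 32]) - same shape as the input, channels rescaled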
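
To illustrate the plug-and-play usage mentioned above, here is a minimal sketch of an SE-ResNet-style residual block: the SEblock rescales the residual branch before the identity shortcut is added. The class name SEResidualBlock and the two-3x3-conv layout are illustrative assumptions, not code from the paper.

class SEResidualBlock(nn.Module):  # illustrative sketch of the SE-ResNet usage pattern
    def __init__(self, channels, ratio=16):
        super(SEResidualBlock, self).__init__()
        self.residual = nn.Sequential(
            nn.Conv2d(channels, channels, 3, 1, 1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
            nn.Conv2d(channels, channels, 3, 1, 1),
            nn.BatchNorm2d(channels),
        )
        self.se = SEblock(channels, ratio)  # channel attention applied to the residual branch
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.residual(x)
        out = self.se(out)  # rescale channels before the skip connection is added
        return self.relu(out + x)  # identity shortcut, then activation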

Origin: blog.csdn.net/Peach_____/article/details/128677412