A TensorFlow implementation of the SE block for one-dimensional (1D) sequences

The Squeeze-and-Excitation (SE) block is the building unit of SENet.

import tensorflow as tf

def SE_Block(input_tensor, ratio=16):
    channels = input_tensor.shape[-1]
    # Squeeze: global average pooling over the time axis -> (batch, channels)
    squeeze = tf.keras.layers.GlobalAveragePooling1D()(input_tensor)
    # Excitation: bottleneck FC down to channels // ratio, then back up
    excitation = tf.keras.layers.Dense(units=channels // ratio,
                                       kernel_initializer='he_normal',
                                       activation='relu')(squeeze)
    excitation = tf.keras.layers.Dense(units=channels, activation='sigmoid')(excitation)
    # Reshape to (batch, 1, channels) so the weights broadcast along the time axis
    excitation = tf.keras.layers.Reshape((1, channels))(excitation)
    # Scale: reweight each channel of the input
    scale = tf.keras.layers.Multiply()([input_tensor, excitation])
    return scale

# Build a small model around the SE block: 32 time steps, 352 channels
X = tf.keras.Input(shape=(32, 352))
Y = SE_Block(X, 16)
model = tf.keras.Model(inputs=X, outputs=Y)
model.summary()
print(Y.shape)  # (None, 32, 352)
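To see what the layers compute, the block's forward pass can be mirrored in plain NumPy (a minimal sketch; the matrices `w1`, `b1`, `w2`, `b2` are random stand-ins for the trained Dense kernels and biases, not weights from the model above):

```python
import numpy as np

def se_block_numpy(x, w1, b1, w2, b2):
    """Mirror of SE_Block's forward pass; x has shape (batch, length, channels)."""
    squeeze = x.mean(axis=1)                               # squeeze: (batch, channels)
    hidden = np.maximum(squeeze @ w1 + b1, 0.0)            # ReLU bottleneck
    weights = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))    # sigmoid channel weights in (0, 1)
    return x * weights[:, None, :]                         # broadcast over the time axis

rng = np.random.default_rng(0)
batch, length, channels, ratio = 2, 32, 352, 16
x = rng.standard_normal((batch, length, channels))
w1 = rng.standard_normal((channels, channels // ratio)) * 0.1
b1 = np.zeros(channels // ratio)
w2 = rng.standard_normal((channels // ratio, channels)) * 0.1
b2 = np.zeros(channels)

out = se_block_numpy(x, w1, b1, w2, b2)
print(out.shape)  # (2, 32, 352)
```

Because the sigmoid weights lie in (0, 1), every channel of the output is a damped copy of the corresponding input channel, with the same scale applied at every time step.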

The SE block in the one-dimensional case:
The input has shape (batch size, sequence length, number of channels).
ratio is the reduction ratio, here 16.
For example, with 352 input channels, the first fully connected layer reduces the channel dimension to 352 // 16 = 22, and the second fully connected layer restores it to 352.
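The channel arithmetic can be checked directly, which also shows how ratio controls the parameter cost of the block (a sketch counting only the two Dense layers, weights plus biases):

```python
channels, ratio = 352, 16

# Bottleneck width of the first Dense layer
bottleneck = channels // ratio
print(bottleneck)  # 22

# Parameters of the two Dense layers: (in * out + biases) each
params = (channels * bottleneck + bottleneck) + (bottleneck * channels + channels)
print(params)  # 15862
```

A larger ratio shrinks the bottleneck and the parameter count, at the cost of a lower-capacity excitation.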


Reposted from blog.csdn.net/aa2962985/article/details/122746507