TensorFlow implementation of an SE block for one-dimensional (1D) sequences

The Squeeze-and-Excitation (SE) module, introduced in SENet

import tensorflow as tf

def SE_Block(input_tensor, ratio=16):
    channels = input_tensor.shape[-1]
    # Squeeze: (batch, seq_len, channels) -> (batch, channels)
    squeeze = tf.keras.layers.GlobalAveragePooling1D()(input_tensor)
    # Excitation: bottleneck of two fully connected layers
    excitation = tf.keras.layers.Dense(units=channels // ratio, kernel_initializer='he_normal', activation='relu')(squeeze)
    excitation = tf.keras.layers.Dense(units=channels, activation='sigmoid')(excitation)
    # Multiply broadcasts the (batch, channels) weights across the sequence axis,
    # so an explicit Reshape to (1, channels) is not needed here
    scale = tf.keras.layers.Multiply()([input_tensor, excitation])
    return scale

# Build a small model around the SE block: input shape is (batch, 32, 352)
X = tf.keras.Input(shape=(32, 352))
Y = SE_Block(X, 16)
model = tf.keras.Model(inputs=[X], outputs=[Y])
model.summary()
print(Y.shape)  # (None, 32, 352)

In the one-dimensional case, the SE block input has shape (batch size, sequence length, number of channels).
ratio is the compression rate, here 16.
For example, with 352 input channels, the first fully connected layer reduces the channel dimension to 352/16 = 22,
and the second fully connected layer expands it back to 352.
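The squeeze-and-excitation steps above can be checked directly with plain TensorFlow ops, without the layer wrappers. This is a minimal sketch (the batch size 4 and sequence length 32 are arbitrary choices for illustration) showing how the shapes flow through the bottleneck:

```python
import tensorflow as tf

channels, ratio = 352, 16
x = tf.random.uniform((4, 32, channels))   # (batch, seq_len, channels)

# Squeeze: average over the sequence axis -> (batch, channels)
s = tf.reduce_mean(x, axis=1)

# Excitation: bottleneck 352 -> 22 -> 352, ending in per-channel weights in (0, 1)
w1 = tf.keras.layers.Dense(channels // ratio, activation='relu')
w2 = tf.keras.layers.Dense(channels, activation='sigmoid')
e = w2(w1(s))                              # (batch, channels)

# Scale: broadcast the channel weights across the sequence axis
y = x * e[:, None, :]                      # (batch, seq_len, channels)
print(s.shape, e.shape, y.shape)
```

Note that the output has the same shape as the input, so the block can be dropped into an existing 1D architecture between any two layers.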


Original post: blog.csdn.net/aa2962985/article/details/122746507