YOLOv5 Improvement | 2023 Attention Chapter | iRMB: Inverted Residual Mobile Block Attention (a Lightweight Attention Mechanism)

1. Introduction to This Article

The improvement introduced in this article is iRMB (inverted Residual Mobile Block), proposed in the paper Rethinking Mobile Block for Efficient Attention-based Models. The paper also proposes a new backbone network, EMO (I will cover how to use that backbone in a later article; here we focus on the attention mechanism itself). The core idea is to combine a lightweight CNN architecture with an attention-based structure, somewhat similar to ACmix. I combined iRMB with C2f and also tried applying it to the detection head, obtaining three sets of results. The effects differ between experiments, but every configuration showed some improvement. The mechanism is also relatively lightweight: its parameter count is small and training is fast. Later in the article I will walk through each way of adding it, so that you can reproduce it in your own models.
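To make the "inverted residual + attention" idea concrete, here is a minimal PyTorch sketch of an iRMB-style block. This is an illustrative simplification, not the official EMO implementation: the paper's block uses an expanded-window multi-head self-attention (EW-MHSA) with depthwise-convolution value projection, whereas this sketch substitutes standard `nn.MultiheadAttention` over the flattened feature map, followed by the familiar expand / depthwise-conv / project structure of an inverted residual block. The class name `iRMB` and all hyperparameters here are assumptions for demonstration.

```python
import torch
import torch.nn as nn


class iRMB(nn.Module):
    """Simplified iRMB-style block: attention + inverted residual (illustrative sketch)."""

    def __init__(self, dim: int, expansion: int = 2, num_heads: int = 4):
        super().__init__()
        hidden = dim * expansion
        self.norm = nn.BatchNorm2d(dim)
        # Stand-in for the paper's EW-MHSA: plain self-attention over all positions
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Inverted residual: 1x1 expand -> 3x3 depthwise -> 1x1 project
        self.expand = nn.Conv2d(dim, hidden, kernel_size=1)
        self.dw = nn.Conv2d(hidden, hidden, kernel_size=3, padding=1, groups=hidden)
        self.act = nn.SiLU()
        self.project = nn.Conv2d(hidden, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        y = self.norm(x)
        # Flatten spatial dims so each pixel becomes a token: (B, H*W, C)
        seq = y.flatten(2).transpose(1, 2)
        seq, _ = self.attn(seq, seq, seq)
        y = seq.transpose(1, 2).reshape(b, c, h, w)
        # Lightweight CNN path of the inverted residual
        y = self.project(self.act(self.dw(self.act(self.expand(y)))))
        return x + y  # residual connection keeps input/output shapes identical
```

Because the block preserves the input shape `(B, C, H, W)`, it can be dropped into a YOLOv5 feature pyramid between existing layers; the real global attention here is O((H·W)^2), which is why the paper restricts attention to windows for efficiency.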

Recommendation index: ⭐⭐⭐⭐⭐

Source: blog.csdn.net/java1314777/article/details/135443387