Vision Transformer (ViT), DETR

Work through the details with the teacher's videos and blog posts.

Contents

Vision Transformer (ViT)

DETR


Vision Transformer (ViT)

ViT full-process notes, with detailed code explanation - AI Studio's blog - CSDN blog (vit code)

Things to think about

For model interpretation, go through the reference links below.

Inductive bias: ViT has far weaker image-specific inductive biases than a CNN (only the patch splitting and position embeddings encode any 2D structure), so it relies on large-scale pretraining to compete.
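To make the weak-inductive-bias point concrete, here is a minimal PyTorch sketch of ViT's patch embedding, essentially its only image-specific prior (class token and position embeddings omitted; the `PatchEmbed` name and the ViT-Base sizes are illustrative, not taken from a specific codebase):

```python
import torch
import torch.nn as nn

class PatchEmbed(nn.Module):
    """Split an image into fixed-size patches and linearly project each one."""
    def __init__(self, img_size=224, patch_size=16, in_chans=3, embed_dim=768):
        super().__init__()
        self.num_patches = (img_size // patch_size) ** 2
        # A stride-p conv is equivalent to "cut into p x p patches, then Linear".
        self.proj = nn.Conv2d(in_chans, embed_dim,
                              kernel_size=patch_size, stride=patch_size)

    def forward(self, x):                      # x: (B, C, H, W)
        x = self.proj(x)                       # (B, D, H/p, W/p)
        return x.flatten(2).transpose(1, 2)    # (B, N, D) token sequence

x = torch.randn(2, 3, 224, 224)
tokens = PatchEmbed()(x)
print(tokens.shape)  # torch.Size([2, 196, 768])
```

Everything after this step is a plain transformer encoder with no knowledge that the tokens came from an image.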

MAE link:

Self-supervised pretraining (as in MAE) doesn't need people to label the dataset; that's excellent work.
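A sketch of why no labels are needed: MAE's pretext task just hides random patches and reconstructs them from the pixels themselves. A minimal version of the random-masking step (the 0.75 default follows the MAE paper; the function is an illustrative rewrite, not the official code):

```python
import torch

def random_masking(x, mask_ratio=0.75):
    """Keep a random (1 - mask_ratio) subset of patch tokens, per sample.
    x: (B, N, D) patch tokens. Returns kept tokens and a 0/1 mask (1 = masked)."""
    B, N, D = x.shape
    len_keep = int(N * (1 - mask_ratio))
    noise = torch.rand(B, N)                      # random score per patch
    ids_shuffle = noise.argsort(dim=1)            # lowest scores are kept
    ids_keep = ids_shuffle[:, :len_keep]          # (B, len_keep)
    x_visible = torch.gather(x, 1, ids_keep.unsqueeze(-1).expand(-1, -1, D))
    mask = torch.ones(B, N)
    mask.scatter_(1, ids_keep, 0.0)               # mark kept positions with 0
    return x_visible, mask

x = torch.randn(2, 196, 8)                        # toy patch tokens
visible, mask = random_masking(x)                 # visible: (2, 49, 8)
```

Only the visible 25% of tokens go through the encoder; the reconstruction loss on the masked patches is the training signal, so no human annotation is involved.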

DETR

DETR and ViT

What is the difference and connection between DETR and ViT, and can they be integrated together? - Zhihu
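One way to see the connection: both run a transformer over image-derived tokens, but DETR adds a decoder with learned object queries that each emit one detection. A toy sketch of that decoder-side idea (hyperparameters and the `MiniDETR` name are illustrative; real DETR also has positional encodings, an MLP box head, and Hungarian matching for the loss):

```python
import torch
import torch.nn as nn

class MiniDETR(nn.Module):
    """Toy DETR-style head: flattened backbone features are encoded, then
    N learned object queries cross-attend to them; each query predicts
    one (class, box) pair."""
    def __init__(self, d_model=256, num_queries=100, num_classes=91):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(num_queries, d_model))
        self.transformer = nn.Transformer(d_model, nhead=8,
                                          num_encoder_layers=2,
                                          num_decoder_layers=2,
                                          dim_feedforward=512,
                                          batch_first=True)
        self.class_head = nn.Linear(d_model, num_classes + 1)  # +1: "no object"
        self.box_head = nn.Linear(d_model, 4)                  # (cx, cy, w, h)

    def forward(self, feats):                   # feats: (B, HW, d_model)
        B = feats.size(0)
        q = self.queries.unsqueeze(0).expand(B, -1, -1)
        hs = self.transformer(feats, q)         # (B, num_queries, d_model)
        return self.class_head(hs), self.box_head(hs).sigmoid()

feats = torch.randn(2, 49, 256)   # e.g. a 7x7 backbone feature map, flattened
logits, boxes = MiniDETR()(feats)
```

Integration is natural: the `feats` here come from a CNN backbone in the original DETR, but a ViT encoder producing the same (B, N, D) token layout can be swapped in.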

References

Vision Transformer (ViT) PyTorch code full analysis (with illustration) - Adenialzz's blog - CSDN blog - vit-pytorch

// simple version 

ViT: Visual Transformer backbone network, ViT paper and code detailed explanation - louwill12's blog - CSDN Blog

 //full version

Neural network learning small record 67 - detailed explanation of the reproduction of the Pytorch version of the Vision Transformer (VIT) model - Programmer Sought

11.1 Vision Transformer (ViT) network detailed explanation - CSDN blog - https://blog.csdn.net/lgzlgz3102/article/details/109140622

//Overall supplement, very detailed

Intensive reading of the ViT paper paragraph by paragraph [Intensive reading of the paper] - bilibili  //Supporting video

Intensive reading of the ViT paper paragraph by paragraph [Intensive reading of the paper] - bilibili.com  //Supporting notes

Explanation of each module of the YOLO series

A complete explanation of the core basic knowledge of YOLOv5 in the YOLO series - Zhihu (zhihu.com)

// Stacked blocks of Multi-Head Self-Attention (MSA) and Multi-Layer Perceptron (MLP)

Vision MLP Super Detailed Interpretation (Principle Analysis + Code Interpretation) (4) - Zhihu (zhihu.com)

ViT full-process notes, with detailed code explanation - AI Studio's blog - CSDN blog (vit code)

//encoder, the handwritten explanation is very clear 

BatchNormalization, LayerNormalization, InstanceNorm, GroupNorm, SwitchableNorm Summary - Charlotte's Web Blog - CSDN Blog (layernorm and batchnorm)

//Why choose layer normalization 
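On the "why LayerNorm" question in the last reference: LN computes statistics per token over the feature dimension, so it is independent of batch size and of how many tokens other samples have, while BN shares statistics across the whole batch. A small sketch of the difference (the shapes are illustrative ViT activations):

```python
import torch

x = torch.randn(4, 196, 768)   # (batch, tokens, features), ViT-style activations

# LayerNorm: mean/var over the feature dim of each token independently,
# so every token is standardized regardless of batch size or sequence length.
ln = (x - x.mean(-1, keepdim=True)) / x.std(-1, unbiased=False, keepdim=True)

# BatchNorm (per feature channel): mean/var over the batch and token dims,
# coupling samples together, which is awkward for variable-length sequences.
bn = (x - x.mean((0, 1), keepdim=True)) / x.std((0, 1), unbiased=False, keepdim=True)
```

This is why transformers (ViT included) use LN inside each block rather than BN.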

Origin blog.csdn.net/weixin_43332715/article/details/123810530