CVPR 2021 Knowledge Distillation Article Review

Foreword

This article summarizes several papers on knowledge distillation from CVPR 2021, in the hope of inspiring future work.

All of the papers are available at: https://openaccess.thecvf.com/CVPR2021 .

Distilling Knowledge via Knowledge Review

Complementary Relation Contrastive Distillation

  • Paper address: https://openaccess.thecvf.com/content/CVPR2021/papers/Zhu_Complementary_Relation_Contrastive_Distillation_CVPR_2021_paper.pdf
  • Code address: -
  • Earlier knowledge distillation methods distill either individual samples or the pairwise similarity between samples. In this paper, the authors argue that the relations between samples carry rich information that can be distilled further. They therefore propose CRCD, in which the teacher's inter-sample relations are transferred to the student through contrastive learning (a minimal sketch follows this list).
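To make the idea concrete, below is a minimal PyTorch sketch of relation-based contrastive distillation. It is an illustration under stated assumptions, not the paper's implementation: the cosine-similarity relation matrix, the InfoNCE-style loss, and the temperature `tau` are all assumed here, and CRCD's use of anchor gradients is omitted.

```python
# A minimal sketch of relation-based contrastive distillation in PyTorch.
# NOT the authors' code: the cosine-similarity relation, the InfoNCE-style
# loss, and the temperature `tau` are illustrative assumptions, and CRCD's
# anchor-gradient term is omitted entirely.
import torch
import torch.nn.functional as F

def relation_matrix(feats: torch.Tensor) -> torch.Tensor:
    """Pairwise cosine-similarity relations between the samples in a batch."""
    feats = F.normalize(feats, dim=1)   # (B, D) rows scaled to unit norm
    return feats @ feats.t()            # (B, B) inter-sample relation matrix

def relation_contrastive_loss(student_feats: torch.Tensor,
                              teacher_feats: torch.Tensor,
                              tau: float = 0.1) -> torch.Tensor:
    """Contrast student relations against teacher relations: each student
    relation is pulled toward its matching teacher relation (positive) and
    pushed away from all other teacher relations (negatives)."""
    rs = relation_matrix(student_feats).flatten()   # (B*B,) student relations
    rt = relation_matrix(teacher_feats).flatten()   # (B*B,) teacher relations
    # Similarity logits: closer student/teacher relation pairs score higher.
    logits = -((rs.unsqueeze(1) - rt.unsqueeze(0)) ** 2) / tau
    labels = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, labels)

# Example usage with random features standing in for backbone outputs:
student = torch.randn(8, 128)
teacher = torch.randn(8, 128)
loss = relation_contrastive_loss(student, teacher.detach())  # teacher frozen
print(loss.item())
```

Note that the teacher features are detached so that gradients flow only through the student, which is the usual setup in distillation.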


Reprinted from: blog.csdn.net/u012526003/article/details/121325019