AAAI 2021 Distillation Paper Review

Introduction

  • AAAI paper list: https://aaai.org/Conferences/AAAI-21/wp-content/uploads/2020/12/AAAI-21_Accepted-Paper-List.Main_.Technical.Track_.pdf
  • Overall: Feature-map distillation is very common. In the past, many distillation methods hand-picked which feature maps to distill. Among the AAAI 2021 accepted papers there are three feature-map distillation papers, all built on the idea of automatically selecting feature maps/attention. A trend can be seen here: intermediate-layer features are a research hotspot, but how to automatically mine the information in intermediate-layer features still needs further exploration (a minimal sketch of the idea follows this list).
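
To make the "automatic selection" idea concrete, below is a minimal PyTorch sketch in the spirit of cross-layer distillation (e.g., Cross-Layer Distillation with Semantic Calibration): instead of a fixed one-to-one layer pairing, each student feature map learns softmax attention weights over all candidate teacher feature maps, and the distillation loss is the attention-weighted MSE between projected features. The class name CrossLayerFeatureDistiller and all hyperparameters are illustrative assumptions, not code from any of the listed papers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossLayerFeatureDistiller(nn.Module):
    """Toy cross-layer feature distiller: each student stage attends over
    all candidate teacher stages instead of using a fixed hand-picked pairing.
    Illustrative sketch only, not the actual code of any cited paper."""

    def __init__(self, s_channels, t_channels, embed_dim=128):
        super().__init__()
        # 1x1 convolutions project student/teacher maps to a shared embedding dim
        self.s_proj = nn.ModuleList([nn.Conv2d(c, embed_dim, 1) for c in s_channels])
        self.t_proj = nn.ModuleList([nn.Conv2d(c, embed_dim, 1) for c in t_channels])

    def forward(self, s_feats, t_feats):
        total = 0.0
        for i, s in enumerate(s_feats):
            s_emb = self.s_proj[i](s)                # (B, D, h, w)
            s_vec = s_emb.mean(dim=(2, 3))           # global pooled descriptor (B, D)
            t_embs, sims = [], []
            for j, t in enumerate(t_feats):
                t_emb = self.t_proj[j](t.detach())   # stop gradients into the teacher
                # align the teacher map to the student's spatial size
                t_emb = F.adaptive_avg_pool2d(t_emb, tuple(s_emb.shape[-2:]))
                t_embs.append(t_emb)
                # similarity between pooled student and teacher descriptors
                sims.append((s_vec * t_emb.mean(dim=(2, 3))).sum(dim=1))  # (B,)
            # per-sample attention over candidate teacher layers
            attn = torch.softmax(torch.stack(sims, dim=1), dim=1)  # (B, num_t_layers)
            for j, t_emb in enumerate(t_embs):
                per_sample = F.mse_loss(s_emb, t_emb, reduction="none").mean(dim=(1, 2, 3))
                total = total + (attn[:, j] * per_sample).mean()  # attention-weighted MSE
        return total


# toy usage: two student stages, three candidate teacher stages
s_feats = [torch.randn(4, 32, 16, 16), torch.randn(4, 64, 8, 8)]
t_feats = [torch.randn(4, 64, 16, 16), torch.randn(4, 128, 8, 8), torch.randn(4, 256, 4, 4)]
distiller = CrossLayerFeatureDistiller(s_channels=[32, 64], t_channels=[64, 128, 256])
kd_loss = distiller(s_feats, t_feats)  # added to the task loss during student training
```

The point of the sketch is that the layer assignment is learned per sample from the data (via the softmax attention) rather than fixed by hand, which is the common thread of the feature-map distillation papers below.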

List of papers related to distillation

  • PSSM-Distil: Protein Secondary Structure Prediction (PSSP) on Low-Quality PSSM by Knowledge Distillation with Contrastive Learning

  • LRC-BERT: Latent-Representation Contrastive Knowledge Distillation for Natural Language Understanding

  • Peer Collaborative Learning for Online Knowledge Distillation

  • Cross-Layer Distillation with Semantic Calibration

  • Few-Shot Class-Incremental Learning via Relation Knowledge Distillation

  • Harmoniz
