ICCV 2023: Exploring Backbone pre-training (DreamTeacher) based on generative models

Table of contents

  • Preface
  • Related work
    • Discriminative Representation Learning
    • Generative Representation Learning
  • DreamTeacher framework introduction
    • Unsupervised Representation Learning
    • Label-Guided Representation Learning
  • Experiments
  • Summary
  • References

Preface

The paper we introduce this time was accepted at ICCV 2023 and is titled DreamTeacher: Pretraining Image Backbones with Deep Generative Models. I think it is a very strong and interesting piece of self-supervised work. DreamTeacher performs knowledge distillation from a pre-trained generative network into a target image backbone, serving as a general pre-training mechanism that requires no labels. The paper studies feature distillation and, where task-specific labels are available, label distillation. We will introduce both types of knowledge distillation in detail later.
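Before the detailed walkthrough, here is a minimal sketch of what feature distillation between a frozen generative teacher and a student backbone can look like. All names here (`FeatureRegressor`, the channel sizes, the single-layer projection) are illustrative assumptions, not the paper's actual architecture: the idea is simply to regress student feature maps onto the teacher's feature maps and minimize a squared-error loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureRegressor(nn.Module):
    """Hypothetical regressor mapping student features to the
    teacher's channel dimension (a 1x1 conv for illustration)."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.proj = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.proj(x)

def feature_distill_loss(student_feat, teacher_feat, regressor):
    # Project student features, then penalize the squared difference.
    # The teacher comes from a frozen generative model, so detach it.
    pred = regressor(student_feat)
    return F.mse_loss(pred, teacher_feat.detach())

# Toy usage: random tensors stand in for real feature maps.
student = torch.randn(2, 64, 16, 16)   # backbone feature map
teacher = torch.randn(2, 128, 16, 16)  # generative-model feature map
reg = FeatureRegressor(64, 128)
loss = feature_distill_loss(student, teacher, reg)
```

In this sketch the loss is a scalar that can be backpropagated through the regressor and the student backbone while the generative teacher stays fixed; label distillation, discussed later, additionally uses task labels when they exist.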

In fact, a diffusion-denoising self-supervised pre-training method has already been introduced on GiantPandaCV: DDeP, whose design is simple.

Reprinted from: blog.csdn.net/weixin_43838785/article/details/131941315