Transfer Learning, Finetuning, and Pre-training

Table of contents

1. Finetuning and Pre-training

2. Transfer Learning

 

1. Finetuning and Pre-training

In summary:

  • Pretraining: for a task 1, train the model on a large dataset A and save the trained weights W
  • Finetuning: for a task 2 similar to task 1, use W as the initialization or as a feature extractor for task 2, and train on a new dataset B
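The two steps above can be sketched in PyTorch. This is a minimal sketch with a hypothetical toy network (`Net`, `weights_W.pt`, and the layer sizes are illustrative, not from the original post); the actual training loops on datasets A and B are omitted:

```python
import torch
import torch.nn as nn

# Hypothetical small network standing in for a model trained on dataset A (task 1).
class Net(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.head(self.backbone(x))

# Pretraining: train on task 1 (training loop omitted) and save the weights W.
pretrained = Net(num_classes=10)
torch.save(pretrained.state_dict(), "weights_W.pt")

# Finetuning: initialize a model for task 2 from W, then swap in a new head
# because task 2 has a different label space.
finetuned = Net(num_classes=10)
finetuned.load_state_dict(torch.load("weights_W.pt"))
finetuned.head = nn.Linear(32, 5)

# From here one would train on dataset B with an ordinary training loop.
out = finetuned(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 5])
```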

Reference link: https://www.jianshu.com/p/330ee6e7ceda

2. Transfer Learning

"In fact, transfer learning is equivalent to pre-training + fine-tuning"

This section mainly draws on https://blog.csdn.net/weixin_43283397/article/details/104682811 . Below I summarize the points from that reference that I consider most important:

Definition: the ability of a system to recognize and apply knowledge and skills learned in previous domains/tasks to novel domains/tasks

Important concepts:

  • Domain: a specific field of application, e.g. thyroid ultrasound image analysis vs. cardiac ultrasound image analysis
  • Task: the concrete task to solve, e.g. a recognition task or a segmentation task

Three W's: What, How, When:

  • What to transfer?
  • How to transfer?

       - Instance-based
       - Feature-based
       - Parameter-based (shared parameters)

  • When to transfer?
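The feature-based route above can be sketched in PyTorch: a pretrained backbone is reused purely as a fixed feature extractor, and only a new classifier is trained on the target task. The module names and sizes here are hypothetical stand-ins for a real pretrained network:

```python
import torch
import torch.nn as nn

# Stand-in for pretrained layers; in practice this would be loaded from saved weights.
backbone = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
for p in backbone.parameters():
    p.requires_grad = False  # frozen: the features are transferred, not re-learned

# New classifier, trained from scratch on the target domain.
classifier = nn.Linear(32, 3)

x = torch.randn(8, 16)
with torch.no_grad():
    feats = backbone(x)   # extract transferred features
logits = classifier(feats)

# Only the classifier's parameters would be handed to the optimizer:
trainable = [p for p in classifier.parameters() if p.requires_grad]
print(len(trainable))  # weight and bias -> 2
```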

Two key factors affect transfer learning:

  • Optimization difficulty caused by splitting co-adapted layers apart when the network is cut at the transfer point
  • Performance loss caused by transferring higher-layer features that are specific to the source task

Which of the two factors dominates depends on where the transfer point sits: at the bottom, middle, or top of the network.
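The transfer position can be expressed as a freezing policy: transfer (freeze) everything below a chosen layer and finetune the rest. A minimal sketch, where `layers` and `set_transfer_point` are hypothetical names for illustration:

```python
import torch.nn as nn

# Hypothetical stack of layers standing in for a deep network.
layers = nn.ModuleList([nn.Linear(16, 16) for _ in range(6)])

def set_transfer_point(layers, n_frozen):
    """Freeze the first n_frozen layers (transferred); leave the rest trainable."""
    for i, layer in enumerate(layers):
        for p in layer.parameters():
            p.requires_grad = i >= n_frozen

# Transfer the bottom half of the network, finetune the top half.
set_transfer_point(layers, n_frozen=3)
frozen = sum(1 for layer in layers for p in layer.parameters() if not p.requires_grad)
print(frozen)  # 3 layers x (weight + bias) = 6 frozen tensors
```

Freezing near the bottom mainly risks the co-adaptation problem; freezing near the top mainly risks transferring task-specific features.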

 

I will continue to record and share the good learning materials I come across!


Origin blog.csdn.net/weixin_41698730/article/details/118001276