Roundup: The current state and future trends of transfer learning

The Definition of Transfer Learning

Data
  • Feature space
    whether the source domain and target domain feature spaces are consistent
  • Data availability
    whether the target domain dataset is available in the training phase, and how adequate it is
  • Balanced data
    whether the amount of data in each category is balanced
  • Sequential data
    whether the data is sequential
Label
  • Label availability
    whether the source domain and target domain labels are available
  • Label space
    whether the categories are consistent between the two datasets

Taxonomy from Different Views

By Problem Setting
  • Inductive TL
  • Transductive TL
  • Unsupervised TL
By Methodology
  • Instance-based transfer
    Instance-based transfer learning selects some instances from the source domain, in a weighted manner, to supplement the target domain data. The method is based on the assumption that "although the two domains differ, some instances in the source domain can still be used by the target domain, with suitable weights, to improve the model."



    In the figure, the light blue source domain instances are not similar to the target domain instances and should be excluded, while the dark blue ones are similar to the target domain and should be added to the target domain training set with suitable weights. A minimal code sketch of this idea follows.
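
    A minimal sketch of this weighting idea, assuming scikit-learn and toy data (the names X_src, y_src, X_tgt and the logistic-regression domain classifier are illustrative assumptions, not from the surveys): source instances that the domain classifier scores as target-like receive large weights, while dissimilar ones are weighted towards zero.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def importance_weights(X_src, X_tgt):
          # Domain classifier: label 0 = source instance, 1 = target instance.
          X = np.vstack([X_src, X_tgt])
          d = np.concatenate([np.zeros(len(X_src)), np.ones(len(X_tgt))])
          domain_clf = LogisticRegression(max_iter=1000).fit(X, d)
          # w(x) = p(target | x) / p(source | x): "dark blue" instances in the
          # figure get large weights, "light blue" ones weights near zero.
          p_tgt = domain_clf.predict_proba(X_src)[:, 1]
          return p_tgt / np.clip(1.0 - p_tgt, 1e-6, None)

      # Toy data: the target domain is a shifted version of the source domain.
      rng = np.random.RandomState(0)
      X_src = rng.normal(0.0, 1.0, (500, 5))
      y_src = (X_src.sum(axis=1) > 0).astype(int)
      X_tgt = rng.normal(0.5, 1.0, (50, 5))

      # Weighted source instances supplement the scarce target data.
      w = importance_weights(X_src, X_tgt)
      model = LogisticRegression(max_iter=1000).fit(X_src, y_src, sample_weight=w)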

  • Feature representation based transfer
    • Asymmetric feature-based transfer learning
    • Symmetric feature-based transfer learning
      The source domain and target domain features only partially overlap (many features appear in just one of the domains). The method learns a good feature representation to reduce the gap between the domains and, ultimately, the learning error.
      Adversarial-based deep transfer learning methods aim to find, by introducing adversarial learning techniques, a feature representation suitable for both the source domain and the target domain. The method is based on the assumption that "for effective transfer, a good feature representation should be discriminative for the main learning task, while being indistinguishable between the source domain and the target domain."



      After training on a large amount of source domain data, the front layers of the network can be regarded as a feature extractor. The feature extractor extracts features from both domains and feeds them into the adversarial network, which tries to distinguish which domain each feature comes from. If the adversarial network finds the features hard to distinguish, the features of the two domains differ little and transfer well, and vice versa; a minimal sketch of this setup follows below.
      In recent years, adversarial-based deep transfer learning methods have been widely studied owing to their good performance and practicality.
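
      A minimal sketch of this adversarial setup, assuming PyTorch (the layer sizes and variable names are illustrative): a gradient reversal layer trains the feature extractor to fool the domain discriminator while the label head stays discriminative for the main task, in the spirit of domain adversarial neural networks.

        import torch
        import torch.nn as nn

        class GradReverse(torch.autograd.Function):
            """Identity in the forward pass; reverses gradients in the backward pass."""
            @staticmethod
            def forward(ctx, x, lambd):
                ctx.lambd = lambd
                return x.view_as(x)

            @staticmethod
            def backward(ctx, grad_output):
                return -ctx.lambd * grad_output, None

        feature_extractor = nn.Sequential(nn.Linear(20, 64), nn.ReLU())
        label_head = nn.Linear(64, 2)   # main-task classifier
        domain_head = nn.Linear(64, 2)  # adversarial network

        def dann_loss(x_src, y_src, x_tgt, lambd=1.0):
            ce = nn.CrossEntropyLoss()
            f_src, f_tgt = feature_extractor(x_src), feature_extractor(x_tgt)
            # Main task: supervised loss on labelled source data only.
            task_loss = ce(label_head(f_src), y_src)
            # The domain head tries to tell the two domains apart; the
            # reversed gradient pushes the extractor to make that hard.
            f_all = torch.cat([GradReverse.apply(f_src, lambd),
                               GradReverse.apply(f_tgt, lambd)])
            d_all = torch.cat([torch.zeros(len(x_src)), torch.ones(len(x_tgt))]).long()
            return task_loss + ce(domain_head(f_all), d_all)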

  • Parameter-based transfer learning approaches
    The source domain and target domain tasks share some model parameters, or their model parameters follow the same prior distribution.
    These methods aim to reuse part of a model pre-trained on the source domain, including its network structure and connection parameters; that is, part of the pre-trained model's structure and parameters are transferred and then trained on the target domain. The method is based on the assumption that "like the processing mechanism of the human brain, a neural network is a continuous, iterative process of abstraction; its front layers can be seen as a feature extractor whose extracted features are rich and versatile." A minimal fine-tuning sketch follows.


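    A minimal fine-tuning sketch, assuming PyTorch/torchvision and a hypothetical 10-class target task: the pre-trained front layers are reused as a frozen feature extractor, and only a new task head is trained.

      import torch
      import torch.nn as nn
      from torchvision import models

      # Reuse the structure and connection parameters pre-trained on ImageNet.
      model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

      # Freeze the transferred front layers: they act as the feature extractor.
      for p in model.parameters():
          p.requires_grad = False

      # Replace the head for the target task; only this layer trains from scratch.
      model.fc = nn.Linear(model.fc.in_features, 10)

      optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

    Once the new head converges, the last few pre-trained blocks can optionally be unfrozen and fine-tuned at a lower learning rate.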
  • Relational Transfer Learning Approaches
    Instances from the source domain and the target domain are mapped into a new data space in which instances from the two domains look similar and suit a shared deep neural network. The method is based on the assumption that "although the source domain and the target domain differ, they can be more similar in a well-designed new data space." A minimal sketch based on an MMD distance follows.


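    A minimal sketch of the mapping idea, assuming PyTorch and toy dimensions; the RBF maximum mean discrepancy (MMD) used to measure how similar the two domains look in the new space is a common choice and an assumption here, not prescribed by the surveys.

      import torch
      import torch.nn as nn

      def mmd_rbf(a, b, sigma=1.0):
          # Squared MMD with a Gaussian kernel: small values mean the mapped
          # source and target instances are hard to tell apart.
          def k(x, y):
              return torch.exp(-torch.cdist(x, y).pow(2) / (2 * sigma ** 2))
          return k(a, a).mean() + k(b, b).mean() - 2 * k(a, b).mean()

      # The mapper projects both domains into the new, shared data space.
      mapper = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 16))

      x_src = torch.randn(64, 20)
      x_tgt = torch.randn(64, 20) + 0.5
      domain_gap = mmd_rbf(mapper(x_src), mapper(x_tgt))
      # In training, the total loss would be task_loss + lambda * domain_gap.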
  • Hybrid-based (instance and parameter) transfer learning
By Domain Characteristics
  • Homogeneous feature spaces and label spaces
    the same feature space and label space, but different data distributions (e.g., last year's data vs. the last three months' data)
    • Labelled Target Dataset
    • Labelled Plus Unlabelled Target Dataset
    • Unlabelled Target Dataset
    • Imbalanced Unlabelled Target Dataset
    • Sequential Labelled Target Data
    • Sequential Unlabelled Target Data
    • Unavailable Target Data
  • Heterogeneous label spaces
    the same feature space, but a different label space (a different task: same features, different labels)
    • Labelled Target Dataset
    • Unlabelled Target Dataset
    • Sequential Labelled Target Data
    • Unavailable Target Data
    • Unlabelled Source Dataset
  • Heterogeneous feature spaces
    the same label space, but a different feature space (e.g., clothing reviews vs. beauty product reviews: different domains, same labels)
    • Labelled Target Dataset
    • Labelled Plus Unlabelled Target Dataset
    • Unlabelled Target Dataset
  • Heterogeneous feature spaces and label spaces
    both the feature space and the label space differ (e.g., the garment industry vs. the beauty industry)
    • Labelled Target Dataset
    • Sequential Labelled Target Data
By Scenario and Algorithm
  • General Transfer Learning
  • Domain Adaptation
  • Domain Generalization
  • Multi-source Transfer Learning
  • Zero-shot / Few-shot Learning
  • Deep Transfer Learning
  • Multi-task Learning
  • Transfer Reinforcement Learning
  • Transfer Metric Learning
  • Transitive Transfer Learning
  • Lifelong Learning
  • Negative Transfer

Conclusion

Future studies need to focus on several issues.
First, how to avoid negative transfer is an open question. Most transfer learning algorithms assume that the source domain and the target domain are related in some way; once this assumption fails, negative transfer may occur, and the result can be even worse than not transferring at all. Ensuring that negative transfer does not occur is therefore a very important issue in transfer learning research. To avoid it, we should study early in our experiments how well knowledge transfers between the source domain and task and the target domain and task, and, based on an appropriate transferability measure, select relevant source domains and tasks from which to extract knowledge for learning the target task. To define transferability between domains and tasks, we need a standard measure of similarity between domains or tasks; based on such a distance measure, we could cluster domains or tasks, which might help quantify transferability (a minimal sketch of one such measure follows). A related problem: if transfer is unsuitable for an entire domain, can we still transfer part of its knowledge?
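As one concrete example of such a measure, the sketch below computes the proxy A-distance, a standard heuristic from the domain adaptation literature (the data and names here are illustrative assumptions): a classifier is trained to separate the two domains, and the harder that is, the closer the domains and the more promising the transfer.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    def proxy_a_distance(X_src, X_tgt):
        # Train a domain classifier; its accuracy reflects how separable
        # (i.e., how dissimilar) the two domains are.
        X = np.vstack([X_src, X_tgt])
        d = np.concatenate([np.zeros(len(X_src)), np.ones(len(X_tgt))])
        acc = cross_val_score(LogisticRegression(max_iter=1000), X, d, cv=5).mean()
        err = 1.0 - acc
        return 2.0 * (1.0 - 2.0 * err)  # ~0: indistinguishable; ~2: fully separable

    rng = np.random.RandomState(0)
    near = proxy_a_distance(rng.normal(0, 1, (200, 5)), rng.normal(0.1, 1, (200, 5)))
    far = proxy_a_distance(rng.normal(0, 1, (200, 5)), rng.normal(3.0, 1, (200, 5)))
    # 'far' comes out much larger than 'near': a warning sign that transfer
    # from that source is more likely to be negative.
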
Second, most transfer learning algorithms focus on how to improve the generalization of a model when the source domain and target domain have different distributions. To realize this, they assume that the feature spaces of the source domain and the target domain are the same; in many practical applications, however, we may want to transfer knowledge across domains with different feature spaces. This class of algorithms is usually called heterogeneous transfer learning.
Deep transfer learning methods fall into four categories: instance-based, mapping-based, network-based, and adversarial-based deep transfer learning. In most practical applications, these methods are combined to obtain a better-performing model. A large share of current transfer learning research focuses on supervised learning; in the future, deep transfer learning may attract more and more attention in unsupervised and semi-supervised settings. As in traditional transfer learning, negative transfer and measures of transferability are critical issues, and they deserve more attention in deep transfer learning research as well. It is foreseeable that, with the rapid development of deep learning, deep transfer learning will be widely applied to many challenging problems.

References

  • A Survey on Deep Transfer Learning
  • Domain adversarial neural networks
  • A Survey on Transfer Learning

Origin blog.csdn.net/weixin_34292959/article/details/90892131