Miscellany | Which open source tools are currently available for knowledge distillation and transfer learning?


Author & Editor | Yan Yousan (言有三)

Source | 有三AI (ID: yanyousan_ai)

OVERVIEW Knowledge distillation and transfer learning are not only important model optimization techniques, but also important cross-cutting techniques for improving model generalization. So which open source tools for knowledge distillation and transfer learning are currently available?

1 PaddleSlim

PaddleSlim is a model optimization tool from Baidu, included in the PaddlePaddle framework. It supports several knowledge distillation algorithms and allows combined distillation losses to be attached between any layers of the teacher and student networks, including FSP loss, L2 loss, and softmax with cross-entropy loss.

https://github.com/PaddlePaddle/models/tree/develop/PaddleSlim
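
To make these losses concrete, here is a minimal framework-agnostic sketch of the FSP loss in PyTorch (an illustration of the idea, not PaddleSlim's API): the FSP matrix captures the "flow" between two feature maps as their channel-wise inner products, and the distillation loss is the L2 distance between the teacher's and student's FSP matrices.

```python
import torch
import torch.nn.functional as F

def fsp_matrix(feat_a, feat_b):
    # feat_a: (B, C1, H, W), feat_b: (B, C2, H, W); same spatial size assumed.
    b, c1, h, w = feat_a.shape
    c2 = feat_b.shape[1]
    a = feat_a.reshape(b, c1, h * w)
    bb = feat_b.reshape(b, c2, h * w)
    return torch.bmm(a, bb.transpose(1, 2)) / (h * w)  # (B, C1, C2)

def fsp_loss(t_feat_a, t_feat_b, s_feat_a, s_feat_b):
    # L2 distance between the teacher's and student's FSP matrices.
    return F.mse_loss(fsp_matrix(s_feat_a, s_feat_b),
                      fsp_matrix(t_feat_a, t_feat_b).detach())
```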

 

2 Distiller

Distiller is an open source model optimization tool for PyTorch from Intel, supporting the knowledge distillation algorithm proposed by Hinton et al., among other methods.

https://github.com/NervanaSystems/distiller
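
For reference, here is a minimal PyTorch sketch of the Hinton-style distillation loss (an illustration of the algorithm, not Distiller's own API): the student is trained to match the teacher's temperature-softened output distribution while still fitting the hard labels.

```python
import torch.nn.functional as F

def hinton_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-softened
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    # Hard-target term: ordinary cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```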

 

3 MXNet

MXNet's examples include a reproduction of the Bayesian Dark Knowledge method, which has seen use in recommendation systems.

https://github.com/apache/incubator-mxnet/tree/master/example/bayesian-methods
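
The core idea of Bayesian Dark Knowledge is to distill the predictive distribution of a Monte Carlo posterior (e.g., SGLD samples) into a single student network. Below is a rough PyTorch sketch of that objective, under the simplifying assumption that per-sample posterior logits are available as a list; this illustrates the idea only and is not the MXNet example's code.

```python
import torch
import torch.nn.functional as F

def bdk_student_loss(student_logits, posterior_sample_logits):
    # Average the predictive distribution over posterior samples to form
    # the "teacher", then fit the student to it with a KL objective.
    with torch.no_grad():
        teacher_probs = torch.stack(
            [F.softmax(l, dim=1) for l in posterior_sample_logits]).mean(dim=0)
    return F.kl_div(F.log_softmax(student_logits, dim=1),
                    teacher_probs, reduction="batchmean")
```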

 

4 Unofficial PyTorch projects

Knowledge-Distillation-Zoo is a knowledge distillation project by GitHub user AberHu; it supports multiple distillation methods such as FitNet (sketched below).

https://github.com/AberHu/Knowledge-Distillation-Zoo
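
As a taste of what such a zoo contains, here is a minimal sketch of the FitNets-style hint loss (a hypothetical class, not code from the repository): a learned 1x1-conv regressor projects the student's intermediate feature map to the teacher's channel count, and an L2 loss aligns the two.

```python
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    # FitNets hint: align a student's intermediate feature map with a
    # teacher's "hint" layer via a learned 1x1-conv regressor and L2 loss.
    # Assumes the two feature maps share the same spatial size.
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        self.regressor = nn.Conv2d(student_channels, teacher_channels,
                                   kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        return F.mse_loss(self.regressor(student_feat), teacher_feat.detach())
```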

deep-transfer-learning is a transfer learning project curated by GitHub user easezyc.

https://github.com/easezyc/deep-transfer-learning
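
Deep domain adaptation methods of the kind collected in such projects often align the source and target feature distributions, and a common building block is the maximum mean discrepancy (MMD). Here is a minimal sketch of the linear-kernel variant (an illustration, not code from the repository):

```python
import torch

def linear_mmd_loss(source_feats, target_feats):
    # Linear-kernel MMD: squared Euclidean distance between the mean
    # source and target feature vectors.
    # source_feats: (N, D), target_feats: (M, D).
    delta = source_feats.mean(dim=0) - target_feats.mean(dim=0)
    return torch.sum(delta * delta)
```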

 

5 Some classic model implementations

Hinton et al. proposed the classic knowledge distillation framework in 2015; below are reproductions in the major deep learning frameworks.

[1] Keras 

https://github.com/TropComplique/knowledge-distillation-keras

[2] TensorFlow

https://github.com/DushyantaDhyani/kdtf

[3] Caffe

https://github.com/wentianli/knowledge_distillation_caffe

More classic algorithms are listed below; they are left for readers to study on their own, so we won't indulge in hoarding bookmarks here.

[1] Relational Knowledge Distillation

https://github.com/lenscloth/RKD

[2] Teacher-Assistant Knowledge Distillation

https://github.com/imirzadeh/Teacher-Assistant-Knowledge-Distillation

[3] Contrastive Representation Distillation

https://github.com/HobbitLong/RepDistiller

[4] Zero-Shot Knowledge Distillation

https://github.com/sseung0703/Zero-shot_Knowledge_Distillation

[5] Net2Net

https://github.com/soumith/net2net.torch

[6] FitNets

https://github.com/adri-romsor/FitNets

 

6 Open source survey resources

Next, here are two survey-style open source resources.

The first is a knowledge distillation survey project, which collects related papers as well as implementations of some open source algorithms.

https://github.com/dkozlov/awesome-knowledge-distillation

The second is a transfer learning survey project, which collects related papers and open source code.

https://github.com/jindongwang/transferlearning

Other similar collections include https://github.com/artix41/awesome-transfer-learning and https://github.com/AI-ON/Multitask-and-Transfer-Learning, which resource collectors may want to follow.

 

7 Further theoretical study

If you want to study the theory of model optimization systematically, head over to the 有三AI Knowledge Planet (知识星球) -> 1000 Variations of Network Architectures -> Model Compression -> Model Pruning, Quantization and Distillation sections, where you will find interpretation case studies on knowledge distillation and transfer learning.

Summary

Compared with supervised learning on large amounts of data, knowledge distillation and transfer learning are closer to how humans learn; they meet real deployment needs in industry and carry significant research value in academia.

