Knowledge Collection on Pretrained Models (Continuously Updated)

Articles on text classification with pretrained models:

https://www.cnblogs.com/jiangxinyang/p/10241243.html

https://www.cnblogs.com/zhouxiaosong/p/11384197.html

Engineering code for quickly adapting BERT to downstream tasks: https://github.com/jiangxinyang227/bert-for-task
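
For reference, a minimal sketch of what "adapting a pretrained model to a downstream classification task" looks like. It uses the Hugging Face transformers library, a hypothetical `bert-base-chinese` checkpoint, and toy sentences purely for illustration; the linked repository ships its own code and is not assumed to work this way.

```python
# Minimal fine-tuning sketch (assumption: Hugging Face transformers + PyTorch,
# not the code from the linked repository).
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Illustrative checkpoint and label count.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertForSequenceClassification.from_pretrained("bert-base-chinese", num_labels=2)

texts = ["这部电影很好看", "完全是浪费时间"]   # toy examples
labels = torch.tensor([1, 0])                    # 1 = positive, 0 = negative

# Tokenize into padded input_ids / attention_mask tensors.
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# One optimization step: passing labels makes the model return a classification loss.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()
```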

Lightweight BERT (ALBERT): https://github.com/google-research/ALBERT

https://baijiahao.baidu.com/s?id=1654588517875312379&wfr=spider&for=pc
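
As a quick illustration of using ALBERT as a smaller drop-in replacement for BERT, here is a minimal inference sketch. It assumes the Hugging Face transformers port and the `albert-base-v2` checkpoint rather than the TensorFlow code in the google-research/ALBERT repository linked above.

```python
# Minimal ALBERT inference sketch (assumption: Hugging Face transformers port,
# not the linked TensorFlow repository).
import torch
from transformers import AlbertTokenizer, AlbertForSequenceClassification

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
model = AlbertForSequenceClassification.from_pretrained("albert-base-v2", num_labels=2)

# Single-sentence inference; logits shape is (1, num_labels).
inputs = tokenizer("ALBERT shares parameters across layers to stay small.",
                   return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1))
```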


Reposted from www.cnblogs.com/demo-deng/p/12203239.html