
A Summary of BERT-Related Papers, Articles, and Code Resources

BERT has been red-hot lately. Riding the wave, this post collects related resources: paper interpretations, code, and articles.

1. Google official:

1) BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

It all started in October, when Google released this paper and instantly set the entire AI community, media included, abuzz: https://arxiv.org/abs/1810.04805

2) Github: https://github.com/google-research/bert

In November, Google open-sourced the code and pre-trained models, setting off another round of excitement.

3) Google AI Blog: Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing

2. Third-party interpretations:
1) Dr. Zhang Junlin's interpretation on Zhihu: From Word Embedding to the BERT Model: a History of Pre-training Techniques in Natural Language Processing

We have reposted the article and Dr. Zhang Junlin's slides on the AINLP WeChat official account; you are welcome to follow it:

2) Zhihu: How should the BERT model be evaluated?

3) [NLP] Google BERT explained in detail

4) [NLP] An in-depth analysis of the Google BERT model

5) BERT Explained: State of the art language model for NLP

6)  BERT Introduction

7) Paper reading: the BERT model and fine-tuning

8) A detailed interpretation of BERT, NLP's breakthrough model

9) Hands-on | The ultimate BERT fine-tuning tutorial: a practical BERT tutorial from Singularity AI; a model trained this way scores 79+ on the AI Challenger 2018 reading comprehension task.

10) [BERT explained] "Dissecting BERT" by Miguel Romero Calvo
Dissecting BERT Part 1: The Encoder
Understanding BERT Part 2: BERT Specifics
Dissecting BERT Appendix: The Decoder

11) BERT-BiLSTM-CRF for NER: using BERT + BiLSTM + CRF for named entity recognition

12) AI empowering the law | Putting Google's BERT, NLP's strongest model, to work in the field of intelligent justice

3. Third-party code:

1) pytorch-pretrained-BERT:  https://github.com/huggingface/pytorch-pretrained-BERT
The PyTorch BERT implementation officially recommended by Google, able to load Google's pre-trained models: PyTorch version of Google AI's BERT model with script to load Google's pre-trained models

2) BERT-pytorch:  https://github.com/codertimo/BERT-pytorch
Another PyTorch implementation: Google AI 2018 BERT pytorch implementation

3) BERT-tensorflow: https://github.com/guotong1988/BERT-tensorflow
TensorFlow version: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

4) bert-chainer: https://github.com/soskek/bert-chainer
Chainer version: Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"

5) bert-as-service: https://github.com/hanxiao/bert-as-service
Encodes sentences of varying lengths with a pre-trained BERT model, mapping each to a fixed-length vector: Mapping a variable-length sentence to a fixed-length vector using pretrained BERT model
This is very interesting. Taking it a step further, could it serve as the basis for a sentence-similarity service? Has anyone tried?
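The similarity service suggested above reduces to comparing the fixed-length vectors that bert-as-service returns, most commonly with cosine similarity. A minimal sketch in plain Python (the vectors below are made up for illustration; in practice they would come from the service's encode call):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Stand-ins for sentence vectors produced by a pre-trained BERT encoder
vec_a = [0.2, 0.1, 0.4]
vec_b = [0.2, 0.1, 0.4]

print(cosine_similarity(vec_a, vec_b))  # identical vectors -> 1.0
```

A service would encode both sentences, compute this score, and rank or threshold it; scores near 1.0 indicate near-identical sentence embeddings.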

6) bert_language_understanding: https://github.com/brightmart/bert_language_understanding
BERT in practice: Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN

7) sentiment_analysis_fine_grain:  https://github.com/brightmart/sentiment_analysis_fine_grain
BERT in practice: multi-label text classification, an attempt at the AI Challenger 2018 fine-grained sentiment analysis task: Multi-label Classification with BERT; Fine Grained Sentiment Analysis from AI challenger
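The distinguishing step in multi-label classification (as opposed to the usual single-label softmax) is applying an independent sigmoid to each label's logit and keeping every label that clears a threshold, so several fine-grained sentiment aspects can fire at once. A minimal pure-Python sketch, not taken from the repo above; the label names and logits are made up:

```python
import math

def multilabel_predict(logits, labels, threshold=0.5):
    """Sigmoid each logit independently; keep every label whose
    probability clears the threshold (several may fire at once)."""
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [lab for lab, p in zip(labels, probs) if p >= threshold]

# Hypothetical aspect labels for a fine-grained sentiment task
labels = ["service_positive", "price_negative", "taste_positive"]
print(multilabel_predict([2.3, -1.1, 0.4], labels))  # -> ['service_positive', 'taste_positive']
```

With softmax the three scores would compete for one prediction; the per-label sigmoid is what lets a single review be tagged with multiple aspects.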

8) BERT-NER:  https://github.com/kyzhouhzau/BERT-NER
BERT in practice, named entity recognition: Use Google BERT to do CoNLL-2003 NER!
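CoNLL-2003 NER systems like this one typically emit BIO tags (B- begins an entity, I- continues it, O is outside), and a small post-processing step turns the tagged tokens into entity spans. A self-contained sketch of that step, with made-up tokens and tags for illustration:

```python
def extract_entities(tokens, tags):
    """Collect (entity_text, entity_type) spans from BIO tags
    such as B-PER, I-PER, O."""
    entities, current, ctype = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append((" ".join(current), ctype))
            current, ctype = [token], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == ctype:
            current.append(token)
        else:  # O tag, or an I- tag that doesn't continue the current entity
            if current:
                entities.append((" ".join(current), ctype))
            current, ctype = [], None
    if current:
        entities.append((" ".join(current), ctype))
    return entities

tokens = ["Mark", "Watney", "visited", "Mars"]
tags   = ["B-PER", "I-PER", "O", "B-LOC"]
print(extract_entities(tokens, tags))  # -> [('Mark Watney', 'PER'), ('Mars', 'LOC')]
```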

9) BERT-keras: https://github.com/Separius/BERT-keras
Keras version: Keras implementation of BERT with pre-trained weights

10) tbert: https://github.com/innodatalabs/tbert
PyTorch port of BERT ML model

11) BERT-Classification-Tutorial: https://github.com/Socialbird-AILab/BERT-Classification-Tutorial

12) BERT-BiLSMT-CRF-NER: https://github.com/macanv/BERT-BiLSMT-CRF-NER
Tensorflow solution of NER task Using BiLSTM-CRF model with Google BERT Fine-tuning
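In BiLSTM-CRF taggers like this one, the CRF layer's decode step is Viterbi search: it combines per-token emission scores with tag-to-tag transition scores to pick the best whole tag sequence, rather than choosing each tag independently. A minimal pure-Python sketch of that decode (not code from the repo; the two-tag scores below are invented):

```python
def viterbi(emissions, transitions):
    """Best tag path given per-step emission scores and tag-to-tag
    transition scores (log-space, so scores add), as in CRF decoding."""
    n_tags = len(emissions[0])
    score = list(emissions[0])  # best score of a path ending in each tag
    back = []                   # back-pointers to the best predecessor tag
    for emit in emissions[1:]:
        new_score, pointers = [], []
        for t in range(n_tags):
            best_prev = max(range(n_tags), key=lambda p: score[p] + transitions[p][t])
            new_score.append(score[best_prev] + transitions[best_prev][t] + emit[t])
            pointers.append(best_prev)
        score, back = new_score, back + [pointers]
    # Trace back from the best final tag
    path = [max(range(n_tags), key=lambda t: score[t])]
    for pointers in reversed(back):
        path.append(pointers[path[-1]])
    return list(reversed(path))

# Two tags (0 and 1); the transition matrix penalizes staying in tag 1
emissions = [[2.0, 0.5], [0.5, 2.0], [2.0, 0.5]]
transitions = [[0.0, 0.0], [0.0, -3.0]]
print(viterbi(emissions, transitions))  # -> [0, 1, 0]
```

This is why a CRF on top of the BiLSTM helps NER: transition scores can make illegal tag sequences (e.g. I-PER directly after O) expensive, which per-token argmax cannot.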

13) BERT-Chinese-Classification-Task
A practical guide to Chinese text classification with BERT

14) BERT-chinese-NER: https://github.com/ProHiryu/bert-chinese-ner
Using the pre-trained language model BERT for Chinese NER

15) BERT-BiLSTM-CRF-NER
Tensorflow solution of NER task Using BiLSTM-CRF model with Google BERT Fine-tuning

16) BERT-Sequence-Tagging: https://github.com/zhpmatrix/bert-sequence-tagging
Chinese sequence tagging based on BERT


Source: www.cnblogs.com/jfdwd/p/11232715.html