Classification algorithm (1) - overview
Classification algorithm (2) - FastText (principle introduction)
Classification algorithm (2) - FastText
Classification algorithm (3) - LR, NB, SVM, KNN call examples
Classification algorithm (4) - SVM
Classification algorithm (5) - xgboost installation
Classification algorithm (7) - short text classification
My posts so far only cover the machine learning stage; deep learning algorithms have not been introduced yet (they will be added later). For deep learning approaches to text classification, I found a very good series on Cnblogs: text classification in practice, https://www.cnblogs.com/jiangxinyang/p/10241243.html
That series uses word2vec pre-trained word vectors, and all of its code is in the author's textClassifier repository.
One more note: before BERT came out, I used the text classification method from RamNet, a paper by Tong Zhang's group at Tencent. Its core idea is applying attention multiple times over a memory. Its performance is comparable to BiLSTM + attention, with improvements on some tasks, though it looks somewhat inferior next to BERT. However, pre-trained models such as BERT are still relatively heavy, so we sometimes use BERT as a teacher to distill knowledge into RamNet. I will write a dedicated blog post with a detailed introduction. If you are interested, see RamNet (Recurrent Attention Network on Memory for Aspect Sentiment Analysis).
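To make the distillation idea above concrete, here is a minimal NumPy sketch of the standard knowledge-distillation loss (soft targets from a teacher blended with the hard-label cross-entropy). This is a generic illustration, not the exact recipe used for BERT-to-RamNet; the temperature `T` and mixing weight `alpha`, and the function names, are my own illustrative choices.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T gives softer probabilities."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label,
                      T=2.0, alpha=0.5):
    """Blend two terms:
    - soft: cross-entropy between teacher's and student's softened outputs
    - hard: ordinary cross-entropy against the ground-truth label
    The T**2 factor keeps the soft-term gradient scale comparable
    across temperatures (as in Hinton et al.'s distillation setup)."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    soft = -np.sum(p_teacher * np.log(p_student + 1e-12))
    hard = -np.log(softmax(student_logits)[true_label] + 1e-12)
    return alpha * (T ** 2) * soft + (1 - alpha) * hard

# A student that agrees with the teacher and the label incurs lower loss
# than one that contradicts them:
good = distillation_loss([5.0, 0.0, 0.0], [5.0, 0.0, 0.0], true_label=0)
bad = distillation_loss([0.0, 5.0, 0.0], [5.0, 0.0, 0.0], true_label=0)
```

In practice the student (here, a light model like RamNet) is trained on this combined loss, so it learns both the ground-truth labels and the richer class-probability structure the heavy teacher (BERT) has discovered.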