Main applications: machine translation, natural language processing, text processing, speech recognition, image caption generation, and image question answering (QA).
1. How did RNN come about?
2. The network structure and principle of RNN
3. Improvement of RNN 1: Bidirectional RNN
4. Improvement of RNN 2: Deep Bidirectional RNN
4.1 Pyramidal RNN
5. Training of RNN - BPTT
6. Combined application of RNN and CNN: image captioning
7. RNN project practice
This series will implement a language model based on a recurrent neural network. The model serves two purposes. First, it assigns any sentence a score reflecting how likely that sentence is to occur in real text, which provides a measure of grammatical and semantic correctness; such scoring is a typical component of machine translation systems. Second, the model can generate new text, which is a compelling application in its own right: for example, a model trained on Shakespeare's works can generate new Shakespeare-like text. Andrej Karpathy's character-level language model based on RNNs is a well-known implementation of this idea.
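The two uses described above, scoring a sentence and sampling new text, can be sketched with a tiny character-level RNN. This is a minimal illustration with randomly initialized (untrained) weights and an assumed toy vocabulary, not the series' actual implementation:

```python
import numpy as np

np.random.seed(0)

# Toy character vocabulary; the series targets a word-level model,
# but a character-level sketch keeps the example small.
chars = list("helo ")
vocab = {c: i for i, c in enumerate(chars)}
V, H = len(chars), 8  # vocabulary size, hidden size

# Randomly initialized parameters (an untrained model for illustration).
Wxh = np.random.randn(H, V) * 0.01   # input -> hidden
Whh = np.random.randn(H, H) * 0.01   # hidden -> hidden
Why = np.random.randn(V, H) * 0.01   # hidden -> output
bh, by = np.zeros(H), np.zeros(V)

def step(h, x_idx):
    """One RNN step: update hidden state, return next-char distribution."""
    x = np.zeros(V); x[x_idx] = 1.0          # one-hot input
    h = np.tanh(Wxh @ x + Whh @ h + bh)      # recurrent update
    logits = Why @ h + by
    p = np.exp(logits - logits.max()); p /= p.sum()  # softmax
    return h, p

def sentence_log_prob(text):
    """Use 1: score a sentence as the sum of next-character log-probs."""
    h, logp = np.zeros(H), 0.0
    for cur, nxt in zip(text, text[1:]):
        h, p = step(h, vocab[cur])
        logp += np.log(p[vocab[nxt]])
    return logp

def sample(seed_char, n):
    """Use 2: generate new text by sampling from the model's distribution."""
    h, idx, out = np.zeros(H), vocab[seed_char], [seed_char]
    for _ in range(n):
        h, p = step(h, idx)
        idx = np.random.choice(V, p=p)
        out.append(chars[idx])
    return "".join(out)

print(sentence_log_prob("hello"))  # less negative = more plausible to this model
print(sample("h", 10))
```

After training (section 5 covers BPTT), `sentence_log_prob` would rank real sentences above gibberish, and `sample` would produce text resembling the training corpus.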
Further reading on word2vec principles (1): fundamentals of the CBOW and Skip-Gram models
- Word2Vec Tutorial—The Skip-Gram Model
- Word Embedding Explained and Visualized
- Vector Representation of Words