A First Look at the Principles of Recurrent Neural Networks (RNNs), and word2vec Principles (1): CBOW and Skip-Gram Model Basics

 

Main applications: machine translation, natural language processing, text processing, speech recognition, image description generation (Generating Image Descriptions), and visual question answering (QA).

 

1. How did RNNs come about?
2. The network structure and principle of the RNN (a minimal code sketch follows this outline)

3. RNN improvement 1: the bidirectional RNN
4. RNN improvement 2: the deep bidirectional RNN
4.1 The pyramidal RNN
5. Training an RNN: BPTT (backpropagation through time)
6. Combining RNNs with CNNs: image captioning
7. RNN project practice
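
Before working through these sections, it helps to see the core recurrence from section 2 in code. Below is a minimal NumPy sketch, not a full implementation: the weight names (W_xh, W_hh, W_hy), the toy dimensions, and the random initialization are all illustrative assumptions.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y):
    """One time step: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h), y_t = W_hy h_t + b_y."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)  # new hidden state
    y_t = W_hy @ h_t + b_y                           # unnormalized output scores
    return h_t, y_t

# Toy dimensions (assumptions): 4-dim inputs, 8-dim hidden state, 4-dim outputs.
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(8, 4))
W_hh = rng.normal(scale=0.1, size=(8, 8))
W_hy = rng.normal(scale=0.1, size=(4, 8))
b_h, b_y = np.zeros(8), np.zeros(4)

h = np.zeros(8)                    # initial hidden state
for x in rng.normal(size=(5, 4)):  # a sequence of 5 input vectors
    h, y = rnn_step(x, h, W_xh, W_hh, W_hy, b_h, b_y)
```

At each step the hidden state mixes the current input with the previous hidden state, which is what lets the network carry context across the sequence.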

This series will implement a recurrent neural network based language model. The implementation serves two purposes. The first is to score how likely any given sentence is to occur in real text, which provides a measure of grammatical and semantic correctness; this kind of scoring is a typical component of machine translation systems. The second is to generate new text, which is a great application: for example, a model trained on Shakespeare's works can generate new Shakespeare-like text. Andrej Karpathy's character-level language model based on RNNs has already demonstrated this interesting idea.
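
As a rough illustration of those two uses, here is a hedged sketch of a character-level RNN language model, in the spirit of (but not reproducing) Karpathy's char-rnn: score() sums per-character log-probabilities to rate a sentence, and sample() generates new text one character at a time. The vocabulary, hidden size, and untrained random weights are illustrative assumptions, so the printed outputs are meaningless until the model is actually trained.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = list("abcdefghijklmnopqrstuvwxyz ")   # assumed toy character vocabulary
V, H = len(vocab), 16
ix = {c: i for i, c in enumerate(vocab)}
W_xh, W_hh, W_hy = (rng.normal(scale=0.1, size=s)   # untrained random weights
                    for s in [(H, V), (H, H), (V, H)])

def step(h, c):
    """Advance the hidden state by one character; return softmax over next char."""
    x = np.zeros(V); x[ix[c]] = 1.0
    h = np.tanh(W_xh @ x + W_hh @ h)
    p = np.exp(W_hy @ h); p /= p.sum()
    return h, p

def score(text):
    """Log-probability of a string: sum of log P(next char | prefix)."""
    h, logp = np.zeros(H), 0.0
    for prev, nxt in zip(text, text[1:]):
        h, p = step(h, prev)
        logp += np.log(p[ix[nxt]])
    return logp

def sample(seed, n):
    """Generate n characters by repeatedly sampling from the model."""
    h, out = np.zeros(H), seed
    for _ in range(n):
        h, p = step(h, out[-1])
        out += vocab[rng.choice(V, p=p)]
    return out

print(score("to be or not to be"))
print(sample("t", 20))
```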

word2vec Principles (1): CBOW and Skip-Gram Model Basics

  1. Word2Vec Tutorial—The Skip-Gram Model
  2. Word Embedding Explained and Visualized
  3. Vector Representation of Words
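
The tutorials above contrast the two models by what they predict: CBOW predicts a center word from its surrounding context, while Skip-Gram predicts each context word from the center word. Below is a minimal sketch of how the two frame their training examples; the toy corpus and window size are purely illustrative assumptions.

```python
tokens = "the quick brown fox jumps over the lazy dog".split()
window = 2  # assumed context window size

def skipgram_pairs(tokens, window):
    """Skip-Gram: one (center, context) pair per context word."""
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                yield center, tokens[j]

def cbow_examples(tokens, window):
    """CBOW: predict the center word from all of its context words at once."""
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window), min(len(tokens), i + window + 1))
                   if j != i]
        yield context, center

print(list(skipgram_pairs(tokens, window))[:4])
print(next(iter(cbow_examples(tokens, window))))
```

Each (input, target) example produced here would then feed a shallow network whose learned input weights become the word vectors.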
