RNN language model and sequence generation

  1. The probability a language model assigns to a particular sentence tells you how likely that sentence is.
  2. Building a good RNN language model requires training on a large text corpus.
  3. Each word is converted to a one-hot vector; the vocabulary includes punctuation marks and an end-of-sentence token (<EOS>), and words not in the vocabulary are mapped to an unknown token (<UNK>).
  4. At the first time step, the input is a zero vector, and a softmax over the output gives a probability for every word in the dictionary. At each later step, the input is the one-hot vector of the previous word and the output is the probability distribution over the next word. The total loss is the sum of the per-step cross-entropy losses, which is minimized by backpropagation.
  5. The probability of the whole sentence is the product of the per-step output probabilities.
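The steps above can be sketched as a minimal numpy forward pass. This is an illustrative toy (the vocabulary, sizes, and parameter names `Wax`, `Waa`, `Wya` are assumptions, and training is omitted): step 1 feeds a zero vector, each later step feeds the previous word's one-hot vector, the per-step cross-entropy terms are summed, and the sentence probability is the product of the per-step probabilities (i.e. the exponential of the summed log probabilities).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary; a real model uses a large corpus-derived dictionary.
vocab = ["<EOS>", "cats", "average", "fifteen", "hours", "of", "sleep", "a", "day"]
V = len(vocab)        # vocabulary size (includes the end-of-sentence token)
H = 16                # hidden state size (illustrative)
idx = {w: i for i, w in enumerate(vocab)}

# Randomly initialised parameters; training by backpropagation is omitted.
Wax = rng.normal(0, 0.1, (H, V))   # input-to-hidden weights
Waa = rng.normal(0, 0.1, (H, H))   # hidden-to-hidden weights
Wya = rng.normal(0, 0.1, (V, H))   # hidden-to-output weights
ba = np.zeros(H)
by = np.zeros(V)

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sentence_log_prob(words):
    """log P(sentence) = sum of per-step log probabilities.
    The negated value is the summed cross-entropy loss from step 4."""
    a = np.zeros(H)
    x = np.zeros(V)                     # zero vector at the first time step
    log_prob = 0.0
    for w in words + ["<EOS>"]:
        a = np.tanh(Wax @ x + Waa @ a + ba)
        y_hat = softmax(Wya @ a + by)   # P(next word | words so far)
        log_prob += np.log(y_hat[idx[w]])
        x = one_hot(idx[w])             # this word is the next step's input
    return log_prob

lp = sentence_log_prob("cats average fifteen hours of sleep a day".split())
print(f"log P(sentence) = {lp:.3f}, P(sentence) = {np.exp(lp):.3e}")
```

Summing log probabilities and exponentiating at the end is equivalent to multiplying the per-step probabilities (step 5), but avoids numerical underflow for long sentences.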

Origin www.cnblogs.com/biwangwang/p/11431843.html