[C5] Sequence Models

Week 1: Recurrent Sequence Models (Recurrent Neural Networks)

Why Sequence Models?

Notation

Recurrent Neural Network Model

Backpropagation Through Time

Different Types of RNNs

Language Model and Sequence Generation

Sampling Novel Sequences

Vanishing Gradients with RNNs

Gated Recurrent Unit (GRU)

Long Short-Term Memory (LSTM) Unit

Bidirectional RNN

Deep RNNs

Week 2: Natural Language Processing and Word Embeddings

Word Representation

Using Word Embeddings

Properties of Word Embeddings

Embedding Matrix

Learning Word Embeddings

Word2Vec

Negative Sampling

GloVe Word Vectors

Sentiment Classification

Debiasing Word Embeddings

Week 3: Sequence Models & Attention Mechanism

Basic Models

Picking the Most Likely Sentence

Bleu Score (Optional)

Attention Model Intuition

Attention Model

Speech Recognition

Trigger Word Detection

Conclusion and Thank You

Source: www.cnblogs.com/keyshaw/p/11027707.html