The Transformer as an NLP Feature Extractor, and Its History

 

RNN and CNN are getting old; it is time to embrace the Transformer.

For the field of natural language processing, 2018 was undoubtedly a rewarding year, and BERT was its standout model.

Two trends follow from BERT's success. The first is that its two-stage approach (pre-training + fine-tuning) will become the dominant paradigm for NLP research and for industrial applications.

The second is that, from the standpoint of feature extractors in the NLP field, the Transformer will gradually replace the RNN as the most mainstream feature extractor.

Since the RNN was introduced into NLP, it has been widely used across a variety of NLP tasks. But the original RNN also has problems.

It adopts a linear sequence structure that keeps gathering input information from front to back, and this structure makes optimization difficult during back-propagation: because the back-propagation path is too long, it easily leads to severe vanishing-gradient or exploding-gradient problems.
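
To make this concrete, here is a minimal sketch of where the problem comes from in a vanilla RNN with hidden state h_t; the symbols W, U, f are standard notation assumed for this sketch, not taken from the original post:

h_t = f(W h_{t-1} + U x_t)

\frac{\partial L}{\partial h_k} = \frac{\partial L}{\partial h_T} \prod_{t=k+1}^{T} \frac{\partial h_t}{\partial h_{t-1}}, \qquad \frac{\partial h_t}{\partial h_{t-1}} = \mathrm{diag}\big(f'(W h_{t-1} + U x_t)\big)\, W

The gradient at an early step k is a product of T - k Jacobians; when the sequence is long, that product tends either to shrink toward zero (vanishing gradients) or to grow without bound (exploding gradients), depending on the norm of W and how strongly f saturates.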

To solve this problem, the LSTM and GRU models were later introduced. By adding gated paths that let intermediate state information propagate directly to later steps, they ease the vanishing-gradient problem, and they were improved through continuous optimization.
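
As an illustration of the gating idea just described, here is a minimal sketch of the GRU update; \sigma is the logistic sigmoid, \odot the elementwise product, and the weight matrices are standard notation assumed for this sketch:

z_t = \sigma(W_z x_t + U_z h_{t-1})
r_t = \sigma(W_r x_t + U_r h_{t-1})
\tilde{h}_t = \tanh\big(W_h x_t + U_h (r_t \odot h_{t-1})\big)
h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t

When the update gate z_t is close to zero, h_{t-1} is passed on almost unchanged, which is exactly the kind of direct propagation of state information that eases the vanishing-gradient problem.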

Later, the NLP field also borrowed the attention mechanism from the image field, and the Encoder-Decoder framework was introduced.

These technical advances greatly expanded the capabilities and application effectiveness of the RNN.
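
For reference, the attention idea mentioned above is most often written today in the scaled dot-product form that the Transformer later adopted; this is the standard textbook formulation, not a formula from the original post:

\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V

In an RNN Encoder-Decoder, the queries Q come from decoder states and the keys K and values V from encoder states, letting each output step focus on the most relevant input positions.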

The RNN was the darling of the NLP community for many years (roughly 2014-2018); before 2018, most of the state-of-the-art results in the various sub-fields were obtained with RNNs.

Another problem that seriously impedes the RNN's continued popularity is this:

The RNN's inherently sequence-dependent structure is quite unfriendly to large-scale parallel computation.

Put plainly, it is difficult for an RNN to achieve efficient parallel computation.

So the question arises: why is the RNN's parallel computing capability relatively poor? What causes this?
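
As a minimal NumPy sketch of what "unfriendly to parallel computation" means (the shapes and function names below are illustrative assumptions, not the original author's code): the RNN's step t needs the hidden state from step t-1, so its time loop is inherently serial, while a self-attention layer produces every position with a few batched matrix multiplications.

import numpy as np

def rnn_forward(x, W, U):
    # Vanilla RNN: h_t depends on h_{t-1}, so the loop below must
    # run step by step and cannot be parallelized over time.
    T = x.shape[0]
    h = np.zeros(W.shape[0])
    outputs = []
    for t in range(T):
        h = np.tanh(W @ h + U @ x[t])
        outputs.append(h)
    return np.stack(outputs)

def self_attention_forward(x, Wq, Wk, Wv):
    # Scaled dot-product self-attention: all positions are computed
    # at once from a few matrix products, so the whole sequence can
    # be processed in parallel on GPU/TPU hardware.
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

The sequential dependence in the first function is the parallelism bottleneck the question above points at, and it is where Transformer-style extractors have a structural advantage.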

 

 

 


 

