Multilingual neural machine translation: methods from the papers

A standard neural machine translation (NMT) model translates from one language to another, i.e., it serves a specific language pair. Recently, some work has begun to extend standard NMT to multilingual scenarios. Two approaches are currently common:

1. Use a separate encoder for each source language and a separate decoder for each target language, and share a single attention layer between the encoders and decoders.
2. No matter how many source and target languages there are, use one shared encoder, one shared decoder, and one shared attention mechanism (see the sketch below).
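For the second approach, the single shared model must be told which target language to produce. A common trick (used in essence by the first paper below) is to prepend an artificial target-language token to each source sentence. Here is a minimal sketch; the `<2de>`-style token format and the function name are illustrative, not taken from the paper:

```python
def tag_source(src_tokens, tgt_lang):
    """Prepend an artificial token telling the shared encoder-decoder
    which target language to translate into (token format illustrative)."""
    return ["<2{}>".format(tgt_lang)] + src_tokens

print(tag_source(["Good", "morning"], "de"))
# ['<2de>', 'Good', 'morning']
```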

1. 《Toward Multilingual Neural Machine Translation with Universal Encoder and Decoder》: a single encoder-decoder

With a low-resource parallel corpus (En->De), fusing in a monolingual corpus (De->De) or another low-resource parallel corpus (such as Fr->De) improves model quality.

Fusing in a large amount of monolingual data also improves model quality.
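As a concrete illustration of this fusion, the sketch below builds one tagged training pool out of scarce En->De parallel pairs, monolingual De->De pairs (the target sentence copied as its own source), and auxiliary Fr->De pairs, so the shared model sees them all as ordinary examples. The toy sentences and the `<2de>` token are illustrative assumptions:

```python
# One tagged pool mixing three data sources (all sentences are toy data):
training_pool = (
    [("<2de> good morning", "guten morgen")]   # low-resource En->De parallel
  + [("<2de> guten morgen", "guten morgen")]   # monolingual De->De (copy pair)
  + [("<2de> bonjour", "guten morgen")]        # auxiliary Fr->De parallel
)
for src, tgt in training_pool:
    print(src, "=>", tgt)
```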

Zero-resource setting: parallel corpora exist for De->En and En->Fr, but there is no De->Fr parallel corpus at all; the question is how well De->Fr translation works.

Each arrow (->) represents one model. For example, Direct (De->Fr) trains a single model on De->Fr parallel data, while Pivot (De->En->Fr) trains two models, with En as the intermediate language; a sketch of the pivot chain follows below. The authors found that the results here were not good.
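A minimal sketch of the Pivot (De->En->Fr) setup, chaining two independently trained models through English. The `Translator` type and the toy stand-in models are hypothetical; Direct (De->Fr) would simply be a single such `Translator` trained on De->Fr parallel data:

```python
from typing import Callable

Translator = Callable[[str], str]  # a trained NMT model, abstracted as a function

def pivot_translate(src, src_to_pivot, pivot_to_tgt):
    """Pivot decoding: De -> En with the first model, then En -> Fr with the second."""
    return pivot_to_tgt(src_to_pivot(src))

# Toy stand-ins for the two trained models (hypothetical):
de_en = lambda s: "<En translation of: {}>".format(s)
en_fr = lambda s: "<Fr translation of: {}>".format(s)
print(pivot_translate("Guten Morgen", de_en, en_fr))
```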

2. 《Multilingual NMT with a language-independent attention bridge》 (ACL 2019): separate encoders and decoders plus a shared attention bridge
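The title describes the architecture: each language gets its own encoder and decoder, and they communicate through a shared, language-independent attention bridge that maps variable-length encoder states to a fixed number of vectors. Below is a minimal PyTorch sketch of one common formulation of such a bridge (structured self-attention); the layer sizes and names are illustrative assumptions, not the paper's exact configuration:

```python
import torch
import torch.nn as nn

class AttentionBridge(nn.Module):
    """Sketch of a language-independent attention bridge: maps a
    variable-length sequence of encoder states to a fixed number of
    "bridge" vectors via structured self-attention, so that any
    encoder can plug into any decoder through the same interface."""

    def __init__(self, d_model, d_hidden, n_heads):
        super().__init__()
        self.w1 = nn.Linear(d_model, d_hidden, bias=False)
        self.w2 = nn.Linear(d_hidden, n_heads, bias=False)

    def forward(self, enc_states):
        # enc_states: (batch, seq_len, d_model) from any language-specific encoder
        scores = self.w2(torch.tanh(self.w1(enc_states)))  # (batch, seq_len, n_heads)
        attn = torch.softmax(scores, dim=1)                # normalize over seq_len
        bridge = attn.transpose(1, 2) @ enc_states         # (batch, n_heads, d_model)
        return bridge  # fixed-size, language-independent representation

bridge = AttentionBridge(d_model=512, d_hidden=1024, n_heads=10)
states = torch.randn(2, 17, 512)   # batch of 2 sentences, 17 tokens each
print(bridge(states).shape)        # torch.Size([2, 10, 512])
```

Because every encoder produces, and every decoder consumes, the same fixed-size bridge output, a new language can in principle be added by training only its own encoder or decoder against the shared bridge.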

