An Introduction to Neural Machine Translation Models

The following are the references that this series will cite:

References:

[1] Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation. Yonghui Wu, Mike Schuster, Zhifeng Chen, Quoc V. Le, Mohammad Norouzi, Wolfgang Macherey, Maxim Krikun, Yuan Cao, Qin Gao, Klaus Macherey, Jeff Klingner, Apurva Shah, Melvin Johnson, Xiaobing Liu, Łukasz Kaiser, Stephan Gouws, Yoshikiyo Kato, Taku Kudo, Hideto Kazawa, Keith Stevens, George Kurian, Nishant Patil, Wei Wang, Cliff Young, Jason Smith, Jason Riesa, Alex Rudnick, Oriol Vinyals, Greg Corrado, Macduff Hughes, Jeffrey Dean. Technical Report, 2016.

[2] P. Brown, S. Della Pietra, V. Della Pietra, and R. Mercer (1993). The mathematics of statistical machine translation: parameter estimation. Computational Linguistics, 19(2), 263-311.

[3] K. Papineni, S. Roukos, T. Ward, and W. Zhu (2002). BLEU: a Method for Automatic Evaluation of Machine Translation. Proc. of the 40th Annual Conf. of the Association for Computational Linguistics (ACL 02), pp. 311-318, Philadelphia, PA.

[4] https://research.googleblog.com/2016/09/a-neural-network-for-machine.html

[5] Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, and Yann N. Dauphin. Convolutional sequence to sequence learning. arXiv preprint arXiv:1705.03122v2, 2017.

[6] https://github.com/facebookresearch/fairseq-py

[7] Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin. Attention Is All You Need. arXiv preprint arXiv:1706.03762, 2017.

[8] Zhouhan Lin, Minwei Feng, Cicero Nogueira dos Santos, Mo Yu, Bing Xiang, Bowen Zhou, and Yoshua Bengio. A structured self-attentive sentence embedding. arXiv preprint arXiv:1703.03130, 2017.

[9] Minh-Thang Luong, Hieu Pham, and Christopher D. Manning. Effective approaches to attention-based neural machine translation. arXiv preprint arXiv:1508.04025, 2015.

[10] https://github.com/tensorflow/tensor2tensor



Reposted from blog.csdn.net/mudongcd0419/article/details/78352950