Blog address: http://blog.csdn.net/wangxinginnlp/article/details/52944432
Tool Name: T2T: Tensor2Tensor Transformers
Address: https://github.com/tensorflow/tensor2tensor
Language: Python / Tensorflow
Rating: ★★★★★ (five stars)
Summary: https://research.googleblog.com/2017/06/accelerating-deep-learning-research.html
Tool Name: dl4mt
Address: https://github.com/nyu-dl/dl4mt-tutorial/tree/master/session2
Language: Python / Theano
Summary:
Attention-based encoder-decoder model for machine translation.
Developed by Dr. Kyunghyun Cho's group at New York University.
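The core of an attention-based encoder-decoder is that, at each decoding step, the decoder scores every encoder hidden state and takes a weighted average as its context vector. A minimal pure-Python sketch of that step (dot-product scoring is used here for brevity, whereas dl4mt's tutorial code uses an MLP-based "Bahdanau" scorer; all names below are illustrative, not dl4mt's API):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention_context(decoder_state, encoder_states):
    # Score each encoder state against the current decoder state.
    scores = [dot(decoder_state, h) for h in encoder_states]
    weights = softmax(scores)
    # Context vector: attention-weighted sum of encoder states.
    dim = len(encoder_states[0])
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

# Toy example: 3 source positions, hidden size 2.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
weights, context = attention_context(dec, enc)
print(weights, context)
```

In a full model, the context vector is fed (together with the previous target word and decoder state) into the next-word prediction; the attention weights can also be inspected as a soft source-target alignment.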
Tool Name: blocks
Address: https://github.com/mila-udem/blocks
Language: Python / Theano
Summary:
Blocks is a framework that helps you build neural network models on top of Theano.
Developed by the LISA Lab at Université de Montréal (directed by Yoshua Bengio; the lab is now known as MILA, home page: https://mila.umontreal.ca/en/). It is the upgraded successor to the earlier GroundHog (https://github.com/lisa-groundhog/GroundHog).
Tool Name: EUREKA-MangoNMT
Address: https://github.com/jiajunzhangnlp/EUREKA-MangoNMT
Language: C++
Summary: A C++ toolkit for neural machine translation on CPU.
Developed by Dr. Jiajun Zhang of the Speech and Language Technology Research Group, Institute of Automation, Chinese Academy of Sciences (http://www.nlpr.ia.ac.cn/cip/jjzhang.htm).
Tool Name: Nematus
Address: https://github.com/EdinburghNLP/nematus
Language: Python / Theano
Summary: An NMT tool released by the University of Edinburgh.
Tool Name: AmuNMT
Address: https://github.com/emjotde/amunmt
Language: C++
Summary:
A C++ inference engine for Neural Machine Translation (NMT) models trained with Theano-based scripts from Nematus (https://github.com/rsennrich/nematus) or DL4MT (https://github.com/nyu-dl/dl4mt-tutorial).
Developed by Dr. Hieu Hoang (http://statmt.org/~s0565741/) of the Moses Machine Translation CIC.
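AmuNMT is an inference (decoding) engine, and the heart of NMT decoding is beam search: keep the `beam_size` highest-scoring partial translations, extend each by one token per step, and return the best finished hypothesis. A toy, self-contained sketch of that loop (the `toy_model` scorer and all names are illustrative, not AmuNMT's API):

```python
import math

def beam_search(score_next, start, eos, beam_size=3, max_len=5):
    # Each hypothesis is (token list, cumulative log-probability).
    beams = [([start], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for tokens, logp in beams:
            # score_next returns {token: probability} for the next step.
            for tok, p in score_next(tokens).items():
                candidates.append((tokens + [tok], logp + math.log(p)))
        # Keep the beam_size best unfinished hypotheses.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for tokens, logp in candidates:
            if tokens[-1] == eos:
                finished.append((tokens, logp))
            elif len(beams) < beam_size:
                beams.append((tokens, logp))
        if not beams:
            break
    finished.extend(beams)  # fall back to partial hypotheses at max_len
    return max(finished, key=lambda c: c[1])

# Toy next-token distribution standing in for a trained NMT model.
def toy_model(tokens):
    if tokens[-1] == "<s>":
        return {"a": 0.4, "b": 0.6}
    return {"b": 0.3, "</s>": 0.7}

best, score = beam_search(toy_model, "<s>", "</s>")
print(best, score)
```

A real engine like AmuNMT does the same search batched on GPU/CPU with a neural scorer; this sketch only shows the control flow.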
Tool Name: Zoph_RNN
Address: https://github.com/isi-nlp/Zoph_RNN
Language: C++
Summary:
A C++/CUDA toolkit for training sequence and sequence-to-sequence models across multiple GPUs.
Developed by the USC Information Sciences Institute.
Tool Name: sequence-to-sequence models in TensorFlow
Address: https://www.tensorflow.org/versions/r0.11/tutorials/seq2seq/index.html
Language: TensorFlow / Python
Summary: Sequence-to-sequence models tutorial.
Tool Name: nmt_stanford_nlp
Address: http://nlp.stanford.edu/projects/nmt/
Language: Matlab
Summary:
Neural machine translation (NMT) at Stanford NLP group.
Tool Name: OpenNMT
Address: http://opennmt.net/
Languages: Lua / Torch
Summary:
OpenNMT was originally developed by Yoon Kim and harvardnlp.
Tool Name: lamtram
Address: https://github.com/neubig/lamtram
Language: C++ / DyNet
Summary:
lamtram: A toolkit for language and translation modeling using neural networks.
Developed by Dr. Graham Neubig's group at CMU.
Tool Name: Neural Monkey
Address: https://github.com/ufal/neuralmonkey
Language: TensorFlow / Python
Summary: The Neural Monkey package provides a higher-level abstraction for sequential neural network models, most prominently in Natural Language Processing (NLP). It is built on TensorFlow and can be used for fast prototyping of sequential NLP models, e.g. for neural machine translation or sentence classification.
Developed by the Institute of Formal and Applied Linguistics at Charles University.
(Neural Monkey is used in the WMT Neural MT Training Task; see: http://www.statmt.org/wmt17/)
Tool Name: Neural Machine Translation (seq2seq) Tutorial
Address: https://github.com/tensorflow/nmt
Language: Python / TensorFlow
Summary:
Produced by Dr. Thang Luong of Google Brain.
If you are interested in this tool, you can try it out on the WMT16 bilingual corpus, available at http://www.statmt.org/wmt16/translation-task.html.
---------------------
Author: warrioR_wx
Source: CSDN
Original: https://blog.csdn.net/wangxinginnlp/article/details/52944432
Disclaimer: This is an original article by the blogger; if you reproduce it, please include a link to the original post.