Paper Notes: Attention Is All You Need

2018-04-17 10:35:25 

Introduction: 

  Most existing approaches to machine translation are built on the encoder-decoder framework, and the top-performing variants of that framework combine an RNN with an attention mechanism. The ingenuity of this paper is that it achieves strong performance by relying on the attention mechanism alone, and, unlike recurrent models, this method also lends itself well to parallelization.
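  For reference, the core operation that replaces recurrence in the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Below is a minimal NumPy sketch of that formula; the function name, shapes, and example inputs are illustrative and not taken from the paper's code.

```python
# Minimal sketch of scaled dot-product attention (single head, no masking).
# Names and shapes are illustrative assumptions, not the paper's implementation.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted average of the value vectors
    return weights @ V

# Example usage: 4 positions, 8-dimensional queries/keys/values
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)  # shape (4, 8)
```

  Because every position attends to every other position in a single matrix multiplication, the whole sequence can be processed at once, which is what makes the approach easy to parallelize compared with step-by-step RNN decoding.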

  
