CIKM 2016 aNMM: Ranking Short Answer Texts with Attention-Based Neural Matching Model

Copyright notice: this is the blogger's original article; reproduction without permission is prohibited. https://blog.csdn.net/yangliuy/article/details/52970327

Summary (translated from Chinese): Current deep learning models for the answer sentence selection task, whether CNN- or LSTM-based, must be combined with traditional text-matching features to achieve good results. To address this problem, this paper proposes an attention-based neural matching model (aNMM). The model uses a value-shared weighting scheme for combining matching signals and an attention scheme for learning question term importance. Experiments on the standard TREC QA benchmark show that, without combining any additional text-matching features, the model achieves ranking performance comparable to or better than previous deep learning models and feature-engineering approaches. When combined with a single simple query-likelihood (QL) feature, the model outperforms the current state-of-the-art methods for answer sentence selection.

Published in: CIKM'16

Abstract (from the paper): As an alternative to question answering methods based on feature engineering, deep learning approaches such as convolutional neural networks (CNNs) and Long Short-Term Memory Models (LSTMs) have recently been proposed for semantic matching of questions and answers. To achieve good results, however, these models have been combined with additional features such as word overlap or BM25 scores. Without this combination, these models perform significantly worse than methods based on linguistic feature engineering. In this paper, we propose an attention based neural matching model for ranking short answer text. We adopt value-shared weighting scheme instead of position-shared weighting scheme for combining different matching signals and incorporate question term importance learning using question attention network. Using the popular benchmark TREC QA data, we show that the relatively simple aNMM model can significantly outperform other neural network models that have been used for the question answering task, and is competitive with models that are combined with additional features. When aNMM is combined with additional features, it outperforms all baselines.
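The two core ideas of the model, value-shared (rather than position-shared) weighting of matching signals and a softmax attention over question terms, can be sketched roughly as follows. This is a minimal illustrative sketch in NumPy with random toy inputs, not the authors' implementation; the bin count, the similarity matrix, and all weights here are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

Q, A, K = 4, 10, 5                  # question terms, answer terms, value bins
sim = rng.uniform(-1, 1, (Q, A))    # toy word-word similarity matrix

# --- value-shared weighting ---
# Matching signals are bucketed by their VALUE (e.g. cosine similarity in
# [-1, 1]) and one weight is learned per bucket, instead of one weight per
# POSITION as in a CNN.
bin_weights = rng.normal(size=K)    # one (toy) learnable weight per bin
edges = np.linspace(-1, 1, K + 1)

def value_shared_score(row):
    # sum the similarities falling into each value bin, then weight the sums
    bins = np.clip(np.digitize(row, edges) - 1, 0, K - 1)
    binned = np.zeros(K)
    np.add.at(binned, bins, row)
    return binned @ bin_weights

term_scores = np.array([value_shared_score(sim[q]) for q in range(Q)])

# --- question attention ---
# Per-term scores are combined with a softmax gate over the question terms,
# so important question terms contribute more to the final ranking score.
term_logits = rng.normal(size=Q)    # toy stand-in for the attention network
attn = np.exp(term_logits) / np.exp(term_logits).sum()

final_score = float(attn @ term_scores)  # scalar score for this QA pair
print(final_score)
```

In the actual model the bin weights and the attention network are trained end to end; the sketch only shows how the two schemes combine word-level matching signals into a single answer score.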

Download link: http://maroo.cs.umass.edu/pub/web/getpdf.php?id=1240

Open-source code (GitHub): https://github.com/yangliuy/aNMM-CIKM16
