Keywords: Machine translation, Computer science, Artificial intelligence, Natural language processing, Sentence, Task, BLEU, German, Dropout (neural networks), Artificial neural network, Translation, Word, Attention mechanism, Machine learning, Deep learning, Recurrent neural network, Linguistics
Authors
Minh-Thang Luong, Hieu Pham, Christopher D. Manning
Source
Venue: Empirical Methods in Natural Language Processing (EMNLP)
Date: 2015-01-01
Citations: 6226
Abstract
An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words and a local one that only looks at a subset of source words at a time. We demonstrate the effectiveness of both approaches on the WMT translation tasks between English and German in both directions. With local attention, we achieve a significant gain of 5.0 BLEU points over non-attentional systems that already incorporate known techniques such as dropout. Our ensemble model using different attention architectures yields a new state-of-the-art result in the WMT’15 English-to-German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.
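To make the two attention classes concrete, below is a minimal NumPy sketch of how global attention (attend to all source hidden states) and local attention (attend to a window around an aligned position) each compute a context vector, using the dot-product score, one of the scoring functions the paper considers. The function names, tensor shapes, and the fixed window center p_t are illustrative assumptions; the paper also describes predicting p_t from the decoder state, which is omitted here.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def global_attention(h_t, h_s):
    """Global attention: score the decoder state h_t (shape [d]) against
    every source hidden state in h_s (shape [S, d]) with a dot-product
    score, then take the alignment-weighted average."""
    scores = h_s @ h_t                  # one score per source position, shape [S]
    align = softmax(scores)             # alignment weights over all S positions
    context = align @ h_s               # context vector, shape [d]
    return context, align

def local_attention(h_t, h_s, p_t, D=2):
    """Local attention: score only the window [p_t - D, p_t + D] around an
    aligned position p_t, then reweight with a Gaussian centered at p_t
    (sigma = D / 2, as in the paper's local-p variant). p_t is taken as
    given here; the paper predicts it from h_t."""
    S, _ = h_s.shape
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = h_s[lo:hi]                 # at most 2D + 1 source states
    align = softmax(window @ h_t)
    positions = np.arange(lo, hi)
    sigma = D / 2.0
    align = align * np.exp(-((positions - p_t) ** 2) / (2.0 * sigma ** 2))
    # Following the paper, the Gaussian-reweighted alignments are not renormalized.
    context = align @ window
    return context, align

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h_s = rng.standard_normal((7, 4))   # 7 source states, hidden size 4
    h_t = rng.standard_normal(4)        # current decoder state
    c_g, a_g = global_attention(h_t, h_s)
    c_l, a_l = local_attention(h_t, h_s, p_t=3)
    print("global weights:", np.round(a_g, 3))
    print("local weights :", np.round(a_l, 3))
```

Note the trade-off the sketch makes visible: with D = 2 the local variant scores at most five positions per target word regardless of sentence length, which is the source of its computational advantage over global attention on long sentences.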