Machine translation
Computer science
Natural language processing
Artificial intelligence
Sentence
Language model
Translation (biology)
Encoder
Example-based machine translation
German
Security token
Linguistics
Operating system
Messenger RNA
Philosophy
Gene
Biochemistry
Chemistry
Computer security
Authors
Melvin Johnson, Mike Schuster, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat, Fernanda Viégas, Martin Wattenberg, Greg S. Corrado, Macduff Hughes, Jeffrey Dean
Source
Journal: Cornell University - arXiv
Date: 2016-11-14
Citations: 76
Abstract
We propose a simple solution to use a single Neural Machine Translation (NMT) model to translate between multiple languages. Our solution requires no change in the model architecture from our base system but instead introduces an artificial token at the beginning of the input sentence to specify the required target language. The rest of the model, which includes encoder, decoder and attention, remains unchanged and is shared across all languages. Using a shared wordpiece vocabulary, our approach enables Multilingual NMT using a single model without any increase in parameters, which is significantly simpler than previous proposals for Multilingual NMT. Our method often improves the translation quality of all involved language pairs, even while keeping the total number of model parameters constant. On the WMT'14 benchmarks, a single multilingual model achieves comparable performance for English→French and surpasses state-of-the-art results for English→German. Similarly, a single multilingual model surpasses state-of-the-art results for French→English and German→English on the WMT'14 and WMT'15 benchmarks respectively. On production corpora, multilingual models of up to twelve language pairs allow for better translation of many individual pairs. In addition to improving the translation quality of language pairs that the model was trained with, our models can also learn to perform implicit bridging between language pairs never seen explicitly during training, showing that transfer learning and zero-shot translation are possible for neural translation. Finally, we show analyses that hint at a universal interlingua representation in our models and show some interesting examples when mixing languages.
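The central mechanism in the abstract, routing a single shared model to a given target language by prepending an artificial token to the source sentence, can be illustrated with a minimal sketch. The snippet below shows only that input-side convention; the token spelling "<2xx>" and the helper name add_target_language_token are illustrative assumptions, and the shared wordpiece vocabulary, encoder, decoder, and attention are out of scope here.

```python
# Minimal sketch of the multilingual input convention described in the abstract:
# the model architecture is unchanged, and the desired target language is
# signalled only by an artificial token prepended to the input sentence.
# Token format "<2xx>" is an assumption for illustration.

def add_target_language_token(source_sentence: str, target_lang: str) -> str:
    """Prepend the artificial target-language token to a source sentence."""
    return f"<2{target_lang}> {source_sentence}"


if __name__ == "__main__":
    # The same source sentence is routed to different target languages purely
    # through the prepended token; no per-pair model or parameter change is needed.
    print(add_target_language_token("How are you?", "es"))  # "<2es> How are you?"
    print(add_target_language_token("How are you?", "fr"))  # "<2fr> How are you?"

    # Zero-shot case sketched in the abstract: tagging a source sentence with a
    # target language whose pairing was never seen explicitly during training
    # asks the shared model to bridge the pair implicitly.
    print(add_target_language_token("Olá, como estás?", "fr"))
```

Because the tag is just another token in the shared wordpiece vocabulary, the training pipeline and model size stay the same regardless of how many language pairs are mixed into the training data.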