Keywords
Polysemy, Computer science, Natural language processing, Artificial intelligence, Word (group theory), Syntax, Textual entailment, Semantics (computer science), Representation (politics), Deep learning, Linguistics, Logical consequence, Programming language, Politics, Philosophy, Law, Political science
Authors
Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer
Source
Venue: arXiv (Cornell University)
Date: 2018-02-15
Citations: 1139
Abstract
We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus. We show that these representations can be easily added to existing models and significantly improve the state of the art across six challenging NLP problems, including question answering, textual entailment and sentiment analysis. We also present an analysis showing that exposing the deep internals of the pre-trained network is crucial, allowing downstream models to mix different types of semi-supervision signals.
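The abstract's central mechanism (word vectors as "learned functions of the internal states" of the biLM) is, in the paper, a softmax-weighted sum of the biLM's layer activations scaled by a task-specific scalar. Below is a minimal NumPy sketch of that combination; the function name elmo_combine and the randomly generated toy activations are illustrative stand-ins, not part of the paper's released code.

```python
import numpy as np

def elmo_combine(layer_states, s_logits, gamma):
    """Collapse biLM layer activations into one vector per token.

    layer_states: shape (L+1, seq_len, dim) -- the token embedding layer
                  plus L biLSTM layers, per the paper's formulation.
    s_logits:     unnormalized task-specific layer weights, shape (L+1,).
    gamma:        task-specific scalar scaling the whole vector.
    """
    # Softmax-normalize the layer weights so they sum to 1.
    s = np.exp(s_logits - s_logits.max())
    s /= s.sum()
    # Weighted sum over the layer axis, then scale by gamma.
    return gamma * np.tensordot(s, layer_states, axes=1)

# Toy stand-in for real biLM activations: 3 layers, 5 tokens, 8 dims.
rng = np.random.default_rng(0)
states = rng.standard_normal((3, 5, 8))
elmo = elmo_combine(states, s_logits=np.zeros(3), gamma=1.0)
print(elmo.shape)  # (5, 8)
```

Because the weights and gamma are learned per downstream task, this is what lets a task model "mix different types of semi-supervision signals" across layers, as the abstract notes.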