Security token
Computer science
Representation (politics)
Artificial intelligence
Natural language processing
Embedding
Pattern recognition (psychology)
Language model
Task (project management)
Question answering
Multi-label classification
Sequence (biology)
Politics
Biology
Genetics
Economics
Management
Law
Computer security
Political science
Authors
Naiyin Liu, Qianlong Wang, Jiangtao Ren
Identifier
DOI:10.1007/s11063-020-10411-8
Abstract
Multi-label text classification is a critical task in the field of natural language processing. As the latest language representation model, BERT obtains new state-of-the-art results on classification tasks. Nevertheless, BERT's text classification framework neglects to make full use of the token-level text representation and label embedding, since it only utilizes the final hidden state corresponding to the CLS token as the sequence-level text representation for classification. We assume that the finer-grained token-level text representation and label embedding contribute to classification. Consequently, in this paper, we propose a Label-Embedding Bi-directional Attentive model to improve the performance of BERT's text classification framework. In particular, we extend BERT's text classification framework with label embedding and bi-directional attention. Experimental results on five datasets indicate that our model yields notable improvements over both baselines and state-of-the-art models.
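The abstract's core idea (attending between BERT's token-level hidden states and a learned label-embedding matrix, instead of classifying from the CLS vector alone) can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' implementation: the dimensions, the einsum-based bi-directional attention, the pooling, and the class name `LabelEmbedBiAttention` are all assumptions for illustration; the BERT encoder itself is stubbed out with random token states.

```python
import torch
import torch.nn as nn

class LabelEmbedBiAttention(nn.Module):
    """Hypothetical head combining token-level states with label embeddings
    via bi-directional (token<->label) attention, in the spirit of the paper."""

    def __init__(self, hidden: int = 768, num_labels: int = 5):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, hidden)  # one vector per label
        self.classifier = nn.Linear(2 * hidden, 1)         # per-label binary logit

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden), e.g. BERT's final-layer outputs
        L = self.label_emb.weight                                 # (num_labels, hidden)
        scores = torch.einsum('bsh,lh->bsl', token_states, L)     # token-label affinity
        # text-to-label attention: for each label, a weighted sum over tokens
        attn_tokens = torch.softmax(scores, dim=1)                # softmax over tokens
        label_aware = torch.einsum('bsl,bsh->blh', attn_tokens, token_states)
        # label-to-text attention: for each token, a weighted sum over labels, pooled
        attn_labels = torch.softmax(scores, dim=2)                # softmax over labels
        token_aware = torch.einsum('bsl,lh->bsh', attn_labels, L).mean(dim=1, keepdim=True)
        feats = torch.cat([label_aware, token_aware.expand_as(label_aware)], dim=-1)
        return self.classifier(feats).squeeze(-1)                 # (batch, num_labels)

# Usage with stubbed encoder output (in practice, feed BERT token states):
model = LabelEmbedBiAttention(hidden=768, num_labels=5)
logits = model(torch.randn(2, 16, 768))   # shape (2, 5); sigmoid per label for multi-label
```

Multi-label training would then apply a per-label sigmoid with binary cross-entropy (`nn.BCEWithLogitsLoss`) over these logits, since labels are not mutually exclusive.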