Computer science
Sentiment analysis
Focus (optics)
Artificial intelligence
Natural language processing
Context (archaeology)
Word (group theory)
Machine learning
Linguistics
Biology
Optics
Physics
Philosophy
Paleontology
Authors
Xi Wang, Mengmeng Fan, Mingming Kong, Zheng Pei
Identifier
DOI:10.1016/j.knosys.2022.109335
Abstract
In Natural Language Processing (NLP), the attention mechanism is often used to quantify the importance of context words in sentiment prediction. However, it tends to focus on high-frequency words while ignoring low-frequency words that play an active role in some positions. In this paper, we propose a Sentiment Lexical Strength Enhanced Self-supervised Attention Learning (SLS-ESAL) approach. Specifically, we iteratively mine attention supervision information from all input sentences. We then use weights quantified by sentiment lexical strength to enhance attention learning in the final training stage, which enables our model to keep focusing on active context words in different positions and to eliminate the effects of misleading ones. Experiments on three datasets show that our approach improves sentiment analysis performance and verify that attention weights can serve as an explanation for text classification.
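The abstract describes the mechanism only at a high level, so the sketch below is a rough illustration, not the authors' SLS-ESAL implementation: it shows one plausible way a sentiment lexicon's strength scores could weight an attention supervision loss alongside the usual classification loss. The toy `AttnClassifier`, the `SENTIMENT_STRENGTH` lexicon, the mined `supervision` mask, and the KL-divergence penalty are all assumptions standing in for details the abstract does not specify.

```python
# Hedged sketch of lexicon-weighted attention supervision; all names and the
# loss design are illustrative assumptions, not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sentiment lexicon: token id -> lexical strength in [0, 1].
SENTIMENT_STRENGTH = {3: 0.9, 7: 0.6}  # e.g. "great" -> 0.9, "fine" -> 0.6

class AttnClassifier(nn.Module):
    """Toy classifier with additive attention over token embeddings."""
    def __init__(self, vocab_size, emb_dim=64, n_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.score = nn.Linear(emb_dim, 1)       # per-token attention logit
        self.out = nn.Linear(emb_dim, n_classes)

    def forward(self, token_ids):
        h = self.emb(token_ids)                   # (B, T, D)
        logits = self.score(h).squeeze(-1)        # (B, T)
        attn = F.softmax(logits, dim=-1)          # attention weights
        ctx = torch.bmm(attn.unsqueeze(1), h).squeeze(1)  # (B, D)
        return self.out(ctx), attn

def lexical_strength(token_ids):
    """Look up sentiment strength per token; 0 for out-of-lexicon words."""
    return torch.tensor([[SENTIMENT_STRENGTH.get(int(t), 0.0) for t in seq]
                         for seq in token_ids])

def training_loss(model, token_ids, labels, supervision, alpha=0.1):
    """Cross-entropy plus a lexicon-weighted attention supervision term.

    `supervision` is a (B, T) 0/1 mask of context words mined in earlier
    iterations (a stand-in for the paper's iterative mining step).
    """
    pred, attn = model(token_ids)
    ce = F.cross_entropy(pred, labels)
    # Boost mined supervision by lexical strength, renormalize it into a
    # target attention distribution, and penalize divergence from it.
    weights = supervision * (1.0 + lexical_strength(token_ids))
    target = weights / weights.sum(dim=-1, keepdim=True).clamp(min=1e-8)
    attn_loss = F.kl_div(attn.clamp(min=1e-8).log(), target,
                         reduction="batchmean")
    return ce + alpha * attn_loss
```

Under this reading, words that appear in the lexicon pull proportionally more of the supervision target toward themselves, so low-frequency but sentiment-strong words are not drowned out by high-frequency ones; the balance between the two objectives is controlled by the hypothetical `alpha` weight.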