Authors
Yunyi Jia,Xiao-Ming Wu,Xiangzhi Liu
Identifier
DOI:10.1109/icassp48485.2024.10448386
Abstract
Predicting the sentiment polarity of aspect terms in sentences is the goal of the Aspect-Based Sentiment Analysis (ABSA) task. Graph Convolutional Networks (GCNs) are used in the majority of ABSA approaches due to their ability to effectively capture the dependencies among words or entities within sentences. However, they may not work as expected when sentences have no obvious syntactic structure. To alleviate this issue, we propose a Context-guided and Syntactic Augmented Dual Graph Convolutional Network (CSADGCN) model for the ABSA task. Specifically, we propose a context-guided attention mechanism that captures both global and local information by combining self-attention and aspect-level attention, even when a sentence has no obvious syntactic structure. In addition, we augment the GCN with multiple linguistic features and utilize a biaffine attention module to capture the relationships between words. Extensive experiments on three datasets show that our CSADGCN model outperforms the most recent baseline approaches.
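To make the graph-convolution step concrete: a GCN layer over a sentence's dependency graph updates each word's representation by aggregating its syntactic neighbors' features. The following is a minimal sketch in plain Python, not the paper's actual implementation; the toy adjacency matrix, features, and weights are illustrative assumptions.

```python
# Minimal sketch of one graph-convolution step over a sentence's
# dependency graph, as commonly used in GCN-based ABSA models.
# Plain Python, no deep-learning framework; all values are illustrative.

def gcn_layer(adj, feats, weight):
    """One GCN layer: h_i' = ReLU(sum_j adj[i][j] * (h_j @ W)).

    adj    : n x n adjacency matrix with self-loops, row-normalized
    feats  : n x d node (word) feature matrix
    weight : d x d' projection matrix
    """
    n, d = len(feats), len(feats[0])
    dp = len(weight[0])
    # Linear projection h_j @ W for every word j
    proj = [[sum(feats[j][k] * weight[k][c] for k in range(d))
             for c in range(dp)] for j in range(n)]
    # Aggregate over syntactic neighbors, then apply ReLU
    out = []
    for i in range(n):
        row = [sum(adj[i][j] * proj[j][c] for j in range(n))
               for c in range(dp)]
        out.append([max(0.0, v) for v in row])
    return out

# Toy 3-word sentence with chain dependencies 0 -- 1 -- 2,
# self-loops added and each row normalized by node degree.
adj = [[0.5, 0.5, 0.0],
       [1/3, 1/3, 1/3],
       [0.0, 0.5, 0.5]]
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
weight = [[1.0, 0.0], [0.0, 1.0]]  # identity projection for clarity

h1 = gcn_layer(adj, feats, weight)
# Word 1 now mixes information from both of its neighbors.
```

The abstract's observation that such propagation fails when "sentences have no obvious syntactic structure" corresponds to `adj` degenerating toward self-loops only, which is what the proposed context-guided attention branch compensates for.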