Computer science
Syntax
Token
Artificial intelligence
Natural language processing
Transformer
Encoder
Abstract syntax
Sentiment analysis
Representation
Dependency
Authors
Xiaosai Huang, Jing Li, Jia Wu, Jun Chang, Donghua Liu, Kai Zhu
Identifier
DOI: 10.1016/j.ipm.2023.103630
Abstract
Aspect-based sentiment analysis (ABSA) aims to determine the sentiment polarity expressed in a text towards a particular aspect. Previous models have used dependency graphs and GNNs to facilitate information exchange, but they face challenges such as over-smoothing of aspect representations and a mismatch between word-based dependency graphs and subword-based BERT. To address these deficiencies, we propose a new approach called SRE-BERT that flexibly exploits syntactic knowledge to enhance aspect representations through syntax representations. First, we propose a syntax representation encoder that acquires a syntactic vector for each token. Then, we devise a syntax-guided transformer that uses these syntax representations to compute multi-head attention, enabling direct syntactic interaction between any two tokens. Finally, the token-level vectors produced by the syntax-guided transformer are used to enhance the semantic representations obtained from BERT. In addition, we introduce a Masked POS Label Prediction (MPLP) method to pre-train the syntax encoder. Extensive experiments on datasets from three distinct domains show that SRE-BERT outperforms the second-ranked model by 1.97%, 1.55%, and 1.20% on the Rest14, Lap14, and Twitter datasets, respectively.
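The abstract's central mechanism is attention whose weights are computed from syntax representations while the values carry the semantic stream, so any two tokens interact syntactically regardless of their distance in the dependency graph. The sketch below is a minimal illustration of that idea under assumed shapes and names (SyntaxGuidedAttention, d_model, n_heads are hypothetical), not the authors' implementation.

```python
import torch
import torch.nn as nn

class SyntaxGuidedAttention(nn.Module):
    """Multi-head attention where queries/keys come from syntax
    representations and values come from semantic (BERT-style)
    representations — a sketch of the abstract's syntax-guided transformer."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)  # projects syntax vectors
        self.k_proj = nn.Linear(d_model, d_model)  # projects syntax vectors
        self.v_proj = nn.Linear(d_model, d_model)  # projects semantic vectors
        self.out = nn.Linear(d_model, d_model)

    def forward(self, syntax: torch.Tensor, semantic: torch.Tensor) -> torch.Tensor:
        # syntax, semantic: (batch, seq_len, d_model)
        b, n, _ = syntax.shape
        def split(x):  # (b, n, d) -> (b, heads, n, d_head)
            return x.view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        q, k = split(self.q_proj(syntax)), split(self.k_proj(syntax))
        v = split(self.v_proj(semantic))
        # attention weights are a function of syntax alone
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        ctx = (scores.softmax(dim=-1) @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(ctx)

# usage with random tensors: output keeps the (batch, seq_len, d_model) shape
syn = torch.randn(2, 16, 256)
sem = torch.randn(2, 16, 256)
out = SyntaxGuidedAttention(256, 8)(syn, sem)  # (2, 16, 256)
```

The design point this illustrates is the decoupling: because the scores never see the semantic stream, syntactic relatedness alone decides which tokens exchange information, which is how the model can enhance aspect representations without routing everything through a word-level dependency graph.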
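The abstract names the MPLP pre-training objective but not its formulation. A plausible reading, by analogy with masked language modelling, is to mask part of each sentence's POS tag sequence and train the syntax encoder to recover the masked tags. Everything below (tag-set size, mask rate, encoder depth, all identifiers) is an assumption for illustration only.

```python
import torch
import torch.nn as nn

NUM_POS_TAGS = 17        # e.g. the Universal POS tag set (assumption)
MASK_ID = NUM_POS_TAGS   # extra id reserved for the [MASK] tag
MASK_RATE = 0.15         # assumed mask rate, mirroring BERT-style MLM

class SyntaxEncoderForMPLP(nn.Module):
    """Toy syntax encoder pre-trained by predicting masked POS labels."""
    def __init__(self, d_model: int = 128):
        super().__init__()
        self.embed = nn.Embedding(NUM_POS_TAGS + 1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, NUM_POS_TAGS)

    def forward(self, pos_ids: torch.Tensor) -> torch.Tensor:
        # pos_ids: (batch, seq_len) integer POS tag ids
        return self.head(self.encoder(self.embed(pos_ids)))

def mplp_loss(model: nn.Module, pos_ids: torch.Tensor) -> torch.Tensor:
    # corrupt a random subset of positions with the mask id
    mask = torch.rand_like(pos_ids, dtype=torch.float) < MASK_RATE
    corrupted = pos_ids.masked_fill(mask, MASK_ID)
    logits = model(corrupted)
    # only masked positions contribute to the loss (ignore_index = -100)
    targets = pos_ids.masked_fill(~mask, -100)
    return nn.functional.cross_entropy(
        logits.view(-1, NUM_POS_TAGS), targets.view(-1), ignore_index=-100
    )

# usage: one pre-training step on a random batch of POS sequences
model = SyntaxEncoderForMPLP()
batch = torch.randint(0, NUM_POS_TAGS, (4, 20))
loss = mplp_loss(model, batch)
loss.backward()
```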