Computer science
Sentiment analysis
Sentence
Dual
Polarity
Artificial intelligence
Task
Relevance
Natural language processing
Language model
Machine learning
Linguistics
Authors
Wen Jun Yin, Yangsheng Xu, Cencen Liu, Dongxia Zheng, Zhihui Wang, Chuanjie Liu
Identifier
DOI:10.1007/978-3-031-44204-9_42
Abstract
Aspect-Based Sentiment Analysis (ABSA) is a fine-grained sentiment analysis task that aims to predict the sentiment polarity toward a specific aspect occurring in a given sentence. Recently, pre-trained language models such as BERT have shown great progress on this task. However, owing to the mismatch between pre-training and fine-tuning, handling informal expressions and complex sentences remains challenging and deserves considerable effort. To address this, we propose a Prompt-oriented Fine-tuning Dual BERT (PFDualBERT) model that accounts for complex semantic relevance and scarce data samples simultaneously. To reduce the impact of the pre-training/fine-tuning mismatch, we design a ProBERT module inspired by the idea of prompt learning. We also design a SemBERT module to capture semantic correlations, refitting SemBERT with aspect-based self-attention. Experimental results on three datasets confirm that our PFDualBERT model outperforms state-of-the-art methods, and further analysis substantiates that the model maintains stable performance in low-resource settings.
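The abstract names two ideas without giving their details: a cloze-style prompt for ABSA (ProBERT) and aspect-based self-attention (SemBERT). The sketch below is a minimal illustration of both under assumed forms, not the paper's actual template or architecture: the prompt string, the mean-pooled aspect query, and the function name `aspect_attention` are all hypothetical.

```python
import numpy as np

# Hypothetical cloze-style prompt of the kind used in prompt learning for
# ABSA: the masked-LM head would predict a polarity word at [MASK].
template = "{sentence} The sentiment polarity of {aspect} is [MASK]."
prompt = template.format(
    sentence="The pizza was great but the service was slow.",
    aspect="service",
)

def aspect_attention(hidden: np.ndarray, aspect_mask: np.ndarray) -> np.ndarray:
    """Toy aspect-based self-attention.

    The query is the mean of the aspect-token states; the output is a
    softmax-weighted sum over all token states, so context words near the
    aspect in representation space contribute more.

    hidden:      (seq_len, dim) token representations
    aspect_mask: (seq_len,) 1 for aspect tokens, 0 otherwise
    """
    query = hidden[aspect_mask.astype(bool)].mean(axis=0)   # (dim,)
    scores = hidden @ query / np.sqrt(hidden.shape[1])      # (seq_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                # softmax
    return weights @ hidden                                 # (dim,)

# Example: 5 tokens with 4-dim states; tokens 2-3 form the aspect term.
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))
mask = np.array([0, 0, 1, 1, 0])
vec = aspect_attention(h, mask)
print(prompt)
print(vec.shape)  # (4,)
```

In the full model, `hidden` would be BERT's contextual token states rather than random vectors, and the attended vector would feed a polarity classifier.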