Computer Science
Artificial Intelligence
Classifier (UML)
Leverage (statistics)
Machine Learning
Transfer Learning
Deep Learning
Sentiment Analysis
Mechanism (biology)
Domain (mathematical analysis)
Natural Language Processing
Mathematics
Epistemology
Mathematical Analysis
Philosophy
Authors
Rahul Kumar Singh, Manoj Kumar Sachan, Ram Bahadur Patel
Abstract
The purpose of cross-domain opinion classification is to leverage useful information acquired from a source domain to train a classifier for opinion classification in a target domain that contains a large amount of unlabeled data. An opinion classifier trained on one domain usually performs poorly when applied directly to another domain, and annotating data for every domain is a laborious and costly process. Most existing approaches focus on identifying invariant features among domains; unfortunately, they fail to properly capture the context within sentences or to make good use of the unlabeled data. To address this issue, we propose an aspect-based attention model for cross-domain opinion classification. By incorporating knowledge of both aspects and sentences, the proposed model provides a transfer mechanism that better transfers opinions across domains. We introduce two learning networks: the first aims to recognize features shared between domains, while the second extracts information from aspects by using shared words as a bridge. We employ BERT and a bidirectional gated recurrent unit to obtain a deep understanding of the text and its deep-level semantic information. Furthermore, a joint attention learning mechanism is applied to these two learning modules so that both aspects and sentences can influence the resulting opinion expression. In addition, we introduce a gradient reversal layer to obtain domain-invariant features. Comprehensive experiments on the Amazon multi-domain product datasets demonstrate the effectiveness and significance of the proposed model over state-of-the-art techniques.
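To make the described architecture more concrete, the following is a minimal PyTorch sketch of two of the building blocks mentioned in the abstract: a bidirectional GRU encoder with attention pooling over BERT-style token embeddings, and a gradient reversal layer feeding a domain discriminator for learning domain-invariant features. This is not the authors' implementation; the class names, hidden sizes, number of classes, and the reversal coefficient `lambd` are illustrative assumptions, and the aspect branch and joint attention module of the paper are omitted.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity in the forward pass, negated
    (scaled by lambd) gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class BiGRUAttentionEncoder(nn.Module):
    """Bidirectional GRU over token embeddings followed by additive
    attention pooling into a single sentence vector."""
    def __init__(self, embed_dim=768, hidden_dim=128):
        super().__init__()
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)

    def forward(self, x):                                # x: (batch, seq_len, embed_dim)
        h, _ = self.gru(x)                               # (batch, seq_len, 2*hidden_dim)
        weights = torch.softmax(self.attn(h), dim=1)     # attention weights over tokens
        return (weights * h).sum(dim=1)                  # (batch, 2*hidden_dim)


class CrossDomainSketch(nn.Module):
    """Shared sentence encoder feeding (a) an opinion classifier and
    (b) a domain discriminator behind the gradient reversal layer."""
    def __init__(self, embed_dim=768, hidden_dim=128, n_classes=2, n_domains=2, lambd=1.0):
        super().__init__()
        self.encoder = BiGRUAttentionEncoder(embed_dim, hidden_dim)
        self.opinion_head = nn.Linear(2 * hidden_dim, n_classes)
        self.domain_head = nn.Linear(2 * hidden_dim, n_domains)
        self.lambd = lambd

    def forward(self, token_embeddings):
        feat = self.encoder(token_embeddings)            # shared features
        opinion_logits = self.opinion_head(feat)         # sentiment prediction
        reversed_feat = GradReverse.apply(feat, self.lambd)
        domain_logits = self.domain_head(reversed_feat)  # adversarial domain prediction
        return opinion_logits, domain_logits


if __name__ == "__main__":
    model = CrossDomainSketch()
    x = torch.randn(4, 20, 768)   # e.g. BERT token embeddings for 4 sentences of 20 tokens
    opinion_logits, domain_logits = model(x)
    print(opinion_logits.shape, domain_logits.shape)  # torch.Size([4, 2]) torch.Size([4, 2])
```

In this kind of adversarial setup, the opinion head is trained on labeled source data while the domain head, seen through the reversed gradients, pushes the encoder toward features that do not distinguish source from target, which is the general mechanism behind the gradient reversal layer the abstract refers to.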