Computer science
Embedding
Word embedding
Context (archaeology)
Word (group theory)
Term (time)
Field (mathematical analysis)
Artificial intelligence
Natural language processing
Convolutional neural network
Encoding (set theory)
Theoretical computer science
Mathematics
Programming language
Physics
Geometry
Set (abstract data type)
Quantum mechanics
Biology
Paleontology
Mathematical analysis
Authors
Jingyun Xu, Jiayuan Xie, Yi Cai, Zehang Lin, Ho-fung Leung, Qing Li, Tat-Seng Chua
Source
Journal: IEEE Transactions on Affective Computing
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume/Issue: 15 (1): 144-156
Citations: 2
Identifier
DOI: 10.1109/taffc.2023.3262941
Abstract
The aspect term extraction (ATE) task aims to extract aspect terms that describe a part or an attribute of a product from review sentences. Most existing works rely on either general or domain embeddings to address this problem. Despite promising results, most methods still ignore the relative importance of general and domain embeddings, which degrades performance. Moreover, word embeddings are also tied to downstream tasks, and how to regularize word embeddings to capture context-aware information remains an open problem. To address these issues, we first propose context-aware dynamic word embedding (CDWE), which simultaneously considers the general meanings, domain-specific meanings, and context information of words. Based on CDWE, we propose an attention-based convolutional neural network for ATE, called ADWE-CNN, which adaptively captures the aforementioned meanings of words by using an attention mechanism to assign different importance to the respective embeddings. Experimental results show that ADWE-CNN achieves performance comparable to state-of-the-art approaches. Various ablation studies are conducted to examine the contribution of each component. Our code is publicly available at http://github.com/xiejiajia2018/ADWE-CNN.
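As a rough illustration of the idea described in the abstract (not the authors' implementation, which is available at the repository linked above), the sketch below shows how per-token attention weights might be used to mix a general embedding and a domain embedding before a convolutional tagger predicts BIO labels for aspect terms. The class name, dimensions, and the restriction to two embedding sources are assumptions made for illustration; CDWE's context-aware regularization is not modeled here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveEmbeddingTagger(nn.Module):
    """Illustrative sketch: attention-weighted mix of general/domain embeddings + CNN tagger."""

    def __init__(self, vocab_size, emb_dim=100, num_tags=3, kernel_sizes=(3, 5)):
        super().__init__()
        self.general_emb = nn.Embedding(vocab_size, emb_dim)  # e.g. general-purpose vectors
        self.domain_emb = nn.Embedding(vocab_size, emb_dim)   # e.g. in-domain vectors
        self.attn = nn.Linear(emb_dim, 1)                     # scores each embedding source per token
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, emb_dim, k, padding=k // 2) for k in kernel_sizes
        )
        self.classifier = nn.Linear(emb_dim * len(kernel_sizes), num_tags)

    def forward(self, token_ids):  # token_ids: (batch, seq_len)
        # Stack the two embedding views: (batch, seq_len, 2, emb_dim).
        sources = torch.stack(
            [self.general_emb(token_ids), self.domain_emb(token_ids)], dim=2
        )
        # Attention assigns a per-token weight to each source, then mixes them.
        weights = F.softmax(self.attn(sources), dim=2)        # (batch, seq_len, 2, 1)
        mixed = (weights * sources).sum(dim=2)                # (batch, seq_len, emb_dim)
        # Conv1d expects (batch, channels, seq_len); same-padding keeps the length.
        x = mixed.transpose(1, 2)
        feats = torch.cat([F.relu(conv(x)) for conv in self.convs], dim=1)
        return self.classifier(feats.transpose(1, 2))         # per-token BIO logits


# Toy usage: 4 sentences of 20 token ids -> (4, 20, 3) scores for B/I/O tags.
model = AttentiveEmbeddingTagger(vocab_size=10000)
logits = model(torch.randint(0, 10000, (4, 20)))
```

In the paper's actual model the embeddings would be pretrained rather than randomly initialized and the attention also covers the context-aware component of CDWE; see the authors' repository for their code.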