Computer science
Artificial intelligence
Natural language processing
Sentiment analysis
Semantics (computer science)
Convolutional neural network
Context (archaeology)
Focus (optics)
Fuse (electrical)
Word (group theory)
Granularity
Representation (politics)
Operating system
Engineering
Programming language
Biology
Optics
Political science
Law
Politics
Electrical engineering
Physics
Philosophy
Linguistics
Paleontology
Authors
Chuanbin Wu,Yuwei Zhang,Sijun Lu,Guoyan Xu
Identifier
DOI:10.1109/iceiec58029.2023.10199931
Abstract
Researchers have recently paid increasing attention to natural language processing methods for analyzing the sentiment of short text. Because short text is sparse and irregular, it often carries a large amount of implicit semantics, which demands strong feature-learning ability from the model. Common models such as the Convolutional Neural Network (CNN) can extract local information from sentences but ignore the contextual semantic information between words. The Bidirectional Long Short-Term Memory network (BiLSTM) can compensate for CNN's inability to capture contextual semantics effectively, but it does not extract local sentence features well. In this paper, a text sentiment analysis classification model based on multiple attention mechanisms and TextCNN-BiLSTM is proposed. A dynamic word-vector representation of the text is obtained from the BERT model, and local information at different granularities is extracted in parallel by TextCNN branches with different kernel scales. Multi-head self-attention is then used to mine global dependencies, reducing the influence of distance on context-dependent words. Finally, the sequence is fed into a BiLSTM to fuse temporal information, and an attention mechanism is applied to focus on the crucial features before producing the sentiment prediction. Experimental results show that accuracy and F1 score improve on two datasets compared with previously reported models.
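To make the described pipeline concrete, the following PyTorch sketch shows one plausible way the components could be wired together: BERT hidden states feed parallel multi-scale convolutions, then multi-head self-attention, then a BiLSTM with attention pooling and a classifier. This is a minimal illustration under assumed settings; the hidden sizes, kernel scales, head count, and class count are placeholders, not the values reported in the paper.

```python
# Minimal sketch of a TextCNN + multi-head self-attention + BiLSTM + attention classifier.
# All layer sizes below are illustrative assumptions, not the paper's reported settings.
import torch
import torch.nn as nn


class TextCNNBiLSTMAttn(nn.Module):
    def __init__(self, hidden=768, num_classes=2, kernel_sizes=(2, 3, 4),
                 n_filters=128, n_heads=4, lstm_hidden=128):
        super().__init__()
        # Multi-scale 1D convolutions extract local n-gram features in parallel.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, n_filters, k, padding=k // 2) for k in kernel_sizes
        )
        d = n_filters * len(kernel_sizes)
        # Multi-head self-attention models long-distance dependencies between positions.
        self.self_attn = nn.MultiheadAttention(d, n_heads, batch_first=True)
        # BiLSTM fuses sequential (time-order) information.
        self.bilstm = nn.LSTM(d, lstm_hidden, batch_first=True, bidirectional=True)
        # Additive attention pools the sequence, focusing on the most salient positions.
        self.attn_score = nn.Linear(2 * lstm_hidden, 1)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, bert_hidden_states):
        # bert_hidden_states: (batch, seq_len, hidden), e.g. the output of a BERT encoder.
        x = bert_hidden_states.transpose(1, 2)              # (B, hidden, L)
        local = [torch.relu(conv(x)) for conv in self.convs]
        # Crop to a common length (padding differs by kernel size), then concatenate channels.
        min_len = min(t.size(2) for t in local)
        local = torch.cat([t[:, :, :min_len] for t in local], dim=1).transpose(1, 2)
        ctx, _ = self.self_attn(local, local, local)        # (B, L, d)
        seq, _ = self.bilstm(ctx)                           # (B, L, 2 * lstm_hidden)
        weights = torch.softmax(self.attn_score(seq), dim=1)
        pooled = (weights * seq).sum(dim=1)                 # attention-weighted pooling
        return self.classifier(pooled)


if __name__ == "__main__":
    # Stand-in for BERT output: 2 sentences, 16 tokens, 768-dim hidden states.
    dummy = torch.randn(2, 16, 768)
    model = TextCNNBiLSTMAttn()
    print(model(dummy).shape)  # torch.Size([2, 2])
```

In practice the dummy tensor would be replaced by the last hidden states of a pretrained BERT model, and the whole stack trained end-to-end with cross-entropy loss on the labeled sentiment datasets.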