Topics: Computer science; Phrase; Artificial intelligence; Convolutional neural network; Encoder; Natural language processing; Relevance (law); Embedding; Pattern recognition (psychology); Political science; Operating systems; Law
Authors
Xue Cheng, Chen Zhang, Qingxu Li
Source
Journal: Journal of Physics: Conference Series [IOP Publishing]
Date: 2021-08-01
Volume/Issue: 1993 (1): 012038
Cited by: 2
Identifier
DOI:10.1088/1742-6596/1993/1/012038
Abstract
This paper addresses two limitations: traditional word embedding models and Bidirectional Encoder Representations from Transformers (BERT) struggle to learn textual semantic knowledge, while convolutional neural networks (CNN) and bidirectional long short-term memory networks (BiLSTM) cannot distinguish the importance of individual words. It proposes an improved Chinese short text classification method based on an ERNIE_BiGRU model. First, Enhanced Representation through Knowledge Integration (ERNIE) learns textual knowledge and information, strengthening the model's semantic representation capability. Second, because a CNN extracts only local features of the text and ignores the semantic relevance between contextual information, and because the Bidirectional Gated Recurrent Unit (BiGRU) is simpler than BiLSTM, with fewer network parameters and faster computation, CNN and BiGRU are combined so that the model captures both local phrase-level features and contextual structure information. Finally, an attention mechanism assigns different weights to features according to their importance, improving the model's classification performance. Experimental results show that the ERNIE_CNN_BiGRU_Attention (ECBA) model used in this paper achieves good results on the Chinese short text classification task.
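The abstract describes a pipeline of ERNIE contextual embeddings, a CNN for local phrase-level features, a BiGRU for contextual structure, and an attention layer that weights time steps by importance. Below is a minimal PyTorch sketch of such a pipeline, not the authors' implementation: the ERNIE encoder is stood in by a plain embedding layer (in practice, pretrained ERNIE weights would supply the token vectors), all layer sizes, the vocabulary size, and the class count are illustrative, and since the abstract does not say whether the CNN and BiGRU run in series or in parallel, this sketch chains them.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ECBASketch(nn.Module):
    """Illustrative ERNIE_CNN_BiGRU_Attention-style classifier (hypothetical sizes)."""

    def __init__(self, vocab_size=21128, emb_dim=128, conv_channels=64,
                 gru_hidden=64, num_classes=10):
        super().__init__()
        # Stand-in for ERNIE: in the paper, contextual token vectors come from a
        # pretrained ERNIE encoder; a plain embedding keeps this sketch self-contained.
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # CNN branch: 1-D convolution over the token sequence captures local
        # phrase-level features.
        self.conv = nn.Conv1d(emb_dim, conv_channels, kernel_size=3, padding=1)
        # BiGRU branch: reads the convolved sequence in both directions to
        # capture contextual structure information.
        self.bigru = nn.GRU(conv_channels, gru_hidden,
                            batch_first=True, bidirectional=True)
        # Additive attention: scores each time step so important words
        # receive larger weights in the pooled representation.
        self.att_proj = nn.Linear(2 * gru_hidden, 2 * gru_hidden)
        self.att_vec = nn.Linear(2 * gru_hidden, 1, bias=False)
        self.classifier = nn.Linear(2 * gru_hidden, num_classes)

    def forward(self, token_ids):                          # (B, T)
        x = self.embed(token_ids)                          # (B, T, E)
        x = F.relu(self.conv(x.transpose(1, 2)))           # (B, C, T)
        x = x.transpose(1, 2)                              # (B, T, C)
        h, _ = self.bigru(x)                               # (B, T, 2H)
        scores = self.att_vec(torch.tanh(self.att_proj(h)))  # (B, T, 1)
        weights = torch.softmax(scores, dim=1)             # attention over time
        context = (weights * h).sum(dim=1)                 # (B, 2H)
        return self.classifier(context)                    # (B, num_classes)

# Smoke test on random token ids.
if __name__ == "__main__":
    model = ECBASketch()
    logits = model(torch.randint(0, 21128, (4, 32)))
    print(logits.shape)  # torch.Size([4, 10])
```

The attention pooling here is the standard additive form (project, tanh, score, softmax over time steps); the paper may use a different attention variant, and the GRU hidden size and kernel width are assumptions chosen only to make the sketch run.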