Computer science
Artificial intelligence
Pooling
Hypergraph
Text graph
Pairwise comparison
Graph
Convolutional neural network
Feature (linguistics)
Word embedding
Pattern recognition (psychology)
Embedding
Natural language processing
Theoretical computer science
Mathematics
Automatic summarization
Linguistics
Philosophy
Discrete mathematics
Authors
Keyao Wang, Hongbing Xia, Yuan Liu
Abstract
Existing graph neural network-based text classification models can learn only pairwise binary relations between words, ignoring the multivariate higher-order relations among phrases, and they represent semantic information and local contextual features inadequately. To address these problems, this paper proposes a text classification model based on a hypergraph attention network with multi-feature fusion. First, the model learns multivariate higher-order relations between words by replacing the ordinary graph structure with a hypergraph structure. Second, it constructs sequential, syntactic, and semantic hypergraphs from the textual information to enrich the graph neural network's text representation, compensating for its otherwise inadequate representation of the text. A dual graph attention network then learns embedding representations of the word nodes and of the relational hyperedges in the hypergraph, respectively. Meanwhile, an attention-based text pooling module extracts discriminative, critical word nodes, helping the graph neural network capture deep local feature information so that the model represents the text more effectively. Finally, an adaptive fusion method combines the three types of text features into the final text representation used for classification. The proposed approach reaches accuracies of 71.22%, 98.42%, 96.55%, 79.82%, and 88.32% on the public datasets Ohsumed, R8, R56, MR, and 20NG, respectively, outperforming all compared baseline models.
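The final step the abstract describes, adaptively fusing the sequential, syntactic, and semantic text features into one representation, can be sketched as a softmax-weighted combination of the three feature vectors. This is a minimal illustration, not the paper's actual implementation: the relevance scores would be learned jointly with the network, whereas here they are fixed constants, and the vectors are hypothetical toy features.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def adaptive_fuse(features, scores):
    """Fuse equal-length feature vectors by softmax-normalized scores.

    features: list of vectors (e.g. sequential, syntactic, semantic
    text features); scores: one scalar relevance score per vector
    (learned in the paper; fixed here purely for illustration).
    """
    weights = softmax(scores)
    dim = len(features[0])
    return [sum(w * f[i] for w, f in zip(weights, features))
            for i in range(dim)]

# Three hypothetical 4-dimensional text feature vectors.
seq_feat = [1.0, 0.0, 0.5, 0.2]   # sequential features
syn_feat = [0.0, 1.0, 0.5, 0.2]   # syntactic features
sem_feat = [0.5, 0.5, 0.0, 1.0]   # semantic features

# Higher score for the sequential view -> it dominates the fusion.
fused = adaptive_fuse([seq_feat, syn_feat, sem_feat],
                      scores=[2.0, 1.0, 1.0])
```

The softmax keeps the fusion a convex combination, so each component of the fused vector stays within the range spanned by the three input features.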