Computer science
Exploitation
Theoretical computer science
Graph
Embedding
Artificial intelligence
Computer security
Authors
Guoquan Dai, Xizhao Wang, Xiaoying Zou, Chao Liu, Si Cen
Identifier
DOI:10.1016/j.neunet.2022.07.014
Abstract
Embedding-based models are among the most effective approaches to knowledge graph completion. Graph neural networks (GNNs) are popular and promising embedding models because they can exploit the structural information of neighbors in knowledge graphs. However, current GNN-based knowledge graph completion methods assume that all neighbors of a node are equally important; our study argues that this assumption, which prevents assigning different weights to different neighbors, is unreasonable. Moreover, since a knowledge graph is a heterogeneous graph with multiple relations, the complex interactions between nodes and their neighbors make effective message passing in GNNs challenging. We therefore design a multi-relational graph attention network (MRGAT) that adapts to different cases of heterogeneous multi-relational connections and computes the importance of different neighboring nodes through a self-attention layer. Incorporating the self-attention mechanism with distinct neighbor weights optimizes the network structure and thus significantly improves performance. We experimentally validate our model on multiple benchmark knowledge graphs, where MRGAT achieves the best performance on evaluation metrics including MRR and Hits@N compared with other state-of-the-art baseline models.
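As a rough illustration of the idea described in the abstract, the sketch below implements a generic relation-aware graph attention layer in PyTorch: each incoming edge is scored by an attention function over the target node, the relation embedding, and the neighbor, and neighbor messages are aggregated with the resulting weights. This is only a minimal sketch inferred from the abstract, not the authors' MRGAT implementation; the class name `RelationalGraphAttention`, the concatenation-based scoring function, and the additive neighbor/relation message are illustrative assumptions.

```python
# Minimal sketch of a relation-aware graph attention layer (assumption: not the paper's exact MRGAT).
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalGraphAttention(nn.Module):
    """Attention over (head, relation, tail) neighbor triples."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, dim)  # one vector per relation type
        self.w = nn.Linear(dim, dim, bias=False)         # shared node projection
        self.attn = nn.Linear(3 * dim, 1, bias=False)    # scores [node || relation || neighbor]

    def forward(self, node_feat, edge_index, edge_type):
        # node_feat:  (N, dim) entity embeddings
        # edge_index: (2, E) pairs (source, target) for each edge
        # edge_type:  (E,)   relation id of each edge
        src, dst = edge_index
        h = self.w(node_feat)
        r = self.rel_emb(edge_type)

        # Unnormalized attention logits, one per incoming edge.
        logits = F.leaky_relu(self.attn(torch.cat([h[dst], r, h[src]], dim=-1))).squeeze(-1)

        # Softmax over each target node's incoming edges.
        alpha = torch.zeros_like(logits)
        for node in torch.unique(dst):
            mask = dst == node
            alpha[mask] = F.softmax(logits[mask], dim=0)

        # Weighted sum of relation-conditioned neighbor messages.
        msg = alpha.unsqueeze(-1) * (h[src] + r)
        out = torch.zeros_like(h).index_add_(0, dst, msg)
        return F.relu(out)


if __name__ == "__main__":
    # Toy heterogeneous graph: 4 entities, 2 relation types, 3 edges.
    x = torch.randn(4, 8)
    edge_index = torch.tensor([[0, 1, 2], [3, 3, 0]])
    edge_type = torch.tensor([0, 1, 0])
    layer = RelationalGraphAttention(dim=8, num_relations=2)
    print(layer(x, edge_index, edge_type).shape)  # torch.Size([4, 8])
```

The key design point this sketch tries to convey is that the attention score depends on the relation embedding as well as the two endpoint nodes, so neighbors connected through different relations can receive different weights, unlike GNN layers that treat all neighbors uniformly.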