Computer Science
Artificial Intelligence
Knowledge Base
Machine Learning
Reinforcement Learning
Recurrent Neural Network
Artificial Neural Network
Natural Language Processing
Identifier
DOI:10.1016/j.neucom.2020.02.065
Abstract
Knowledge Graphs (KGs) have been applied to a variety of scenarios, including Web search, question answering, recommendation systems, and natural language processing. However, the vast majority of Knowledge Bases (KBs) are incomplete, creating a demand for KB completion (KBC). Mainstream KBC methods include latent factor models, random walk models, and recently popular approaches based on reinforcement learning, each of which performs well in its respective area. Recurrent neural networks (RNNs) and their variants model temporal data by remembering information over long periods; however, it remains unclear whether they can also use the information they have remembered to perform complex reasoning over a knowledge graph. In this paper, we propose a novel framework (ALSTM) based on the attention mechanism and Long Short-Term Memory (LSTM), which couples structure learning with parameter learning of first-order logical rules in an end-to-end differentiable neural network model. In this framework, we design a memory system and employ multi-head dot-product attention (MHDPA) to let the memories embedded in the memory system interact and update for reasoning purposes. This is consistent with the process of human cognition and reasoning, which looks to historical memory for guidance about the future. In addition, we explore the use of inductive biases in deep learning to facilitate the learning of entities, relations, and rules. Experiments establish the efficiency and effectiveness of our model and show that it outperforms baseline models on fact prediction and link prediction tasks across several benchmark datasets, including WN18RR, FB15K-237, and NELL-995.
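To make the MHDPA memory-update idea mentioned in the abstract concrete, below is a minimal sketch of multi-head dot-product attention applied to a bank of memory slots. It is not the authors' ALSTM implementation: the function name `mhdpa_update`, the slot count, the embedding dimension, and the randomly initialized projection matrices are all illustrative assumptions; in the actual model the projections would be learned parameters and the update would be interleaved with the LSTM.

```python
# A minimal sketch (not the paper's code) of multi-head dot-product attention
# (MHDPA) used to update a bank of memory slots. Shapes, the function name
# mhdpa_update, and the random projections are illustrative assumptions.
import torch
import torch.nn.functional as F

def mhdpa_update(memory, num_heads=4):
    """Update memory slots by letting them attend to one another.

    memory: (num_slots, dim) tensor of memory embeddings.
    Returns a tensor of the same shape with attention-mixed slots.
    """
    num_slots, dim = memory.shape
    assert dim % num_heads == 0
    head_dim = dim // num_heads

    # Illustrative linear projections for queries, keys, and values
    # (learned parameters in a real model, random here for the sketch).
    w_q = torch.randn(dim, dim) / dim ** 0.5
    w_k = torch.randn(dim, dim) / dim ** 0.5
    w_v = torch.randn(dim, dim) / dim ** 0.5

    # Project and split into heads: (num_heads, num_slots, head_dim).
    q = (memory @ w_q).view(num_slots, num_heads, head_dim).transpose(0, 1)
    k = (memory @ w_k).view(num_slots, num_heads, head_dim).transpose(0, 1)
    v = (memory @ w_v).view(num_slots, num_heads, head_dim).transpose(0, 1)

    # Scaled dot-product attention per head: each slot attends to all slots.
    scores = q @ k.transpose(-2, -1) / head_dim ** 0.5
    attn = F.softmax(scores, dim=-1)
    mixed = attn @ v  # (num_heads, num_slots, head_dim)

    # Merge heads and add a residual connection to the previous memory state.
    mixed = mixed.transpose(0, 1).reshape(num_slots, dim)
    return memory + mixed

# Usage: 8 memory slots of dimension 64, updated by one round of MHDPA.
updated = mhdpa_update(torch.randn(8, 64))
print(updated.shape)  # torch.Size([8, 64])
```

The residual update keeps each slot's prior content while mixing in information retrieved from the other slots, which is the general sense in which attention-based memory interaction supports multi-step reasoning.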