ALSTM: An attention-based long short-term memory framework for knowledge base reasoning

Keywords: Computer Science; Artificial Intelligence; Knowledge Base; Machine Learning; Reinforcement Learning; Recurrent Neural Network; Artificial Neural Network; Natural Language Processing
Authors
Qi Wang, Yongsheng Hao
Source
Journal: Neurocomputing [Elsevier BV]
Volume/Issue: 399: 342-351; Cited by: 20
Identifier
DOI: 10.1016/j.neucom.2020.02.065
Abstract

Knowledge Graphs (KGs) have been applied to various application scenarios, including Web search, question answering, recommendation systems, and natural language processing. However, the vast majority of Knowledge Bases (KBs) are incomplete, creating a demand for KB completion (KBC). Mainstream KBC methods include the latent factor model, the random walk model, and recently popular methods based on reinforcement learning, each of which performs well in its respective area of expertise. Recurrent neural networks (RNNs) and their variants model temporal data by remembering information over long periods; it remains an open question, however, whether they can also use the information they have remembered to achieve complex reasoning over a knowledge graph. In this paper, we propose a novel framework (ALSTM) based on the attention mechanism and Long Short-Term Memory (LSTM), which combines structure learning with parameter learning of first-order logical rules in an end-to-end differentiable neural network model. In this framework, we design a memory system and employ multi-head dot-product attention (MHDPA) to let the memories embedded in the memory system interact and be updated for reasoning purposes. This is consistent with the process of human cognition and reasoning: searching historical memory for guidance about the future. In addition, we explore the use of inductive bias in deep learning to facilitate the learning of entities, relations, and rules. Experiments establish the efficiency and effectiveness of our model and show that it outperforms baseline models on fact prediction and link prediction tasks across several benchmark datasets, including WN18RR, FB15K-237, and NELL-995.
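The abstract's memory-update step rests on multi-head dot-product attention (MHDPA), in which every memory slot attends to every other slot and the attended result becomes the updated memory. The paper itself does not give code here, so the following is a minimal NumPy sketch of generic MHDPA over a memory matrix; the function name, the random initialisation of the projection weights, and the slot/dimension sizes are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def multi_head_dot_product_attention(memory, num_heads, rng):
    """Sketch of MHDPA over a memory matrix of shape (num_slots, d_model).

    Each slot attends to all slots; the concatenated per-head results form
    the updated memory. Projection weights are random for illustration only
    (in a trained model they would be learned parameters).
    """
    num_slots, d_model = memory.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_head = d_model // num_heads

    # Hypothetical query/key/value projections, drawn at random for this sketch.
    W_q = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    W_k = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
    W_v = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)

    # Project, then split into heads: (num_heads, num_slots, d_head).
    def split(x):
        return x.reshape(num_slots, num_heads, d_head).transpose(1, 0, 2)
    Q, K, V = split(memory @ W_q), split(memory @ W_k), split(memory @ W_v)

    # Scaled dot-product attention, computed independently per head.
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)      # (H, S, S)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)           # row-wise softmax
    attended = weights @ V                                   # (H, S, d_head)

    # Concatenate heads back into the memory shape (num_slots, d_model).
    return attended.transpose(1, 0, 2).reshape(num_slots, d_model)

rng = np.random.default_rng(0)
memory = rng.standard_normal((4, 16))   # 4 memory slots, d_model = 16
updated = multi_head_dot_product_attention(memory, num_heads=4, rng=rng)
print(updated.shape)                    # the update preserves the memory shape
```

The shape-preserving update is what allows the mechanism to be applied repeatedly, so the memory can be refined over multiple reasoning steps.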