
ALSTM: An attention-based long short-term memory framework for knowledge base reasoning

Computer science, Artificial intelligence, Knowledge base, Machine learning, Reinforcement learning, Recurrent neural network, Artificial neural network, Natural language processing
Authors
Qi Wang,Yongsheng Hao
Source
Journal: Neurocomputing [Elsevier]
Volume: 399, Pages: 342-351, Cited by: 20
Identifier
DOI:10.1016/j.neucom.2020.02.065
Abstract

Knowledge Graphs (KGs) have been applied to various application scenarios including Web search, question answering, recommendation systems, natural language processing, and so on. However, the vast majority of Knowledge Bases (KBs) are incomplete, creating a demand for KB completion (KBC). Mainstream KBC methods include the latent factor model, the random walk model, and recently popular methods based on reinforcement learning, each of which performs well in its respective area of expertise. Recurrent neural networks (RNNs) and their variants model temporal data by remembering information for long periods; however, it remains unclear whether they can also use the information they have remembered to achieve complex reasoning over a knowledge graph. In this paper, we propose a novel framework (ALSTM) based on the attention mechanism and Long Short-Term Memory (LSTM), which combines structure learning with parameter learning of first-order logical rules in an end-to-end differentiable neural network model. In this framework, we design a memory system and employ multi-head dot-product attention (MHDPA) to interact with and update the memories embedded in the memory system for reasoning purposes. This is also consistent with the process of human cognition and reasoning: looking to historical memory for insight into the future. In addition, we explore the use of inductive bias in deep learning to facilitate the learning of entities, relations, and rules. Experiments establish the efficiency and effectiveness of our model and show that it outperforms baseline models on tasks including fact prediction and link prediction on several benchmark datasets such as WN18RR, FB15K-237 and NELL-995.
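
The abstract only names the MHDPA-based memory update without giving details, so the following is a minimal illustrative sketch rather than the authors' implementation. It assumes the memory system is a matrix of slot embeddings that attend to one another via multi-head dot-product attention; the function name, shapes, and projection matrices (mhdpa_memory_update, Wq, Wk, Wv, Wo) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mhdpa_memory_update(memory, num_heads, Wq, Wk, Wv, Wo):
    """Update a memory matrix with multi-head dot-product attention (MHDPA).

    memory        : (num_slots, d_model) array of memory-slot embeddings (assumed layout).
    Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices (illustrative, not from the paper).
    Returns an attended memory of the same shape.
    """
    num_slots, d_model = memory.shape
    d_head = d_model // num_heads

    # Project memories to queries, keys, and values, then split into heads.
    q = (memory @ Wq).reshape(num_slots, num_heads, d_head).transpose(1, 0, 2)
    k = (memory @ Wk).reshape(num_slots, num_heads, d_head).transpose(1, 0, 2)
    v = (memory @ Wv).reshape(num_slots, num_heads, d_head).transpose(1, 0, 2)

    # Scaled dot-product attention: every memory slot attends to every other slot.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, slots, slots)
    weights = softmax(scores, axis=-1)
    attended = weights @ v                                 # (heads, slots, d_head)

    # Re-merge heads and apply the output projection.
    attended = attended.transpose(1, 0, 2).reshape(num_slots, d_model)
    return attended @ Wo

# Toy usage: 4 memory slots of dimension 8, 2 attention heads.
rng = np.random.default_rng(0)
d_model, slots, heads = 8, 4, 2
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4))
memory = rng.standard_normal((slots, d_model))
new_memory = mhdpa_memory_update(memory, heads, Wq, Wk, Wv, Wo)
print(new_memory.shape)  # (4, 8)
```

In the paper's framework this attention step would be interleaved with LSTM-style gating so that updated memories feed subsequent reasoning steps; the sketch above shows only the attention interaction among memory slots.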