Authors
Bishan Yang, Wen-tau Yih, Xiaodong He, Jianfeng Gao, Li Deng
Source
Journal: Cornell University - arXiv
Date: 2014-01-01
Citations: 1838
Identifier
DOI: 10.48550/arxiv.1412.6575
Abstract
We consider learning representations of entities and relations in KBs using the neural-embedding approach. We show that most existing models, including NTN (Socher et al., 2013) and TransE (Bordes et al., 2013b), can be generalized under a unified learning framework, where entities are low-dimensional vectors learned from a neural network and relations are bilinear and/or linear mapping functions. Under this framework, we compare a variety of embedding models on the link prediction task. We show that a simple bilinear formulation achieves new state-of-the-art results for the task (achieving a top-10 accuracy of 73.2% vs. 54.7% by TransE on Freebase). Furthermore, we introduce a novel approach that utilizes the learned relation embeddings to mine logical rules such as "BornInCity(a,b) and CityInCountry(b,c) => Nationality(a,c)". We find that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication. More interestingly, we demonstrate that our embedding-based rule extraction approach successfully outperforms a state-of-the-art confidence-based rule mining approach in mining Horn rules that involve compositional reasoning.
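The abstract's two key ideas can be sketched in a few lines of NumPy. A "simple bilinear formulation" with a diagonal relation matrix scores a triple (h, r, t) as h^T diag(r) t, and composing two relations corresponds to multiplying their (diagonal) matrices, i.e. elementwise-multiplying their vectors. The function name and dimensions below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def bilinear_score(h, r, t):
    """Bilinear score with a diagonal relation matrix: h^T diag(r) t.

    Equivalent to sum_i h_i * r_i * t_i. Higher scores mean the
    triple (h, r, t) is judged more plausible.
    """
    return float(np.sum(h * r * t))

# Toy embeddings (dimension 8 is an arbitrary choice for illustration).
rng = np.random.default_rng(0)
d = 8
h, b, c = rng.normal(size=(3, d))        # entity vectors: a person, a city, a country
r_born, r_in = rng.normal(size=(2, d))   # relation vectors: BornInCity, CityInCountry

# Composition of relations is matrix multiplication; for diagonal
# matrices, diag(r1) @ diag(r2) = diag(r1 * r2), so the composed
# relation is just the elementwise product of the two vectors.
r_composed = r_born * r_in

# Score the person against the country under the composed relation,
# mimicking the rule BornInCity(a,b) ∧ CityInCountry(b,c) ⇒ Nationality(a,c):
# a candidate Nationality embedding close to r_composed would support the rule.
score = bilinear_score(h, r_composed, c)
```

In the paper's rule-mining procedure, candidate rules are ranked by how close the composed relation embedding is to the embedding of the target relation; the sketch above only shows the scoring and composition primitives that ranking would build on.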