Knowledge graph
Computer science
Embedding
Translation
Graph
Variable (mathematics)
Artificial intelligence
Natural language processing
Theoretical computer science
Mathematics
Mathematical analysis
Authors
Yadan Han, Guangquan Lu, Chengqi Zhang, Liang Zhang, Cuifang Zou, Guoqiu Wen
Source
Journal: Tsinghua Science & Technology
Publisher: Tsinghua University Press
Date: 2024-05-02
Volume/Issue: 29 (5): 1554-1565
Identifiers
DOI: 10.26599/tst.2023.9010142
Abstract
Knowledge representation learning (KRL) aims to encode the entities and relationships of a knowledge graph into low-dimensional continuous vectors and is widely used in knowledge graph completion (KGC), also known as link prediction. Translation-based KRL methods perform well in KGC, but the translation principles they adopt are too strict to model complex relations (i.e., N-1, 1-N, and N-N) well. Moreover, these traditional translation principles are designed primarily for static knowledge graphs and overlook the temporal properties of triplet facts. We therefore propose a temporal knowledge graph embedding model based on variable translation (TKGE-VT). The model introduces a new variable translation principle that enables flexible transformation between entity and relationship embeddings. In addition, it considers the temporal properties of both entities and relationships and applies the variable translation principle to temporal knowledge graphs. We conduct link prediction and triplet classification experiments on four benchmark datasets: WN11, WN18, FB13, and FB15K. The experimental results show that our model outperforms baseline models on multiple evaluation metrics.
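For context, the strict translation principle used by TransE-style models requires the relation embedding to act as a fixed translation from the head entity embedding to the tail entity embedding, with a scoring function of the form (standard TransE background, given here only as a sketch; this is not the paper's exact TKGE-VT formulation):

    f(h, r, t) = -\lVert \mathbf{h} + \mathbf{r} - \mathbf{t} \rVert   (L1 or L2 norm)

Under this constraint, every tail of a 1-N relation is pushed toward the same point h + r (and symmetrically for N-1 and N-N relations), which is the limitation the variable translation principle described in the abstract is meant to relax.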