Journal: IEEE/ACM Transactions on Audio, Speech, and Language Processing [Institute of Electrical and Electronics Engineers] Date: 2023-01-01 Volume: 31, pp. 2246-2258 Citations: 11
Identifiers
DOI:10.1109/taslp.2023.3282101
Abstract
Temporal knowledge graph embedding (TKGE) aims to learn the embeddings of entities and relations in a temporal knowledge graph (TKG). Although previous graph neural network (GNN)-based models have achieved promising results, they cannot directly capture the interactions of multi-facts at different timestamps. To address this limitation, we propose a time-aware relational graph attention model (TARGAT), which treats the multi-facts at different timestamps as a unified graph. First, we develop a relational generator that dynamically produces a series of time-aware relational message transformation matrices, jointly modeling the relations and the timestamp information in a unified way. Then, we apply the generated message transformation matrices to project the neighborhood features into different time-aware spaces and aggregate these neighborhood features to explicitly capture the interactions of multi-facts. Finally, a temporal transformer classifier is applied to learn the representation of the query quadruples and predict the missing entities. The experimental results show that our TARGAT model beats the GNN-based models by a large margin and achieves new state-of-the-art results on four popular benchmark datasets.
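To make the described pipeline concrete, below is a minimal PyTorch sketch (not the authors' code) of the first two steps: a generator that produces a time-aware relational message transformation matrix from relation and timestamp embeddings, followed by projection and aggregation of neighbor features. All module names, dimensions, and the MLP generator are illustrative assumptions; the paper's attention-based weighting and the temporal transformer classifier are omitted, with a plain mean used for aggregation instead.

```python
# Hypothetical sketch of time-aware relational message transformation and aggregation.
# Assumes PyTorch; names, dimensions, and the plain-mean aggregation are illustrative,
# not the paper's exact formulation (which uses graph attention).
import torch
import torch.nn as nn


class TimeAwareRelationalGenerator(nn.Module):
    """Generates a message-transformation matrix conditioned on (relation, timestamp)."""

    def __init__(self, rel_dim: int, time_dim: int, ent_dim: int):
        super().__init__()
        self.ent_dim = ent_dim
        # Assumed design: a linear layer maps the concatenated relation and timestamp
        # embeddings to a flattened ent_dim x ent_dim transformation matrix.
        self.generator = nn.Linear(rel_dim + time_dim, ent_dim * ent_dim)

    def forward(self, rel_emb: torch.Tensor, time_emb: torch.Tensor) -> torch.Tensor:
        # rel_emb: (num_edges, rel_dim), time_emb: (num_edges, time_dim)
        w = self.generator(torch.cat([rel_emb, time_emb], dim=-1))
        return w.view(-1, self.ent_dim, self.ent_dim)  # (num_edges, d, d)


class TimeAwareAggregation(nn.Module):
    """Projects neighbor features into time-aware spaces and mean-aggregates them."""

    def __init__(self, rel_dim: int, time_dim: int, ent_dim: int):
        super().__init__()
        self.gen = TimeAwareRelationalGenerator(rel_dim, time_dim, ent_dim)

    def forward(
        self,
        ent_emb: torch.Tensor,   # (num_entities, d)
        edges: torch.Tensor,     # (num_edges, 2) with columns [source, target]
        rel_emb: torch.Tensor,   # (num_edges, rel_dim) per-edge relation embeddings
        time_emb: torch.Tensor,  # (num_edges, time_dim) per-edge timestamp embeddings
    ) -> torch.Tensor:
        src, dst = edges[:, 0], edges[:, 1]
        W = self.gen(rel_emb, time_emb)  # (num_edges, d, d)
        # Project each source-neighbor feature with its edge-specific matrix.
        msg = torch.bmm(W, ent_emb[src].unsqueeze(-1)).squeeze(-1)
        # Mean-aggregate incoming messages per target entity (attention omitted here).
        out = torch.zeros_like(ent_emb)
        cnt = torch.zeros(ent_emb.size(0), 1, device=ent_emb.device)
        out.index_add_(0, dst, msg)
        cnt.index_add_(0, dst, torch.ones(len(dst), 1, device=ent_emb.device))
        return out / cnt.clamp(min=1.0)
```

Because one transformation matrix is generated per (relation, timestamp) pair, facts from different timestamps can be placed in a single graph and still be projected into distinct time-aware spaces before aggregation, which is the interaction the abstract highlights.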