Computer science
Inference
Machine learning
Matching (statistics)
Link (geometry)
Node (physics)
Artificial intelligence
Speedup
Data mining
Artificial neural network
Mathematics
Computer network
Structural engineering
Statistics
Operating system
Engineering
Authors
Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh V. Chawla, Neil Shah, Tong Zhao
Source
Journal: Cornell University - arXiv
Date: 2022-01-01
Citations: 4
Identifier
DOI:10.48550/arxiv.2210.05801
Abstract
Graph Neural Networks (GNNs) have shown exceptional performance in the task of link prediction. Despite their effectiveness, the high latency brought by non-trivial neighborhood data dependency limits GNNs in practical deployments. Conversely, MLPs, while known to be efficient, are much less effective than GNNs due to their lack of relational knowledge. In this work, to combine the advantages of GNNs and MLPs, we start by exploring direct knowledge distillation (KD) methods for link prediction, i.e., predicted logit-based matching and node representation-based matching. Upon observing that these direct KD analogs do not perform well for link prediction, we propose a relational KD framework, Linkless Link Prediction (LLP), to distill knowledge for link prediction with MLPs. Unlike simple KD methods that match independent link logits or node representations, LLP distills relational knowledge that is centered around each (anchor) node to the student MLP. Specifically, we propose rank-based matching and distribution-based matching strategies that complement each other. Extensive experiments demonstrate that LLP boosts the link prediction performance of MLPs by significant margins, and even outperforms the teacher GNNs on 7 out of 8 benchmarks. LLP also achieves a 70.68x speedup in link prediction inference compared to GNNs on the large-scale OGB dataset.
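To make the two matching strategies concrete, here is a minimal PyTorch sketch of what rank-based and distribution-based matching losses could look like for a batch of anchor nodes, each with a set of sampled context nodes. The function names, the margin, the softmax temperature, and the toy scores are illustrative assumptions based only on the abstract, not the authors' implementation.

```python
# Hypothetical sketch of LLP-style relational matching losses, based only
# on the abstract. Shapes: (num_anchors, num_context_nodes) link scores
# produced by the teacher GNN and the student MLP for each anchor's
# sampled context nodes. All names and hyperparameters are assumptions.
import torch
import torch.nn.functional as F

def distribution_matching_loss(teacher_scores, student_scores, tau=1.0):
    """Distribution-based matching: KL divergence between teacher and
    student link-score distributions over each anchor's context nodes."""
    t = F.softmax(teacher_scores / tau, dim=-1)      # teacher probabilities
    s = F.log_softmax(student_scores / tau, dim=-1)  # student log-probabilities
    return F.kl_div(s, t, reduction="batchmean") * tau ** 2

def rank_matching_loss(teacher_scores, student_scores, margin=0.1):
    """Rank-based matching: a margin ranking loss encouraging the student
    to preserve the teacher's ordering of context nodes per anchor."""
    t_diff = teacher_scores.unsqueeze(-1) - teacher_scores.unsqueeze(-2)
    s_diff = student_scores.unsqueeze(-1) - student_scores.unsqueeze(-2)
    target = torch.sign(t_diff)  # +1 / 0 / -1: teacher's pairwise preference
    # Hinge only on pairs where the teacher expresses a clear preference.
    loss = F.relu(margin - target * s_diff) * (target != 0)
    return loss.sum() / target.ne(0).sum().clamp(min=1)

# Toy usage: 2 anchors, 5 sampled context nodes each.
teacher = torch.randn(2, 5)
student = torch.randn(2, 5, requires_grad=True)
total = distribution_matching_loss(teacher, student) + rank_matching_loss(teacher, student)
total.backward()
```

In this reading, the distribution term transfers the teacher's soft relational structure around each anchor, while the rank term preserves the relative ordering of candidate links, which is what ranking-based link prediction metrics ultimately measure.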