Computer science
Overfitting
Expectation propagation
Belief propagation
Message passing
Feature (linguistics)
Graph
Embedding
Inference
Artificial intelligence
Machine learning
Adjacency matrix
Artificial neural network
Pattern recognition (psychology)
Theoretical computer science
Algorithm
Decoding methods
Philosophy
Physics
Gaussian distribution
Quantum mechanics
Programming language
Gaussian process
Linguistics
Authors
Yunsheng Shi,Zhengjie Huang,Wenjin Wang,Hui Zhong,Shikun Feng,Yu Sun
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Citations: 47
Identifier
DOI: 10.48550/arxiv.2009.03509
Abstract
Graph neural network (GNN) and label propagation algorithm (LPA) are both message passing algorithms, which have achieved superior performance in semi-supervised classification. GNN performs feature propagation by a neural network to make predictions, while LPA uses label propagation across the graph adjacency matrix to get results. However, there is still no effective way to directly combine these two kinds of algorithms. To address this issue, we propose a novel Unified Message Passing Model (UniMP) that can incorporate feature and label propagation at both training and inference time. First, UniMP adopts a Graph Transformer network, taking feature embedding and label embedding as input information for propagation. Second, to train the network without overfitting to self-loop input label information, UniMP introduces a masked label prediction strategy, in which some percentage of input label information is masked at random and then predicted. UniMP conceptually unifies feature propagation and label propagation and is empirically powerful. It obtains new state-of-the-art semi-supervised classification results on the Open Graph Benchmark (OGB).
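The abstract describes two mechanisms: feeding label embeddings alongside node features into the propagation network, and masked label prediction, where a random fraction of the input training labels is hidden and used as prediction targets. Below is a minimal sketch of these two ideas, assuming a plain PyTorch setup; the class `UniMPSketch`, the helper `mask_labels`, and the single normalized-adjacency propagation step (a stand-in for the paper's Graph Transformer layers) are hypothetical illustrations, not the authors' implementation.

```python
# Sketch of label-embedding input plus masked label prediction (assumed setup,
# not the official UniMP code; the Graph Transformer is replaced by one
# normalized-adjacency propagation step to keep the example self-contained).
import torch
import torch.nn as nn
import torch.nn.functional as F

class UniMPSketch(nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.feat_proj = nn.Linear(in_dim, hidden_dim)
        # Known labels are embedded and added to node features; the extra
        # index `num_classes` denotes "label unknown / masked".
        self.label_emb = nn.Embedding(num_classes + 1, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, adj_norm, input_labels):
        # x: [N, in_dim] node features; adj_norm: [N, N] normalized adjacency;
        # input_labels: [N] long tensor, masked/unlabeled nodes hold the
        # "unknown" index and contribute no label information.
        h = self.feat_proj(x) + self.label_emb(input_labels)
        h = F.relu(adj_norm @ h)   # one propagation step (stand-in for attention layers)
        return self.out(h)

def mask_labels(labels, train_mask, num_classes, mask_rate=0.3):
    """Hide a random fraction of the training labels from the input;
    the hidden nodes become the prediction targets for this step."""
    input_labels = torch.full_like(labels, num_classes)        # all "unknown"
    train_idx = train_mask.nonzero(as_tuple=True)[0]
    perm = train_idx[torch.randperm(train_idx.numel())]
    n_hidden = int(mask_rate * perm.numel())
    hidden_idx, visible_idx = perm[:n_hidden], perm[n_hidden:]
    input_labels[visible_idx] = labels[visible_idx]             # reveal the rest
    return input_labels, hidden_idx
```

In a training loop under these assumptions, each step would call `mask_labels`, compute logits with the partially revealed labels, and take a cross-entropy loss only on `hidden_idx`; at inference time all observed training labels would be supplied as input, matching the abstract's claim that label propagation is used at both training and inference time.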