Computer science
Homophily
Generalizability theory
Graph
Artificial intelligence
Machine learning
Artificial neural network
Node (physics)
Theoretical computer science
Mathematics
Structural engineering
Statistics
Combinatorics
Engineering
Authors
Jingfan Chen, Guanghui Zhu, Yifan Qi, Chunfeng Yuan, Yihua Huang
Identifier
DOI:10.1145/3511808.3557478
Abstract
Recently emerged heterophilous graph neural networks have significantly reduced reliance on the assumption of graph homophily, i.e., that linked nodes have similar features and labels. However, these methods focus on a supervised setting that relies heavily on label information, which limits their applicability to general downstream graph tasks. In this work, we propose a self-supervised representation learning paradigm on graphs with heterophily (namely HGRL) for improving the generalizability of node representations, where node representations are optimized without any label guidance. Inspired by the designs of existing heterophilous graph neural networks, HGRL learns node representations by preserving the original node features and capturing informative distant neighbors. These two properties are obtained through carefully designed pretext tasks that are optimized based on estimated high-order mutual information. Theoretical analysis interprets the connections between HGRL and existing advanced graph neural network designs. Extensive experiments on different downstream tasks demonstrate the effectiveness of the proposed framework.
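The abstract names two properties the pretext tasks enforce: preserving original node features and agreeing with informative distant (multi-hop, non-adjacent) neighbors. The sketch below is a minimal toy illustration of that idea, not the paper's actual HGRL objective: the function names, the identity-encoder setup, and the simple reconstruction-plus-agreement loss are all illustrative assumptions; the real method estimates high-order mutual information rather than these surrogate terms.

```python
import numpy as np

def khop_neighbors(adj, k):
    # Nodes reachable within k hops, excluding self-loops and direct (1-hop)
    # neighbors -- a crude stand-in for "distant neighbors" on a heterophilous graph.
    n = len(adj)
    reach = np.linalg.matrix_power((adj + np.eye(n)).astype(int), k) > 0
    return reach & ~(adj > 0) & ~np.eye(n, dtype=bool)

def hgrl_style_loss(feats, emb, adj, k=2):
    # Toy two-part objective (illustrative only):
    #   1) feature preservation: embeddings should reconstruct raw features;
    #   2) distant-neighbor agreement: embeddings of k-hop non-adjacent pairs
    #      should have high inner-product similarity.
    recon = np.mean((emb - feats) ** 2)          # feature-preservation term
    pairs = np.argwhere(khop_neighbors(adj, k))  # distant-neighbor pairs (i, j)
    if len(pairs) == 0:
        return recon
    sim = np.array([emb[i] @ emb[j] for i, j in pairs])
    return recon - np.mean(sim)                  # lower loss = more agreement
```

On a 4-node path graph `0-1-2-3`, the 2-hop distant pairs are (0,2), (2,0), (1,3), (3,1); with one-hot features and an identity encoder both loss terms vanish, which makes the toy easy to sanity-check.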