Keywords
Subgraph isomorphism problem
Computer science
Normalization (sociology)
Induced subgraph isomorphism problem
Theoretical computer science
Graph
Data mining
Machine learning
Artificial intelligence
Line graph
Voltage graph
Sociology
Anthropology
Authors
Kaixuan Chen, Shunyu Liu, Tongtian Zhu, Qiao Ji, Yun Su, Yingjie Tian, Tongya Zheng, Haofei Zhang, Zunlei Feng, Jingwen Ye, Mingli Song
Identifier
DOI:10.1145/3580305.3599388
Abstract
Graph Neural Networks (GNNs) have emerged as a powerful category of learning architectures for handling graph-structured data. However, existing GNNs typically ignore crucial structural characteristics in node-induced subgraphs, which limits their expressiveness for various downstream tasks. In this paper, we strive to strengthen the representational capabilities of GNNs by devising a dedicated plug-and-play normalization scheme, termed SUbgraph-sPEcific FactoR Embedded Normalization (SuperNorm), that explicitly considers the intra-connection information within each node-induced subgraph. To this end, we embed the subgraph-specific factor at the beginning and the end of the standard BatchNorm, and incorporate graph instance-specific statistics for improved distinguishability. Meanwhile, we provide theoretical analysis showing that, with the elaborated SuperNorm, an arbitrary GNN is at least as powerful as the 1-WL test in distinguishing non-isomorphic graphs. Furthermore, the proposed SuperNorm scheme is also demonstrated to alleviate the over-smoothing phenomenon. Experimental results on graph, node, and link property prediction across eight popular datasets demonstrate the effectiveness of the proposed method. The code is available at https://github.com/chenchkx/SuperNorm.
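The abstract describes embedding a subgraph-specific factor before and after a BatchNorm-style standardization, using statistics of the graph instance. The following is a minimal illustrative sketch of that idea, not the authors' implementation (see their repository for the real code): the factor here is assumed to be the intra-edge count of each node's 1-hop induced subgraph, and the function names (`subgraph_factor`, `super_norm`) are hypothetical.

```python
import numpy as np

def subgraph_factor(adj):
    # Hypothetical factor: number of edges inside each node's 1-hop
    # node-induced (ego) subgraph, capturing intra-connection information.
    n = adj.shape[0]
    factors = np.zeros(n)
    for v in range(n):
        nodes = np.append(np.flatnonzero(adj[v]), v)  # neighbors plus v itself
        sub = adj[np.ix_(nodes, nodes)]               # induced subgraph
        factors[v] = sub.sum() / 2.0                  # undirected: halve the sum
    return factors

def super_norm(x, adj, eps=1e-5):
    # Sketch of the scheme described in the abstract: embed the
    # subgraph-specific factor at the beginning and end of a
    # BatchNorm-like standardization over the graph instance.
    f = np.log1p(subgraph_factor(adj))
    scale = (f / (f.mean() + eps))[:, None]  # normalized per-node factor
    h = x * scale                            # factor embedded at the beginning
    mu, var = h.mean(axis=0), h.var(axis=0)  # graph instance-specific statistics
    h = (h - mu) / np.sqrt(var + eps)        # standardization step
    return h * scale                         # factor embedded at the end
```

Nodes whose induced neighborhoods differ (e.g. a triangle member versus a pendant node) receive different factors, so the normalized features can help distinguish structures a plain BatchNorm would treat identically.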