Augmented Graph Neural Network with hierarchical global-based residual connections

Authors
Asmaa Rassil, Hiba Chougrad, Hamid Zouaki
Source
Journal: Neural Networks [Elsevier]
Volume/Issue: 150: 149-166. Cited by: 16
Identifier
DOI: 10.1016/j.neunet.2022.03.008
Abstract

Graph Neural Networks (GNNs) are powerful architectures for learning on graphs, and are effective at predicting node, link and graph properties. Standard GNN variants follow a message-passing scheme, iteratively updating node representations with information from higher-order neighborhoods. Deeper GNNs can therefore produce high-level node representations that draw on both local and distant neighborhoods. However, deeper networks are prone to over-smoothing. To build deeper GNN architectures without losing the dependency between lower layers (those closer to the input) and higher layers (those closer to the output), networks can integrate residual connections between intermediate layers. We propose the Augmented Graph Neural Network (AGNN) model with hierarchical global-based residual connections. With these residual connections, the model generates high-level node representations without requiring a deeper architecture. We show that the node representations generated by the AGNN model define an expressive, all-encompassing representation of the entire graph; as a result, the graph predictions generated by the AGNN model considerably surpass state-of-the-art results. Moreover, we carry out extensive experiments to identify the best global pooling strategy and attention weights for defining adequate hierarchical, global-based residual connections for different graph property prediction tasks. Furthermore, we propose a reversible variant of the AGNN model to address the extensive memory consumption that typically arises when training networks on large and dense graph datasets. The proposed Reversible Augmented Graph Neural Network (R-AGNN) stores only the node representations acquired from the output layer, as opposed to saving all representations from intermediate layers, as is conventionally done when optimizing the parameters of other GNNs.
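The abstract does not spell out the exact layer definition, so the following is only a rough illustrative sketch of the general idea: a message-passing layer whose update is augmented with a residual term that mixes the previous node representations with a globally pooled summary of them. Mean aggregation, the `tanh` nonlinearity, mean pooling as the global summary, and the mixing weight `alpha` are all assumptions for illustration, not the paper's actual design.

```python
import numpy as np

def message_passing_layer(H, A, W):
    """One generic GNN layer: mean-aggregate neighbor features, then transform.

    H: (n, d) node representations, A: (n, n) adjacency matrix, W: (d, d) weights.
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1)  # avoid division by zero
    return np.tanh((A @ H) / deg @ W)

def global_residual_forward(H0, A, weights, alpha=0.5):
    """Stacked layers where each layer also receives a global-based residual:
    the mean-pooled summary of the previous layer's node representations,
    broadcast back to every node alongside the usual skip connection.
    (alpha is a hypothetical mixing weight.)
    """
    H = H0
    for W in weights:
        global_summary = H.mean(axis=0, keepdims=True)  # global pooling (mean)
        H = message_passing_layer(H, A, W) + alpha * (H + global_summary)
    return H
```

Because every layer sees a pooled summary of the whole graph, even a shallow stack mixes in graph-wide information, which is the intuition behind avoiding very deep architectures.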
We further refine the definition of the backpropagation algorithm to fit the R-AGNN model. We evaluate the proposed AGNN and R-AGNN models on benchmark molecular, bioinformatics and social-network datasets for graph classification and achieve state-of-the-art results. For instance, the AGNN model realizes improvements of +39% on IMDB-MULTI, reaching 91.7% accuracy, and +16% on COLLAB, reaching 96.8% accuracy, compared to other GNN variants.
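The abstract does not detail how R-AGNN avoids storing intermediate activations, but the standard mechanism for this is a reversible (RevNet-style) coupling: the layer's inputs can be recomputed exactly from its outputs during backpropagation, so only the output-layer representations need to be kept. The sketch below demonstrates that invertibility with a hypothetical residual function `F` (one step of neighbor averaging); it is an illustration of the general technique, not the paper's layer.

```python
import numpy as np

def F(x, A):
    # Hypothetical residual function: one step of mean neighbor aggregation.
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    return np.tanh((A @ x) / deg)

def reversible_forward(x1, x2, A):
    """RevNet-style coupling on two halves of the node features.
    The outputs alone suffice to recover the inputs, so intermediate
    activations need not be stored during training.
    """
    y1 = x1 + F(x2, A)
    y2 = x2 + F(y1, A)
    return y1, y2

def reversible_inverse(y1, y2, A):
    """Recompute the layer's inputs from its outputs (used during backprop)."""
    x2 = y2 - F(y1, A)
    x1 = y1 - F(x2, A)
    return x1, x2
```

During the backward pass, each layer's inputs are reconstructed on the fly from the outputs, trading a little extra computation for memory that no longer grows with network depth.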
