GSSA: Pay attention to graph feature importance for GCN via statistical self-attention

Keywords: Computer science, Feature (linguistics), Graph, Artificial intelligence, Feature learning, Node (physics), Machine learning, Convolutional neural network, Pattern recognition (psychology), Theoretical computer science, Data mining, Linguistics, Structural engineering, Engineering, Philosophy
Authors
Zheng Jin,Yan Wang,Wanjun Xu,Zilu Gan,Ping Li,Jiancheng Lv
Source
Journal: Neurocomputing [Elsevier]
Volume 417: 458-470. Cited by: 11
Identifier
DOI:10.1016/j.neucom.2020.07.098
Abstract

Graph convolutional network (GCN) has been proven to be an effective framework for graph-based semi-supervised learning applications. The core operation block of GCN is the convolutional layer, which enables the network to construct node embeddings by fusing both the attributes of nodes and the relationships between nodes. Different features or feature interactions inherently have varying influences on the convolutional layers. However, there are very limited studies on the impact of feature importance in GCN-related communities. In this work, we attempt to augment convolutional layers in GCNs with statistical attention-based feature importance by modeling the latent interactions of features, which is complementary to standard GCNs and requires only simple statistical calculations rather than heavy training. To this end, we treat the feature input of each convolutional layer as a separate multi-layer heterogeneous graph, and propose the Graph Statistical Self-Attention (GSSA) method to automatically learn the hierarchical structure of feature importance. More specifically, we propose two modules in GSSA: Channel-wise Self-Attention (CSA) to capture the dependencies between feature channels, and Mean-based Self-Attention (MSA) to reweight similarities among features. Targeting each graph convolutional layer, GSSA can be applied in a "plug and play" way to a wide range of GCN variants. To the best of our knowledge, this is the first implementation that optimizes GCNs from the feature importance perspective. Extensive experiments demonstrate that GSSA remarkably improves existing popular baselines on semi-supervised node classification tasks. We further employ multiple qualitative evaluations to gain deep insights into our method.
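The abstract gives no implementation details for CSA or MSA. As a rough illustration only of the general idea the abstract describes, namely reweighting feature channels with attention computed from simple statistics rather than learned parameters, a minimal sketch might look like the following. All function names, formulas, and design choices here are assumptions for illustration, not the authors' actual method:

```python
import numpy as np

def statistical_channel_attention(X):
    """Hypothetical sketch of statistics-based channel reweighting.

    X is a node-feature matrix of shape (n_nodes, n_channels), e.g. the
    input to one graph convolutional layer. Channel importance is derived
    from a correlation-like channel-channel similarity matrix, with no
    trainable parameters. NOT the paper's exact CSA/MSA formulation.
    """
    # Standardize each feature channel (column) across nodes.
    Xn = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)
    # Channel-channel similarity: a (c, c) correlation-like matrix.
    sim = Xn.T @ Xn / X.shape[0]
    # Row-wise softmax turns similarities into attention weights.
    e = np.exp(sim - sim.max(axis=1, keepdims=True))
    attn = e / e.sum(axis=1, keepdims=True)
    # Average attention received by each channel acts as its importance.
    importance = attn.mean(axis=0)          # shape (c,), sums to 1
    # Reweight the input features channel-wise.
    return X * importance

X = np.random.rand(5, 3)                    # 5 nodes, 3 feature channels
X_reweighted = statistical_channel_attention(X)
```

Because the weights come purely from batch statistics, such a step could in principle be inserted before any convolutional layer of a GCN variant without adding trainable parameters, which matches the "plug and play" framing in the abstract.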
