Keywords
Computer science
Artificial neural network
Node (physics)
Graph
Feature (linguistics)
Attention network
Coding (set theory)
Theoretical computer science
Artificial intelligence
Philosophy
Structural engineering
Engineering
Linguistics
Set (abstract data type)
Programming language
Authors
Shenzhi Yang, Li Zhang, Xiaofang Zhang
Identifier
DOI:10.1109/icassp48485.2024.10447142
Abstract
Graph attention neural network (GAT) is a fundamental model within graph neural networks, extensively employed across various applications. It assigns different weights to different nodes for feature aggregation by comparing the similarity of features between nodes. However, as the amount and density of graph data increase, GAT's computational demands rise steeply. In response, we present FastGAT, a simpler and more efficient graph attention neural network with global-aware adaptive computational node attention. FastGAT assigns a trainable attention weight to each node and updates it adaptively. Experiments show that FastGAT reduces training time on eight public datasets by 6.22% to 19.50% while maintaining the same performance. The code is available via https://github.com/szYang2000/FastGAT.
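The contrast the abstract draws can be sketched in a few lines: GAT scores every edge from pairwise feature similarity, whereas FastGAT (per the description above) keeps a single trainable weight per node, so no pairwise scoring is needed. The following NumPy sketch is illustrative only and assumes simplified scoring functions (a plain dot product stands in for GAT's learned LeakyReLU scorer, and `node_weight` stands in for FastGAT's trainable per-node parameter); consult the authors' repository for the actual implementation.

```python
import numpy as np

def gat_style_aggregation(h, adj):
    # GAT-style: one attention score per edge, derived from pairwise feature
    # similarity (dot product here as a stand-in for the learned scorer).
    scores = h @ h.T
    scores = np.where(adj > 0, scores, -np.inf)   # mask non-edges
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha = e / e.sum(axis=1, keepdims=True)      # row-wise softmax
    return alpha @ h                              # weighted feature aggregation

def fastgat_style_aggregation(h, adj, node_weight):
    # FastGAT-style (as described in the abstract): a single trainable weight
    # per node, broadcast to that node's incoming edges -- no pairwise scores.
    w = np.where(adj > 0, node_weight[None, :], -np.inf)
    e = np.exp(w - w.max(axis=1, keepdims=True))
    alpha = e / e.sum(axis=1, keepdims=True)
    return alpha @ h

rng = np.random.default_rng(0)
n, d = 5, 4
h = rng.standard_normal((n, d))            # node features
adj = np.ones((n, n))                      # fully connected toy graph, self-loops included
node_weight = rng.standard_normal(n)       # hypothetical trained per-node weights

out_gat = gat_style_aggregation(h, adj)
out_fast = fastgat_style_aggregation(h, adj, node_weight)
print(out_gat.shape, out_fast.shape)       # (5, 4) (5, 4)
```

The per-node variant replaces the O(|E|) pairwise score computations with a lookup of precomputed weights, which is the source of the training-time savings the abstract reports.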