Computer science
Simplicity (philosophy)
Artificial neural network
Node (physics)
Graph
Attention network
Theoretical computer science
Artificial intelligence
Philosophy
Structural engineering
Epistemology
Engineering
Authors
Shenzhi Yang,Li Zhang,Xiaofang Zhang
Identifier
DOI: 10.1109/icassp48485.2024.10447142
Abstract
Graph attention neural network (GAT) stands as a fundamental model within graph neural networks, extensively employed across various applications. It assigns different weights to different nodes for feature aggregation by comparing the similarity of features between nodes. However, as the amount and density of graph data increase, GAT's computational demands rise steeply. In response, we present FastGAT, a simpler and more efficient graph attention neural network with global-aware adaptive computational node attention. FastGAT assigns a trainable attention weight to each node and updates it adaptively. Experiments show that FastGAT reduces training time on eight public datasets by 6.22% to 19.50% while maintaining the same performance. The code is available via https://github.com/szYang2000/FastGAT.
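The abstract's core idea, replacing GAT's pairwise attention scores with one trainable weight per node, can be illustrated with a minimal NumPy sketch. This is a hypothetical illustration, not the authors' implementation (see their repository for that); the function name, adjacency format, and softmax normalization over each neighborhood are assumptions.

```python
import numpy as np

def node_attention_aggregate(H, adj, a):
    """Hypothetical node-level attention aggregation (not the authors' code).

    H:   (N, F) node feature matrix
    adj: (N, N) 0/1 adjacency matrix, assumed to include self-loops
    a:   (N,)   per-node trainable attention logits

    Each node aggregates its neighbors' features weighted by a softmax
    over the neighbors' scalar logits; no pairwise score is computed,
    which is the source of the claimed speedup.
    """
    out = np.zeros_like(H)
    for i in range(H.shape[0]):
        nbrs = np.nonzero(adj[i])[0]          # indices of i's neighbors
        logits = a[nbrs]
        w = np.exp(logits - logits.max())     # numerically stable softmax
        w /= w.sum()
        out[i] = w @ H[nbrs]                  # weighted feature average
    return out

# Toy usage: two mutually connected nodes with equal logits
# reduce to a plain neighborhood mean.
H = np.array([[1.0, 0.0], [3.0, 0.0]])
adj = np.ones((2, 2))
print(node_attention_aggregate(H, adj, np.zeros(2)))
```

In standard GAT the attention logit depends on each (source, target) feature pair, costing work per edge per feature comparison; a per-node logit makes the score computation O(N) instead, consistent with the training-time reductions reported above.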