Computer science
Graph
Node (physics)
Attention network
Theoretical computer science
Artificial intelligence
Physics
Quantum mechanics
Authors
Keao Lin, Xiaozhu Xie, Wei Weng, Xiaofeng Du
Identifier
DOI:10.1093/comjnl/bxae060
Abstract
Graph Neural Networks (GNNs) are deep learning models specifically designed for analyzing graph-structured data, capturing complex relationships and structures to improve analysis and prediction. A common task in GNNs is node classification, where each node in the graph is assigned a predefined category. The Graph Attention Network (GAT) is a popular variant of GNNs known for its ability to capture complex dependencies by assigning importance weights to nodes during information aggregation. However, the GAT's reliance on local attention mechanisms limits its effectiveness in capturing global information and long-range dependencies. To address this limitation, we propose a new attention mechanism called Global-Local Graph Attention (GLGA). Our mechanism enables the GAT to capture long-range dependencies and global graph structures while maintaining its ability to focus on local interactions. We evaluate our algorithm on three citation datasets (Cora, Citeseer, and Pubmed) using multiple metrics, demonstrating its superiority over other baseline models. The results show that the proposed GLGA mechanism is an effective solution for improving node classification.
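The abstract describes combining neighborhood-restricted (local) attention with attention over the full graph (global). The paper's exact GLGA formulation is not given here, so the following is only an illustrative NumPy sketch of that general idea, assuming a single attention head, additive GAT-style scoring, and a hypothetical mixing coefficient `lam` that blends the two attention distributions:

```python
import numpy as np

def softmax(x, mask=None):
    # Masked softmax along the last axis; masked-out entries get ~0 weight.
    if mask is not None:
        x = np.where(mask, x, -1e9)
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def global_local_attention(H, A, W, a_local, a_global, lam=0.5):
    """Illustrative global-local attention layer (NOT the paper's GLGA).

    H: (n, d) node features; A: (n, n) adjacency matrix;
    W: (d, d_out) projection; a_local/a_global: (2*d_out,) score vectors;
    lam: assumed local/global mixing coefficient.
    """
    Z = H @ W                                    # projected node features
    d_out = Z.shape[1]
    # GAT-style additive scores: score[i, j] = a_src·Z_i + a_dst·Z_j
    s_local = (Z @ a_local[:d_out])[:, None] + (Z @ a_local[d_out:])[None, :]
    s_global = (Z @ a_global[:d_out])[:, None] + (Z @ a_global[d_out:])[None, :]
    neigh = (A + np.eye(A.shape[0])) > 0         # neighbors incl. self-loops
    alpha_local = softmax(s_local, mask=neigh)   # attend only to neighbors
    alpha_global = softmax(s_global)             # attend to every node
    alpha = lam * alpha_local + (1 - lam) * alpha_global
    return alpha @ Z                             # aggregated node embeddings
```

Each row of `alpha` is a convex combination of a local and a global attention distribution, so a node can still emphasize its neighborhood while receiving signal from distant nodes; how GLGA actually fuses the two is defined in the paper itself.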