Computer science
Graph
Attention network
Mechanism (biology)
Block (permutation group theory)
Artificial neural network
Artificial intelligence
Algorithm
Theoretical computer science
Data mining
Geometry
Mathematics
Epistemology
Philosophy
Authors
Zhilong Lu,Weifeng Lv,Zhipu Xie,Bowen Du,Guixi Xiong,Leilei Sun,Haiquan Wang
Source
Journal: ACM Transactions on Intelligent Systems and Technology
[Association for Computing Machinery]
Date: 2022-03-04
Volume/Issue: 13 (2): 1-24
Citations: 14
Abstract
Recent years have witnessed the emerging success of Graph Neural Networks (GNNs) for modeling graphical data. A GNN can model the spatial dependencies of nodes in a graph based on message passing through node aggregation. However, in many application scenarios, these spatial dependencies can change over time, and a basic GNN model cannot capture these changes. In this article, we propose a Graph Sequence neural network with an Attention mechanism (GSeqAtt) for processing graph sequences. More specifically, two attention mechanisms are combined: a horizontal mechanism and a vertical mechanism. GTransformer, which is a horizontal attention mechanism for handling time series, is used to capture the correlations between graphs in the input time sequence. The vertical attention mechanism, a Graph Network (GN) block structure with an attention mechanism (GNAtt), acts within the graph structure in each frame of the time series. Experiments show that our proposed model is able to handle information propagation for graph sequences accurately and efficiently. Moreover, results on real-world data from three road intersections show that our GSeqAtt outperforms state-of-the-art baselines on the traffic speed prediction task.