Authors
Zili Geng, Jie Xu, Rongsen Wu, Changming Zhao, Jin Wang, Yunji Li, Chenlin Zhang
Identifier
DOI:10.1016/j.inffus.2024.102228
Abstract
Traffic flow prediction is a critical component of Intelligent Transportation Systems (ITS). However, the dynamic temporal variation of traffic flow, especially the potential occurrence of unexpected incidents, makes accurate prediction challenging. This paper proposes a Spatial–temporal Gated Attention Transformer (STGAFormer) model based on Graph Neural Networks (GNNs), leveraging the encoder architecture of the Transformer. The gated temporal self-attention module, a novel component, improves the model's ability to make long-term predictions and handle sudden traffic incidents by enhancing the extraction of both local and global temporal features. Additionally, this paper proposes a distance spatial self-attention module to extract spatial features, which applies thresholding to selectively identify crucial features from both nearby and distant regions, thereby promoting the model's ability to assimilate critical spatial information. Moreover, the model incorporates a diverse range of inputs, including traffic flow attributes, periodicity, a proximity adjacency matrix, and an adaptive adjacency matrix. Experiments on four real-world datasets demonstrate that STGAFormer achieves state-of-the-art performance; notably, the MAE on the PeMS08 dataset is improved by 3.82%. This method offers valuable insights and robust support for future transportation planning.
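The abstract names two mechanisms: a gated temporal self-attention that blends attended (global) features with local ones, and a distance-based spatial self-attention that uses a threshold to restrict which node pairs attend to each other. The paper's code is not reproduced here; the following is a minimal NumPy sketch of those two ideas under stated assumptions — the sigmoid-gate blending form, the function names, and the weight shapes are illustrative, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_self_attention(x, w_q, w_k, w_v, w_g, mask=None):
    """Scaled dot-product self-attention with a sigmoid gate.

    x: (n, d) sequence of features; w_*: (d, d) projection weights.
    The gate blends the attended (global) output with the raw
    (local) input — an assumed stand-in for the paper's gating.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # block disallowed pairs
    attended = softmax(scores) @ v
    gate = 1.0 / (1.0 + np.exp(-(x @ w_g)))   # sigmoid gate in (0, 1)
    return gate * attended + (1.0 - gate) * x

def distance_threshold_mask(dist, tau):
    """Boolean mask allowing attention only within distance tau,
    mimicking the distance spatial self-attention's thresholding."""
    return dist <= tau
```

As a sanity check, with identity projections and a mask restricting each node to itself, the attended output equals the input, so the gated blend returns the input unchanged.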