Closeness
Traffic (computer networking)
Computation
Computer science
Data mining
Flow (mathematics)
Space (punctuation)
Proportion (ratio)
Artificial intelligence
Geography
Algorithm
Mathematics
Cartography
Computer security
Mathematical analysis
Geometry
Operating system
Authors
Yan Xiao,Xianghua Gan,Jingjing Tang,Dapeng Zhang,Rui Wang
Source
Journal: IEEE Transactions on Intelligent Transportation Systems
[Institute of Electrical and Electronics Engineers]
Date: 2024-03-12
Volume/Issue: 25 (9): 10802-10816
Citations: 2
Identifier
DOI: 10.1109/tits.2024.3367754
Abstract
Traffic flow forecasting is essential to intelligent city management and public safety, and it remains challenging. In this paper, we attempt to use a pure self-attention method for traffic flow forecasting. However, when the input sequence contains the historical records of a large number of regions, it is difficult for the self-attention mechanism to focus on the records most relevant for forecasting. To address this issue, we design a progressive space-time self-attention mechanism named ProSTformer, which reduces the number of self-attention computations from thousands to tens. Our design is based on two pieces of prior knowledge from the traffic flow forecasting literature: (i) spatiotemporal dependencies can be factorized into spatial and temporal dependencies; (ii) adjacent regions have more influence than distant regions, and the temporal characteristics of closeness, period, and trend are more important than the crossed relations between them. ProSTformer has two characteristics. First, each block highlights a distinct dependency: the model progressively focuses on spatial dependencies from local to global regions, on temporal dependencies from closeness, period, and trend to the crossed relations between them, and on external dependencies such as weather conditions, temperature, and day of week. Second, we use the Tensor Rearranging technique to force the model to compute self-attention only over adjacent regions and within a single temporal characteristic, and then use the Patch Merging technique to greatly reduce the number of self-attention computations over distant regions and crossed temporal relations. We evaluate ProSTformer on two traffic datasets and find that it performs better than sixteen baseline models. The code is available at https://github.com/yanxiao1930/ProSTformer_code/tree/main.
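The authors' actual implementation is in the linked repository; as a rough illustration of the two cost-saving ideas the abstract names (rearranging the input tensor so self-attention runs only within windows of adjacent regions, then merging patches so later, wider-scope attention processes far fewer tokens), here is a minimal PyTorch sketch. The class names WindowSelfAttention and PatchMerging, the 4x4 window, and the 2x2 merge are illustrative assumptions, not the paper's code.

```python
import torch
import torch.nn as nn

class WindowSelfAttention(nn.Module):
    """Self-attention restricted to non-overlapping spatial windows.

    The (B, H, W, C) feature map is rearranged so each w x w window
    becomes its own attention sequence: cost is O((w*w)^2) per window
    instead of O((H*W)^2) over the whole region grid.
    """
    def __init__(self, dim: int, window: int, heads: int = 4):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, H, W, C = x.shape          # H and W must be divisible by window
        w = self.window
        # Rearrange: (B, H, W, C) -> (B * num_windows, w*w, C)
        x = x.reshape(B, H // w, w, W // w, w, C)
        x = x.permute(0, 1, 3, 2, 4, 5).reshape(-1, w * w, C)
        out, _ = self.attn(x, x, x)   # attention only within each window
        # Undo the rearrangement back to (B, H, W, C)
        out = out.reshape(B, H // w, W // w, w, w, C)
        return out.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C)

class PatchMerging(nn.Module):
    """Merge each 2x2 group of neighbouring regions into one token,
    quartering the sequence length that later, wider-scope attention
    layers must process."""
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(4 * dim, 2 * dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, H, W, C = x.shape          # H and W must be even
        x = x.reshape(B, H // 2, 2, W // 2, 2, C)
        x = x.permute(0, 1, 3, 2, 4, 5).reshape(B, H // 2, W // 2, 4 * C)
        return self.proj(x)           # (B, H/2, W/2, 2C)

if __name__ == "__main__":
    x = torch.randn(2, 16, 16, 32)            # toy 16x16 grid of regions
    x = WindowSelfAttention(32, window=4)(x)  # local attention in 4x4 windows
    x = PatchMerging(32)(x)                   # coarser 8x8 grid for later stages
    print(x.shape)                            # torch.Size([2, 8, 8, 64])
```

On a 16x16 grid, full self-attention scores 256 x 256 token pairs, while 4x4 windows score only 16 windows of 16 x 16 pairs; patch merging then quarters the token count before any wider-scope attention, mirroring the local-to-global progression the abstract describes.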