ProSTformer: Progressive Space-Time Self-Attention Model for Short-Term Traffic Flow Forecasting

Topics: closeness, traffic (computer networks), computation, computer science, data mining, flow (mathematics), space (punctuation), scale (ratio), artificial intelligence, geography, algorithms, mathematics, cartography, computer security, mathematical analysis, geometry, operating systems
Authors
Yan Xiao, Xianghua Gan, Jingjing Tang, Dapeng Zhang, Rui Wang
Source
Journal: IEEE Transactions on Intelligent Transportation Systems (Institute of Electrical and Electronics Engineers)
Volume/Issue: 25(9): 10802-10816    Cited by: 2
Identifier
DOI: 10.1109/tits.2024.3367754
Abstract

Traffic flow forecasting is essential and challenging for intelligent city management and public safety. In this paper, we attempt to use a pure self-attention method for traffic flow forecasting. However, when the input sequences contain the historical records of large-scale regions, it is difficult for the self-attention mechanism to focus on the records most relevant to forecasting. To address this issue, we design a progressive space-time self-attention mechanism, named ProSTformer, which reduces the number of self-attention computations from thousands to tens. Our design is based on two pieces of prior knowledge from the traffic flow forecasting literature: (i) spatiotemporal dependencies can be factorized into spatial and temporal dependencies; (ii) adjacent regions have more influence than distant regions, and the temporal characteristics of closeness, period, and trend are more important than the crossed relations between them. ProSTformer has two characteristics. First, each block in ProSTformer highlights a unique dependency: the model progressively focuses on spatial dependencies from local to global regions, on temporal dependencies from closeness, period, and trend to the crossed relations between them, and on external dependencies such as weather conditions, temperature, and day of week. Second, we use the Tensor Rearranging technique to force the model to compute self-attention only among adjacent regions and within each temporal characteristic, and then use the Patch Merging technique to greatly reduce the number of self-attention computations over distant regions and crossed temporal relations. We evaluate ProSTformer on two traffic datasets and find that it outperforms sixteen baseline models. The code is available at https://github.com/yanxiao1930/ProSTformer_code/tree/main.
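The abstract does not spell out the architecture, but the two attention-reducing steps it names can be illustrated with a short, hypothetical sketch. The snippet below is not the authors' implementation: the grid size, window size, 2x2 merging factor, and the LocalThenGlobalAttention module name are illustrative assumptions. It only shows how rearranging a grid of region embeddings into local windows restricts self-attention to adjacent regions, and how merging patches shrinks the token sequence before a later global attention step.

```python
# Hypothetical sketch (not the authors' code): local windowed attention via tensor
# rearranging, followed by patch merging and global attention over fewer tokens.
import torch
import torch.nn as nn


def window_rearrange(x, win):
    """(B, H, W, C) -> (B * num_windows, win*win, C): group adjacent regions."""
    B, H, W, C = x.shape
    x = x.view(B, H // win, win, W // win, win, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, win * win, C)


class LocalThenGlobalAttention(nn.Module):
    def __init__(self, dim, win=4, heads=4):
        super().__init__()
        self.win = win
        self.local_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Patch merging: concatenate each 2x2 neighborhood and project back to dim,
        # shrinking the spatial grid by 4x before global attention.
        self.merge = nn.Linear(4 * dim, dim)
        self.global_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):  # x: (B, H, W, C) region embeddings for one time step
        B, H, W, C = x.shape
        # 1) Self-attention only within windows of adjacent regions.
        w = window_rearrange(x, self.win)
        w, _ = self.local_attn(w, w, w)
        x = (w.view(B, H // self.win, W // self.win, self.win, self.win, C)
              .permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C))
        # 2) Patch merging: fuse each 2x2 block of regions into one token.
        m = x.view(B, H // 2, 2, W // 2, 2, C).permute(0, 1, 3, 2, 4, 5)
        m = self.merge(m.reshape(B, (H // 2) * (W // 2), 4 * C))
        # 3) Global attention over the much shorter merged sequence.
        out, _ = self.global_attn(m, m, m)
        return out  # (B, H*W/4, C)


if __name__ == "__main__":
    x = torch.randn(2, 16, 16, 32)   # 16x16 grid of regions, 32-dim features
    y = LocalThenGlobalAttention(32)(x)
    print(y.shape)                   # torch.Size([2, 64, 32])
```

On a 16x16 grid, plain self-attention would compare all 256 regions with one another in a single pass; in this sketch each window attends over only 16 tokens and the global step over 64 merged tokens, which is the kind of computation reduction the abstract describes for the spatial dimension.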