Topics
Computer Science; Location; Ground; Deep Learning; Feature Learning; Traffic Congestion; Transformer; Artificial Intelligence; Data Mining; Machine Learning; Real-Time Computing; Engineering; Operating Systems; Cache; Electrical Engineering; Philosophy; Voltage; Linguistics; Transportation Engineering
Authors
Fang Yu, Fang Zhao, Yanjun Qin, Haiyong Luo, Chenxing Wang
Source
Journal: IEEE Transactions on Intelligent Transportation Systems [Institute of Electrical and Electronics Engineers]
Date: 2022-12-01
Volume/Issue: 23 (12): 23433-23446
Cited by: 14
Identifier
DOI: 10.1109/tits.2022.3197640
Abstract
Forecasting traffic flow and speed in urban areas is important for many applications, ranging from intelligent navigation in map applications to congestion relief in city management systems. Mining the complex spatio-temporal correlations in traffic data to predict traffic accurately is therefore essential. However, previous studies that combined graph convolution networks or self-attention mechanisms with deep time-series models (e.g., recurrent neural networks) capture only spatial dependencies within each time slot and temporal dependencies within each sensor, ignoring the spatial and temporal correlations that span different time slots and sensors. Moreover, the state-of-the-art Transformer architecture used in previous methods is insensitive to local spatio-temporal contexts, making it ill-suited to traffic forecasting. To address these two issues, we propose a novel deep learning model for traffic forecasting, named the Locality-aware spatio-temporal joint Transformer (Lastjormer), which incorporates an elaborately designed spatio-temporal joint attention into the Transformer architecture to capture all dynamic dependencies in the traffic data. Specifically, our model applies dot-product self-attention over sensors across many time slots to extract correlations among them, and introduces linear and convolutional self-attention mechanisms to reduce computation and to incorporate local spatio-temporal information. Experiments on three real-world traffic datasets, England, METR-LA, and PEMS-BAY, demonstrate that Lastjormer achieves state-of-the-art performance on a variety of challenging traffic forecasting benchmarks.
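The core idea of spatio-temporal joint attention — letting every (time slot, sensor) pair attend to every other pair, rather than restricting attention to one time slot or one sensor — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the identity projections, tensor shapes, and function names here are illustrative stand-ins for the paper's learned components.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def joint_attention(x):
    """Sketch of spatio-temporal joint attention.

    x: array of shape (T, N, d) — features for T time slots and N sensors.
    Flattening time and space into one axis of length T*N lets every
    (time slot, sensor) token attend to every other token, capturing
    correlations across different time slots AND different sensors,
    unlike per-slot spatial or per-sensor temporal attention.
    """
    T, N, d = x.shape
    z = x.reshape(T * N, d)            # joint spatio-temporal token sequence
    # Identity weights stand in for learned projections W_q, W_k, W_v.
    q, k, v = z, z, z
    scores = q @ k.T / np.sqrt(d)      # (T*N, T*N) pairwise affinities
    out = softmax(scores) @ v          # attention-weighted mixture of values
    return out.reshape(T, N, d)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3, 8))     # 4 time slots, 3 sensors, 8 features
y = joint_attention(x)
print(y.shape)
```

Note the quadratic cost in T*N of the full score matrix — this is precisely why the abstract mentions linear and convolutional self-attention variants, which reduce that cost while restoring sensitivity to local spatio-temporal context.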