
Graphformer: Adaptive graph correlation transformer for multivariate long sequence time series forecasting

Keywords: Computer science; Granularity; Graph; Artificial intelligence; Upsampling; Generality; Encoder; Data mining; Algorithm; Pattern recognition (psychology); Machine learning; Theoretical computer science; Psychology; Image (mathematics); Psychotherapist; Operating system
Authors
Yijie Wang, Hao Long, Linjiang Zheng, Jiaxing Shang
Source
Journal: Knowledge-Based Systems [Elsevier BV]
Volume 285, Article 111321. Cited by: 40
Identifier
DOI: 10.1016/j.knosys.2023.111321
Abstract

Accurate long sequence time series forecasting (LSTF) remains a key challenge due to its complex time-dependent nature. Multivariate time series forecasting methods inherently assume that variables are interrelated and that the future state of each variable depends not only on its own history but also on the other variables. However, most existing methods, such as the Transformer, cannot effectively exploit the potential spatial correlations between variables. To cope with these problems, we propose a Transformer-based LSTF model, called Graphformer, which can efficiently learn complex temporal patterns and dependencies between multiple variables. First, in the encoder's self-attention downsampling layer, Graphformer replaces the standard convolutional layer with a dilated convolutional layer to efficiently capture long-term dependencies in the time series at different granularity levels. Meanwhile, Graphformer replaces the self-attention mechanism with a graph self-attention mechanism that automatically infers an implicit sparse graph structure from the data, showing better generality for time series without an explicit graph structure and learning implicit spatial dependencies between sequences. In addition, Graphformer uses a temporal inertia module to enhance the sensitivity of future time steps to recent inputs, and a multi-scale feature fusion operation that extracts temporal correlations at different granularity levels by slicing and fusing feature maps, improving both accuracy and efficiency. Our proposed Graphformer significantly improves long sequence time series forecasting accuracy compared with state-of-the-art Transformer-based models.
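To make the architectural description above concrete, the following is a minimal PyTorch sketch, under our own assumptions, of two of the components the abstract names: a dilated-convolution downsampling step between encoder layers and a graph self-attention layer that infers a sparse inter-variable graph from learned node embeddings. The class names (DilatedDownsampling, GraphSelfAttention), the tensor shapes, and the top-k sparsification follow the adaptive-graph idea used in models such as Graph WaveNet rather than the authors' exact formulation, so this should be read as an illustration of the technique, not the paper's implementation.

# Minimal sketch (not the authors' code) of two ideas from the abstract:
# (1) dilated-convolution downsampling between encoder layers, and
# (2) graph self-attention biased by a learned sparse inter-variable graph.
# Shapes, hyper-parameters, and class names are assumptions for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class DilatedDownsampling(nn.Module):
    """Halve the temporal length with a dilated convolution followed by pooling.

    Stands in for the standard stride-2 convolution of Informer-style encoders,
    enlarging the receptive field at no extra parameter cost (assumption).
    """

    def __init__(self, d_model: int, dilation: int = 2):
        super().__init__()
        self.conv = nn.Conv1d(d_model, d_model, kernel_size=3,
                              padding=dilation, dilation=dilation)
        self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> (batch, seq_len // 2, d_model)
        x = self.conv(x.transpose(1, 2))
        x = F.elu(x)
        x = self.pool(x)
        return x.transpose(1, 2)


class GraphSelfAttention(nn.Module):
    """Self-attention across variables, biased by a learned sparse graph.

    An adjacency matrix is inferred from two learned node embeddings
    (adaptive-graph style); only the top-k neighbours per variable are kept,
    and the result is added to the attention logits before the softmax.
    """

    def __init__(self, n_vars: int, d_model: int, d_embed: int = 16, k: int = 4):
        super().__init__()
        self.src_emb = nn.Parameter(torch.randn(n_vars, d_embed))
        self.dst_emb = nn.Parameter(torch.randn(n_vars, d_embed))
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.k = k

    def learned_adjacency(self) -> torch.Tensor:
        # Dense pairwise scores between variables, sparsified by keeping top-k edges.
        scores = torch.relu(self.src_emb @ self.dst_emb.t())          # (N, N)
        topk = torch.topk(scores, self.k, dim=-1)
        mask = torch.zeros_like(scores).scatter_(-1, topk.indices, 1.0)
        return torch.softmax(scores.masked_fill(mask == 0, float('-inf')), dim=-1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_vars, d_model) -- one feature vector per variable,
        # e.g. obtained by pooling each series over time.
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        logits = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)        # (B, N, N)
        logits = logits + torch.log(self.learned_adjacency() + 1e-9)  # graph bias
        attn = torch.softmax(logits, dim=-1)
        return attn @ v


if __name__ == "__main__":
    batch, seq_len, n_vars, d_model = 8, 96, 7, 64
    down = DilatedDownsampling(d_model)
    print(down(torch.randn(batch, seq_len, d_model)).shape)   # (8, 48, 64)

    gsa = GraphSelfAttention(n_vars, d_model)
    print(gsa(torch.randn(batch, n_vars, d_model)).shape)     # (8, 7, 64)

In this sketch the learned adjacency acts as an additive bias on the attention logits, so variable pairs with no inferred edge contribute almost nothing to each other's representation; that mirrors the abstract's claim that the sparse graph structure is inferred from the data rather than supplied explicitly.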