Hidformer: Hierarchical dual-tower transformer using multi-scale mergence for long-term time series forecasting

Keywords: Computer science · Transformer · Merge (version control) · Artificial intelligence · Machine learning · Voltage · Information retrieval · Quantum mechanics · Physics
Authors
Zhaoran Liu, Yizhi Cao, Hu Xu, Yuxin Huang, Qunshan He, Xinjie Chen, Xiaoyu Tang, Xinggao Liu
Source
Journal: Expert Systems With Applications [Elsevier]
Volume/Issue: 239: 122412-122412  Citations: 27
Identifier
DOI: 10.1016/j.eswa.2023.122412
Abstract

Long-term time series forecasting has attracted considerable attention because of its great practical value. It is also an extremely challenging task, since it requires using limited observations to accurately predict values far into the future. Recent works have demonstrated that the Transformer has strong potential for this task. However, the permutation-invariant property of the Transformer, together with other prominent shortcomings of current Transformer-based models such as missing multi-scale local features and information from the frequency domain, significantly limits their performance. To improve the accuracy of long-term time series forecasting, we propose a Transformer-based model called Hidformer. The model learns temporal dynamics from the time domain and discovers particular patterns from the frequency domain. We also design a segment-and-merge architecture that provides semantic meaning for the inputs and helps the model capture multi-scale local features. In addition, we replace the Transformer's multi-head attention with highly efficient recurrence and linear attention, which gives our model an advantage over other Transformer-based models in terms of computational efficiency. Extensive experiments are conducted on seven real-world benchmarks to verify the effectiveness of Hidformer. The experimental results show that Hidformer achieves 72 top-1 and 69 top-2 scores out of 88 configurations. It dramatically improves prediction accuracy and outperforms the previous state of the art, proving the superiority of the proposed method.
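The abstract only names the two ingredients that distinguish Hidformer (segment-and-merge tokenization and linear attention) without specifying them. As a rough illustration of what those components typically look like, the PyTorch sketch below shows (a) a segmenting/merging step that coarsens a series into fewer, multi-step tokens, and (b) a kernel-feature linear attention layer whose cost grows linearly with sequence length instead of quadratically as in softmax attention. The names (LinearAttention, segment), the mean-pooling merge, and the ReLU feature map are illustrative assumptions and not the authors' implementation.

```python
import torch
import torch.nn as nn


class LinearAttention(nn.Module):
    """Kernel-feature linear attention: O(L) in sequence length, versus the
    O(L^2) softmax attention of the vanilla Transformer. Single-head,
    non-causal sketch; Hidformer's exact formulation may differ."""

    def __init__(self, dim: int):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, L, dim)
        q = torch.relu(self.to_q(x)) + 1e-6  # non-negative feature map (assumed)
        k = torch.relu(self.to_k(x)) + 1e-6
        v = self.to_v(x)
        # Summarize keys/values once: (batch, dim, dim), summing over length.
        kv = torch.einsum('bld,ble->bde', k, v)
        # Per-position normalizer, analogous to the softmax denominator.
        z = 1.0 / (torch.einsum('bld,bd->bl', q, k.sum(dim=1)) + 1e-6)
        return torch.einsum('bld,bde,bl->ble', q, kv, z)


def segment(x: torch.Tensor, seg_len: int) -> torch.Tensor:
    """Split a series (batch, L, dim) into non-overlapping segments of length
    seg_len and merge each segment into one token by averaging, giving a
    coarser-scale view of the input (mean pooling is an assumption)."""
    b, L, d = x.shape
    x = x[:, : (L // seg_len) * seg_len]          # drop the ragged tail
    return x.reshape(b, -1, seg_len, d).mean(dim=2)


# Usage: coarsen a 96-step series into 24 segment tokens, then attend over them.
x = torch.randn(8, 96, 64)                # 8 series, 96 time steps, 64 channels
coarse = segment(x, seg_len=4)            # (8, 24, 64)
out = LinearAttention(64)(coarse)         # (8, 24, 64)
```

Applying such blocks repeatedly with increasing segment lengths would yield the kind of multi-scale token hierarchy the abstract describes; a parallel tower operating on a frequency-domain representation of the input would supply the second branch of the dual-tower design.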