Hidformer: Hierarchical dual-tower transformer using multi-scale mergence for long-term time series forecasting

Authors
Zhaoran Liu, Yizhi Cao, Hu Xu, Yuxin Huang, Qunshan He, Xinjie Chen, Xiaoyu Tang, Xinggao Liu
Source
Journal: Expert Systems With Applications [Elsevier BV]
Volume: 239, Article 122412. Cited by: 17
Identifier
DOI: 10.1016/j.eswa.2023.122412
Abstract

Long-term time series forecasting has attracted considerable attention because of its great practical value. It is also an extremely challenging task, since it requires using limited observations to accurately predict values far into the future. Recent works have demonstrated that the Transformer has strong potential for this task. However, the permutation-invariant property of the Transformer, together with other prominent shortcomings of current Transformer-based models, such as missing multi-scale local features and information from the frequency domain, significantly limits their performance. To improve the accuracy of long-term time series forecasting, we propose a Transformer-based model called Hidformer. This model can both learn temporal dynamics from the time domain and discover particular patterns in the frequency domain. We also design a segment-and-merge architecture that provides semantic meaning for the inputs and helps the model capture multi-scale local features. Besides, we replace the Transformer's multi-head attention with highly efficient recurrence and linear attention, which gives our model an advantage over other Transformer-based models in terms of computational efficiency. Extensive experiments are conducted on seven real-world benchmarks to verify the effectiveness of Hidformer. The experimental results show that Hidformer achieves 72 top-1 and 69 top-2 scores out of 88 configurations. It dramatically improves prediction accuracy and outperforms the previous state of the art, proving the superiority of our proposed method.
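The efficiency claim in the abstract rests on replacing softmax attention with linear attention, whose cost grows linearly rather than quadratically with sequence length. The sketch below is not Hidformer's implementation (the paper's exact kernel and recurrence are not reproduced here); it is a minimal NumPy illustration of the general linear-attention trick from the literature, using the common `elu(x) + 1` feature map: by associativity, `(φ(Q)φ(K)ᵀ)V` can be computed as `φ(Q)(φ(K)ᵀV)`, avoiding the n×n attention matrix.

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """O(n * d^2) attention via a positive kernel feature map.

    Q, K: (n, d) query/key matrices; V: (n, d_v) values.
    Equivalent to normalised phi(Q) @ phi(K).T @ V, but the
    (n, n) attention matrix is never materialised.
    """
    # elu(x) + 1 keeps features strictly positive so the
    # normaliser below never vanishes.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V              # (d, d_v): summarise keys/values once
    Z = Qp @ Kp.sum(axis=0)    # (n,): per-query normaliser
    return (Qp @ KV) / (Z[:, None] + eps)
```

For short sequences the result matches the quadratic formulation `(φ(Q)φ(K)ᵀ)V` row-normalised by its attention weights, but memory and compute scale with `n` instead of `n²`, which is what makes this family of attention mechanisms attractive for long forecasting horizons.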