
Hidformer: Hierarchical dual-tower transformer using multi-scale mergence for long-term time series forecasting

Topics: Computer science · Transformer · Merge (version control) · Artificial intelligence · Machine learning · Voltage · Information retrieval · Quantum mechanics · Physics
Authors
Zhaoran Liu, Yizhi Cao, Hu Xu, Yuxin Huang, Qunshan He, Xinjie Chen, Xiaoyu Tang, Xinggao Liu
Source
Journal: Expert Systems With Applications [Elsevier]
Volume 239, Article 122412 · Cited by: 27
Identifier
DOI: 10.1016/j.eswa.2023.122412
Abstract

Long-term time series forecasting has attracted considerable attention because of its great practical value. It is also an extremely challenging task, since it requires using limited observations to accurately predict values far into the future. Recent works have demonstrated that the Transformer has strong potential for this task. However, the permutation-invariant property of the Transformer, together with other prominent shortcomings of current Transformer-based models, such as missing multi-scale local features and information from the frequency domain, significantly limits their performance. To improve the accuracy of long-term time series forecasting, we propose a Transformer-based model called Hidformer. The model not only learns temporal dynamics from the time domain but also discovers particular patterns in the frequency domain. We also design a segment-and-merge architecture that provides semantic meaning for the inputs and helps the model capture multi-scale local features. In addition, we replace the Transformer's multi-head attention with highly efficient recurrence and linear attention, which gives our model an advantage over other Transformer-based models in terms of computational efficiency. Extensive experiments are conducted on seven real-world benchmarks to verify the effectiveness of Hidformer. The experimental results show that Hidformer achieves 72 top-1 and 69 top-2 scores out of 88 configurations. It dramatically improves prediction accuracy and outperforms the previous state-of-the-art, proving the superiority of our proposed method.
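The abstract notes that Hidformer replaces multi-head attention with recurrence and linear attention for efficiency. As background, the sketch below shows one common form of linear attention (a kernel feature map that lets the key-value product be computed once, reducing the cost from quadratic to linear in sequence length). The feature map and shapes here are illustrative assumptions, not the exact Hidformer formulation.

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """O(n) linear-attention sketch: softmax(Q K^T) V is approximated by
    phi(Q) (phi(K)^T V), with phi(x) = elu(x) + 1 as the feature map.
    Illustrative only; not the paper's exact mechanism."""
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1, always positive
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                 # (d, d_v): aggregated once over the sequence
    Z = Qp @ Kp.sum(axis=0)       # (n,): per-query normalization terms
    return (Qp @ KV) / (Z[:, None] + eps)

# Toy usage: sequence length 8, head dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((8, 4))
K = rng.standard_normal((8, 4))
V = rng.standard_normal((8, 4))
out = linear_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Because `Kp.T @ V` is independent of the query index, the whole sequence is processed in O(n·d·d_v) rather than the O(n²·d) of standard attention, which is the efficiency argument the abstract makes.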