Expanding the prediction capacity in long sequence time-series forecasting

Authors
Haoyi Zhou,Jianxin Li,Shanghang Zhang,Shuai Zhang,Mengyi Yan,Hui Xiong
Source
Journal: Artificial Intelligence [Elsevier BV]
Volume 318, Article 103886. Cited by: 30
Identifier
DOI: 10.1016/j.artint.2023.103886
Abstract

Many real-world applications show a growing demand for the prediction of long sequence time-series, such as electricity consumption planning. Long sequence time-series forecasting (LSTF) requires a higher prediction capacity of the model, namely the ability to efficiently capture precise long-range dependency coupling between output and input. Recent studies have shown the potential of the Transformer to accommodate this capacity requirement. However, three challenges have prevented the Transformer from expanding the prediction capacity in LSTF: quadratic time complexity, high memory usage, and slow inference speed under the encoder-decoder architecture. To address these issues, we design an efficient Transformer-based model for LSTF, named Informer, with three distinctive characteristics: (i) a ProbSparse self-attention mechanism, which achieves O(L log L) time complexity and memory usage while retaining comparable performance on sequence dependency alignment; (ii) self-attention distilling, which promotes dominating attention through convolutional operators and halves the layer width to reduce the expense of building a deeper network on extremely long input sequences; and (iii) a generative-style decoder, which, while conceptually simple, predicts long time-series sequences in a single forward operation rather than step by step, drastically improving the inference speed of long-sequence predictions. Extensive experiments on ten large-scale datasets demonstrate that Informer significantly outperforms existing methods and provides a new solution to the LSTF problem.
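The ProbSparse mechanism can be illustrated compactly. The sketch below is a minimal NumPy rendering of the idea, not the authors' implementation: it scores each query by how far its attention distribution is from uniform, lets only the top u = c·ln L "active" queries attend over all keys, and fills the remaining "lazy" queries with the mean of V. The function and parameter names (probsparse_attention, c) are hypothetical, and for readability the sparsity score is computed from the full L×L score matrix, whereas the paper estimates it from a sampled subset of keys, which is what yields the stated O(L log L) cost.

```python
# Hedged NumPy sketch of ProbSparse self-attention (single head).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def probsparse_attention(Q, K, V, c=5.0):
    """Q, K, V: (L, d) arrays. Only the top-u 'active' queries attend over
    all keys; the remaining 'lazy' queries take the mean of V."""
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                # (L, L) dot-product scores

    # Sparsity measurement M(q_i, K) = max_j score_ij - mean_j score_ij:
    # a query whose attention row is near-uniform scores close to 0.
    # (The paper approximates M from sampled keys; here it is exact.)
    M = scores.max(axis=1) - scores.mean(axis=1)

    # Keep only u = c * ln(L) queries -> O(L log L) attention computations.
    u = min(L, max(1, int(c * np.log(L))))
    top = np.argsort(M)[-u:]                     # indices of active queries

    out = np.tile(V.mean(axis=0), (L, 1))        # lazy queries: mean of V
    out[top] = softmax(scores[top], axis=1) @ V  # active queries: full attention
    return out

# Toy usage: sequence length 96, head dimension 8.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(96, 8)) for _ in range(3))
print(probsparse_attention(Q, K, V).shape)       # (96, 8)
```

The design choice behind the lazy-query fallback: a near-uniform attention row contributes little beyond the average of V anyway, so replacing it with mean(V) is a cheap, low-distortion default.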
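The distilling step between encoder layers is similarly small. Below is a hedged PyTorch sketch assuming the "convolutional operators" of point (ii) are a 1-D convolution followed by an activation and stride-2 max-pooling; the class name DistillingLayer, the kernel sizes, and the choice of ELU are illustrative assumptions, not the paper's code. Each pass halves the sequence length, which is what keeps deeper encoder stacks affordable on very long inputs.

```python
# Hedged PyTorch sketch of a self-attention distilling layer.
import torch
import torch.nn as nn

class DistillingLayer(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        self.conv = nn.Conv1d(d_model, d_model, kernel_size=3, padding=1)
        self.act = nn.ELU()
        self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> (batch, seq_len // 2, d_model)
        y = self.pool(self.act(self.conv(x.transpose(1, 2))))
        return y.transpose(1, 2)

x = torch.randn(2, 96, 64)
print(DistillingLayer(64)(x).shape)   # torch.Size([2, 48, 64])
```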