TS-Fastformer: Fast Transformer for Time-series Forecasting

Keywords: Computer science, Bottleneck, Transformer, Inference, Encoder, Time series, Artificial intelligence, Deep learning, Series (stratigraphy), Machine learning, Voltage, Physics, Quantum mechanics, Embedded systems, Operating systems, Paleontology, Biology
Authors
Sangwon Lee, Junho Hong, Ling Liu, Wonik Choi
Source
Journal: ACM Transactions on Intelligent Systems and Technology [Association for Computing Machinery]
Volume (Issue): 15 (2): 1-20 · Cited by: 7
Identifier
DOI: 10.1145/3630637
Abstract

Many real-world applications require precise and fast time-series forecasting. Recent time-series forecasting models are shifting from LSTM-based architectures to Transformer-based ones. However, Transformer-based models have a limited ability to represent sequential relationships in time-series data, and they suffer from slow training and inference due to the bottleneck incurred by a deep encoder and step-by-step decoder inference. To address these problems, we propose TS-Fastformer, a Transformer model optimized for time-series forecasting. TS-Fastformer introduces three new optimizations. First, we propose a Sub Window Tokenizer that compresses the input in a simple manner: it reduces the length of input sequences to mitigate the complexity of self-attention and enables both single- and multi-sequence learning. Second, we propose a Time-series Pre-trained Encoder that extracts effective representations through pre-training, allowing TS-Fastformer to capture both seasonal and trend representations while mitigating the bottlenecks of conventional Transformer models. Third, we propose a Past Attention Decoder that forecasts the target by incorporating past long- and short-term dependency patterns. The Past Attention Decoder further improves performance by removing a trend distribution that changes over a long period. We evaluate the efficiency of our model with extensive experiments on seven real-world datasets and compare it to six representative time-series forecasting approaches. The results show that TS-Fastformer reduces MSE by 10.1% compared to the state-of-the-art model and trains 21.6% faster than the fastest existing Transformer.
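The abstract does not specify how the Sub Window Tokenizer is implemented, but its stated role (shortening the input sequence before self-attention) can be illustrated with a minimal sketch. The `SubWindowTokenizer` module, the `window_len` parameter, the linear projection, and the per-variable token layout below are assumptions made for illustration only, not the authors' actual design.

```python
import torch
import torch.nn as nn

class SubWindowTokenizer(nn.Module):
    """Illustrative sketch (assumption, not the paper's spec): split a
    length-L series into non-overlapping sub-windows and project each
    window to a single token embedding, so the encoder's self-attention
    runs over L / window_len tokens per variable instead of L time steps."""

    def __init__(self, window_len: int, d_model: int):
        super().__init__()
        self.window_len = window_len
        self.proj = nn.Linear(window_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_vars); seq_len assumed divisible by window_len
        b, seq_len, n_vars = x.shape
        n_tokens = seq_len // self.window_len
        x = x.permute(0, 2, 1)                              # (batch, n_vars, seq_len)
        x = x.reshape(b, n_vars, n_tokens, self.window_len) # group into sub-windows
        tokens = self.proj(x)                               # (batch, n_vars, n_tokens, d_model)
        return tokens.reshape(b, n_vars * n_tokens, -1)     # flatten variable/token axes


if __name__ == "__main__":
    series = torch.randn(8, 96, 7)                  # e.g. 96 time steps of 7 variables
    tokenizer = SubWindowTokenizer(window_len=8, d_model=64)
    print(tokenizer(series).shape)                  # torch.Size([8, 84, 64])
```

Under these assumptions, a 96-step input with a window length of 8 yields 12 tokens per variable, so self-attention operates over far fewer positions than raw time steps, which is the kind of sequence-length reduction the abstract attributes to the Sub Window Tokenizer.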