TCCT: Tightly-coupled convolutional transformer on time series forecasting

Authors
Li Shen,Yangzhu Wang
Source
Journal: Neurocomputing [Elsevier]
Volume: 480, Pages: 131-145 · Cited by: 68
Identifier
DOI:10.1016/j.neucom.2022.01.039
Abstract

Time series forecasting is essential for a wide range of real-world applications. Recent studies have shown the superiority of the Transformer in dealing with such problems, especially long sequence time series input (LSTI) and long sequence time series forecasting (LSTF) problems. To improve the efficiency and enhance the locality of the Transformer, these studies combine the Transformer with CNNs to varying degrees. However, their combinations are loosely coupled and do not make full use of CNNs. To address this issue, we propose the concept of the tightly-coupled convolutional Transformer (TCCT) and three TCCT architectures that apply transformed CNN architectures to the Transformer: (1) CSPAttention: by fusing CSPNet with the self-attention mechanism, the computation cost of self-attention is reduced by 30% and its memory usage by 50%, while achieving equivalent or better prediction accuracy. (2) Dilated causal convolution: this method modifies the distilling operation proposed by Informer by replacing canonical convolutional layers with dilated causal convolutional layers to gain exponential receptive-field growth. (3) Passthrough mechanism: applying the passthrough mechanism to the stack of self-attention blocks helps Transformer-like models obtain more fine-grained information at negligible extra computation cost. Our experiments on real-world datasets show that our TCCT architectures can greatly improve the performance of existing state-of-the-art Transformer models on time series forecasting, including the canonical Transformer, LogTrans and Informer, at much lower computation and memory costs.
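To make the second idea concrete, below is a minimal, illustrative PyTorch sketch of a dilated causal convolution used as a distilling step between self-attention blocks, in the spirit described by the abstract. It is not the authors' released code: the module name `DilatedCausalDistill`, the kernel size, the max-pooling step, and the (batch, seq_len, d_model) tensor layout are assumptions made for illustration only.

```python
# Illustrative sketch only (not the paper's implementation): a dilated causal
# 1-D convolution used as a distilling/downsampling step between self-attention
# blocks. Left-only padding keeps the convolution causal; doubling the dilation
# at each stack level makes the receptive field grow exponentially with depth.
import torch
import torch.nn as nn


class DilatedCausalDistill(nn.Module):
    """Halves the sequence length while enlarging the causal receptive field."""

    def __init__(self, d_model: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # left-only padding => causal
        self.conv = nn.Conv1d(d_model, d_model, kernel_size, dilation=dilation)
        self.norm = nn.BatchNorm1d(d_model)
        self.act = nn.ELU()
        self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)  # halves seq_len

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> (batch, d_model, seq_len) for Conv1d
        y = x.transpose(1, 2)
        y = nn.functional.pad(y, (self.pad, 0))          # causal (left) padding
        y = self.act(self.norm(self.conv(y)))
        y = self.pool(y)                                 # distilling: halve the length
        return y.transpose(1, 2)


if __name__ == "__main__":
    x = torch.randn(8, 96, 512)                          # (batch, seq_len, d_model)
    block1 = DilatedCausalDistill(512, dilation=1)       # dilation doubles per level
    block2 = DilatedCausalDistill(512, dilation=2)
    print(block2(block1(x)).shape)                       # torch.Size([8, 24, 512])
```

Compared with the canonical (undilated) convolution in Informer's distilling operation, stacking such layers with dilations 1, 2, 4, ... lets deeper levels cover far longer history at the same per-layer cost, which is the receptive-field argument the abstract makes.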