Keywords
bottleneck, computer science, embedding, computation, time series, algorithm, Transformer, data mining, artificial intelligence, machine learning
Authors
Savong Bou, Toshiyuki Amagasa, Hiroyuki Kitagawa
Identifier
DOI:10.1007/978-3-031-12426-6_4
Abstract
Predicting time-series data is useful in many applications, such as natural disaster prevention systems, weather forecasting, and traffic control systems. Time-series forecasting has been extensively studied. Many existing forecasting models perform well when predicting short-sequence time series, but their performance degrades greatly on long ones. Recently, more dedicated research has been devoted to this direction, and Informer is currently the most efficient prediction model. The main drawback of Informer is its inability to learn incrementally. This paper proposes an incremental Transformer, called InTrans, to address this bottleneck by reducing the training/predicting time of Informer. The time complexities of InTrans compared to Informer are: (1) $O(S)$ vs. $O(L)$ for positional and temporal embedding, (2) $O((S+k-1) \cdot k)$ vs. $O(L \cdot k)$ for value embedding, and (3) $O((S+k-1) \cdot d_{dim})$ vs. $O(L \cdot d_{dim})$ for the computation of Query/Key/Value, where $L$ is the length of the input, $k$ is the kernel size, $d_{dim}$ is the number of dimensions, and $S$ is the length of the non-overlapping part of the input, which is usually significantly smaller than $L$. InTrans can therefore greatly improve both training and predicting speed over the state-of-the-art model, Informer. Extensive experiments have shown that InTrans is about 26% faster than Informer for both short-sequence and long-sequence time-series prediction.
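The complexity gains follow from a sliding-window reuse argument: when the input window advances by S steps, only the outputs whose receptive field touches the S new points must be recomputed, and for a kernel-size-k convolutional value embedding those outputs depend on just the last S + k - 1 inputs. Below is a minimal NumPy sketch of that reuse for a 1-D valid convolution; it illustrates the counting behind complexity (2) above, not the authors' actual implementation, and the helper names (conv1d_valid, incremental_value_embedding) are hypothetical.

```python
import numpy as np

def conv1d_valid(x, w):
    """Valid 1-D convolution, standing in for a convolutional value embedding.
    x: (L,) series, w: (k,) kernel -> (L - k + 1,) outputs."""
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

def incremental_value_embedding(prev_emb, new_window, w, S):
    """Slide the input window forward by S steps and update the embedding.

    Outputs whose receptive field lies entirely in the overlap are reused
    from prev_emb; only the outputs touching the S new points, which depend
    on the last S + k - 1 inputs, are recomputed.
    Cost: O((S + k - 1) * k) instead of O(L * k) for a full pass.
    """
    k = len(w)
    reused = prev_emb[S:]                              # unchanged outputs, shifted left
    tail = conv1d_valid(new_window[-(S + k - 1):], w)  # the S recomputed outputs
    return np.concatenate([reused, tail])

# Check the incremental update against a full recomputation of the shifted window.
rng = np.random.default_rng(0)
L, S, k = 96, 8, 3
series = rng.standard_normal(L + S)
kernel = rng.standard_normal(k)

emb_prev = conv1d_valid(series[:L], kernel)  # full pass on the first window
emb_inc = incremental_value_embedding(emb_prev, series[S:], kernel, S)
assert np.allclose(emb_inc, conv1d_valid(series[S:], kernel))
```

The final assertion confirms that the incremental update is exact: it reproduces a full re-embedding of the shifted window while touching only S + k - 1 input points.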