Authors
Haoyi Zhou, Jianxin Li, Li Du, Shuai Zhang, Mengyi Yan, Hui Xiong
Identifier
DOI: 10.1016/j.artint.2023.103886
Abstract
Many real-world applications show a growing demand for the prediction of long sequence time-series, such as electricity consumption planning. Long sequence time-series forecasting (LSTF) requires a higher prediction capacity of the model, which is the ability to capture precise long-range dependency coupling between output and input efficiently. Recent studies have shown the potential of the Transformer to accommodate this capacity requirement. However, three real challenges may have prevented the Transformer from expanding the prediction capacity in LSTF: quadratic time complexity, high memory usage, and slow inference speed under the encoder-decoder architecture. To address these issues, we design an efficient Transformer-based model for LSTF, named Informer, with three distinctive characteristics: (i) a ProbSparse self-attention mechanism, which achieves O(L log L) time complexity and memory usage and has comparable performance on sequences' dependency alignment; (ii) self-attention distilling, which promotes dominating attention via convolutional operators, while the halving of layer width reduces the expense of building a deeper network on extremely long input sequences; and (iii) a generative-style decoder that, while conceptually simple, predicts long time-series sequences in one forward operation rather than step by step, which drastically improves the inference speed of long-sequence predictions. Extensive experiments on ten large-scale datasets demonstrate that Informer significantly outperforms existing methods and provides a new solution to the LSTF problem.
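The first contribution named in the abstract, the ProbSparse self-attention mechanism, can be illustrated with a short Python sketch. The snippet below is a minimal illustration of the idea only, not the authors' implementation: it scores each query by how far its attention distribution deviates from uniform, computes real attention for the top-u "active" queries, and gives the remaining queries a trivial mean-of-values output. All names (probsparse_self_attention, u, c) are hypothetical, and the sketch builds the full L x L score matrix for clarity, so it does not attain the O(L log L) cost described in the paper, which relies on sampling keys to estimate the sparsity scores cheaply.

# Minimal sketch of the ProbSparse self-attention idea (assumed names; not the
# reference implementation). For clarity it computes exact scores instead of
# the paper's sampled estimate, so it is illustrative rather than efficient.
import math
import torch

def probsparse_self_attention(x, u):
    """x: (L, d) input sequence; u: number of queries kept (roughly c * ln L)."""
    L, d = x.shape
    q, k, v = x, x, x                        # self-attention: Q = K = V = x

    scores = q @ k.T / math.sqrt(d)          # (L, L) scaled dot-product scores

    # Sparsity measurement M(q_i, K) = max_j s_ij - mean_j s_ij:
    # large when a query's attention is peaked, near zero when it is uniform.
    sparsity = scores.max(dim=-1).values - scores.mean(dim=-1)

    top = sparsity.topk(u).indices           # indices of the u most active queries

    out = v.mean(dim=0).expand(L, d).clone() # lazy queries get the mean of V
    attn = torch.softmax(scores[top], dim=-1)    # (u, L) attention for active queries
    out[top] = attn @ v                          # active queries get full attention
    return out

if __name__ == "__main__":
    L, d = 96, 32
    x = torch.randn(L, d)
    u = int(5 * math.log(L))                 # c = 5 is an assumed sampling factor
    y = probsparse_self_attention(x, u)
    print(y.shape)                           # torch.Size([96, 32])

The design choice the sketch tries to convey is that only a small, log-sized subset of queries contributes sharply peaked attention; handling the rest with a cheap fallback is what lets the paper reduce the quadratic cost of full self-attention.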