Authors
Zhaoran Liu, Yizhi Cao, Hu Xu, Yuxin Huang, Qunshan He, Xinjie Chen, Xiaoyu Tang, Xinggao Liu
Identifier
DOI: 10.1016/j.eswa.2023.122412
Abstract
Long-term time series forecasting has attracted considerable attention because of its great practical value. It is also an extremely challenging task, since it requires using limited observations to accurately predict values far into the future. Recent works have demonstrated that the Transformer has strong potential for this task. However, the permutation-invariant property of the Transformer, together with other prominent shortcomings of current Transformer-based models, such as missing multi-scale local features and information from the frequency domain, significantly limits their performance. To improve the accuracy of long-term time series forecasting, we propose a Transformer-based model called Hidformer. The model can both learn temporal dynamics from the time domain and discover particular patterns in the frequency domain. We also design a segment-and-merge architecture that provides semantic meaning for the inputs and helps the model capture multi-scale local features. In addition, we replace the Transformer's multi-head attention with highly efficient recurrence and linear attention, which gives our model an advantage over other Transformer-based models in computational efficiency. Extensive experiments on seven real-world benchmarks verify the effectiveness of Hidformer: it achieves 72 top-1 and 69 top-2 scores out of 88 configurations, dramatically improving prediction accuracy and outperforming the previous state of the art, which demonstrates the superiority of the proposed method.
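For illustration, the three ingredients named in the abstract (segmenting the series into tokens, a frequency-domain view of the input, and linear attention in place of softmax attention) can be sketched in PyTorch as below. This is a minimal sketch under assumed shapes, not the authors' implementation: segment_series, the segment length, the embedding size, and the elu-based feature map (the standard linear-attention construction of Katharopoulos et al., 2020) are all illustrative assumptions; the abstract does not specify Hidformer's exact equations.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def segment_series(x, seg_len):
        # Split a series (batch, length, channels) into non-overlapping
        # segments, each flattened into one token: (batch, n_seg, seg_len*channels).
        b, l, c = x.shape
        n = l // seg_len
        return x[:, : n * seg_len, :].reshape(b, n, seg_len * c)

    class LinearAttention(nn.Module):
        # Linear attention with a positive feature map phi(x) = elu(x) + 1.
        # The (key, value) outer products are summed once into a d-by-d state,
        # so cost is O(n * d^2) in sequence length n rather than O(n^2 * d).
        def __init__(self, dim):
            super().__init__()
            self.q = nn.Linear(dim, dim)
            self.k = nn.Linear(dim, dim)
            self.v = nn.Linear(dim, dim)

        def forward(self, x):
            q = F.elu(self.q(x)) + 1.0          # (b, n, d), positive features
            k = F.elu(self.k(x)) + 1.0          # (b, n, d)
            v = self.v(x)                       # (b, n, d)
            kv = torch.einsum("bnd,bne->bde", k, v)              # summed state
            z = 1.0 / (torch.einsum("bnd,bd->bn", q, k.sum(1)) + 1e-6)
            return torch.einsum("bnd,bde,bn->bne", q, kv, z)     # normalized

    # Usage with assumed dimensions: 7-variate series of length 96,
    # segment length 12, token embedding size 64.
    x = torch.randn(8, 96, 7)
    segs = segment_series(x, seg_len=12)        # (8, 8, 84)
    tokens = nn.Linear(84, 64)(segs)            # embed segments as tokens
    out = LinearAttention(64)(tokens)           # (8, 8, 64)
    freq = torch.fft.rfft(x, dim=1).abs()       # a simple frequency-domain view

Because the keys and values are aggregated into a fixed-size state before being queried, attention cost grows linearly with sequence length; this is the kind of efficiency advantage over standard multi-head attention that the abstract refers to.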