Keywords
Computer Science; Artificial Intelligence; Machine Learning; Transformer; Architecture; Short-term Memory; Recurrent Neural Network; Artificial Neural Network
Authors
Musleh Alharthi, Ausif Mahmood
Source
Journal: AI (MDPI AG)
Date: 2024-08-23
Volume/Issue: 5(3): 1482–1495
Citations: 1
Abstract
In recent years, transformer-based models have gained prominence in multivariate long-term time series forecasting (LTSF), demonstrating significant advancements despite challenges such as high computational demands and difficulty in capturing temporal dynamics and long-term dependencies. The emergence of LTSF-Linear, with its straightforward linear architecture, has notably outperformed transformer-based counterparts, prompting a reevaluation of the transformer’s utility in time series forecasting. In response, this paper presents an adaptation of a recent architecture, termed extended LSTM (xLSTM), for LTSF. xLSTM incorporates exponential gating and a revised memory structure with higher capacity that has good potential for LTSF. Our adapted architecture for LTSF, termed xLSTMTime, surpasses current approaches. We compare xLSTMTime’s performance against various state-of-the-art models across multiple real-world datasets, demonstrating superior forecasting capabilities. Our findings suggest that refined recurrent architectures can offer competitive alternatives to transformer-based models in LTSF tasks, potentially redefining the landscape of time series forecasting.
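The abstract's central mechanism, exponential gating with a revised memory structure, can be made concrete with a short sketch. Below is a minimal NumPy illustration of one sLSTM-style cell step from the xLSTM family, following the formulation in the xLSTM paper (Beck et al., 2024): exponential input and forget gates paired with a normalizer state and a log-space stabilizer. The function name `slstm_step`, the weight layout, and the gate ordering are illustrative assumptions, not the authors' xLSTMTime code.

```python
import numpy as np

def slstm_step(x, h_prev, c_prev, n_prev, m_prev, W, R, b):
    """One sLSTM-style step with exponential gating (illustrative sketch).

    x: input (D,); h_prev: hidden state (H,); c_prev: cell state (H,)
    n_prev: normalizer state (H,); m_prev: log-space stabilizer state (H,)
    W: input weights (4H, D); R: recurrent weights (4H, H); b: bias (4H,)
    Gate ordering in the stacked pre-activations is assumed to be z, i, f, o.
    """
    H = h_prev.shape[0]
    pre = W @ x + R @ h_prev + b                 # stacked pre-activations
    z_t = np.tanh(pre[0:H])                      # cell input
    i_tilde = pre[H:2*H]                         # input gate pre-activation
    f_tilde = pre[2*H:3*H]                       # forget gate pre-activation
    o_t = 1.0 / (1.0 + np.exp(-pre[3*H:4*H]))    # output gate (sigmoid)

    # Exponential gates are unbounded, so track a running log-scale m_t
    # and subtract it before exponentiating to avoid overflow.
    m_t = np.maximum(f_tilde + m_prev, i_tilde)  # stabilizer update
    i_t = np.exp(i_tilde - m_t)                  # stabilized input gate
    f_t = np.exp(f_tilde + m_prev - m_t)         # stabilized forget gate

    c_t = f_t * c_prev + i_t * z_t               # cell state update
    n_t = f_t * n_prev + i_t                     # normalizer state update
    h_t = o_t * (c_t / n_t)                      # normalized hidden state
    return h_t, c_t, n_t, m_t

# Usage sketch: zero-initialized states; n_t > 0 after the first step,
# so the division in h_t is safe.
D, H = 4, 8
rng = np.random.default_rng(0)
W, R, b = rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H)
h = c = n = m = np.zeros(H)
for x in rng.normal(size=(16, D)):               # a toy length-16 series
    h, c, n, m = slstm_step(x, h, c, n, m, W, R, b)
```

The log-space stabilizer m_t is what keeps the exponential gates numerically safe over long sequences; this is the property that gives the revised memory its higher capacity and makes the cell a plausible fit for the long lookback windows used in LTSF.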