Topics: Term (time), Series (stratigraphy), Computer science, Power (physics), Time series, Artificial intelligence, Machine learning, Econometrics, Industrial engineering, Mathematics, Engineering, Geology, Physics, Paleontology, Quantum mechanics
Authors
Yaxuan Kong, Zepu Wang, Yuqi Nie, Tian Zhou, Stefan Zohren, Yuxuan Liang, Peng Sun, Qingsong Wen
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2025-04-11
Volume/Issue: 39 (11): 11968-11976
Identifier
DOI:10.1609/aaai.v39i11.33303
Abstract
Traditional recurrent neural network architectures, such as long short-term memory (LSTM) networks, have historically held a prominent role in time series forecasting (TSF) tasks. While the recently introduced sLSTM for natural language processing (NLP) introduces exponential gating and memory mixing that benefit long-term sequential learning, its potential short-memory issue is a barrier to applying sLSTM directly to TSF. To address this, we propose a simple yet efficient algorithm named P-sLSTM, built upon sLSTM by incorporating patching and channel independence. These modifications substantially enhance sLSTM's performance in TSF, achieving state-of-the-art results. Furthermore, we provide theoretical justifications for our design and conduct extensive comparative and analytical experiments to fully validate the efficiency and superior performance of our model.
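The two modifications named in the abstract, patching and channel independence, can be sketched roughly as follows. This is a minimal illustration of the general preprocessing idea, not the authors' P-sLSTM implementation: the function name, parameters, and array shapes here are hypothetical. Each channel of a multivariate series is treated as its own univariate series (channel independence), and its values are grouped into overlapping fixed-length windows (patching) that a sequence model would then consume as tokens.

```python
import numpy as np

def patch_channels_independently(x, patch_len, stride):
    """Split a multivariate series into per-channel patches.

    x: array of shape (seq_len, n_channels).
    Returns an array of shape (n_channels, n_patches, patch_len):
    each channel is processed independently, and its values are
    sliced into overlapping windows of length `patch_len` taken
    every `stride` steps.
    """
    seq_len, n_channels = x.shape
    n_patches = (seq_len - patch_len) // stride + 1
    out = np.empty((n_channels, n_patches, patch_len), dtype=x.dtype)
    for c in range(n_channels):          # channel independence
        for p in range(n_patches):       # patching
            start = p * stride
            out[c, p] = x[start:start + patch_len, c]
    return out

# Toy input: 12 time steps, 2 channels.
series = np.arange(24, dtype=float).reshape(12, 2)
patches = patch_channels_independently(series, patch_len=4, stride=2)
print(patches.shape)  # (2, 5, 4)
```

Patching shortens the effective sequence a recurrent cell must traverse (here 5 patches instead of 12 raw steps), which is one plausible way such a design could ease the short-memory issue the abstract mentions.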