Keywords
Interpretability
Predictability
Time series
Series (stratigraphy)
Computer science
Multilayer perceptron
Transformer
Data mining
Artificial intelligence
Artificial neural network
Machine learning
Mathematics
Statistics
Paleontology
Voltage
Physics
Quantum mechanics
Biology
Authors
Jiashan Wan,Na Xia,Yutao Yin,Xulei Pan,Jin Hu,Yi Jun
Source
Journal: Neural Networks
[Elsevier]
Date: 2024-02-23
Volume/Issue: 173: 106196
Citations: 5
Identifier
DOI:10.1016/j.neunet.2024.106196
Abstract
Although time series prediction models based on the Transformer architecture have achieved significant advances, concerns remain about their performance on non-stationary real-world data. Traditional methods often apply stabilization techniques to boost predictability, but doing so discards the non-stationary information itself, so these methods underperform notably when major events occur in practical applications. To address this challenge, this research introduces TCDformer (Trend and Change-point Detection Transformer). TCDformer first encodes abrupt changes in a non-stationary time series using a local linear scaling approximation (LLSA) module. The reconstructed contextual time series is then decomposed into trend and seasonal components, and the final prediction is the additive combination of a multilayer perceptron (MLP) applied to the trend component and a wavelet attention mechanism applied to the seasonal component. Comprehensive experiments on standard time series prediction datasets show that TCDformer significantly surpasses existing benchmark models, reducing MSE by 47.36% and MAE by 31.12%. The approach offers an effective framework for handling non-stationary time series that balances performance with interpretability, making it especially suitable for non-stationarity challenges in real-world scenarios.
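As a rough illustration of the decompose-and-recombine pipeline the abstract describes, the sketch below splits a series into a moving-average trend and a seasonal residual, forecasts each part separately, and adds the results. This is a minimal sketch under stated assumptions: the moving-average kernel size, the linear stand-ins for the paper's MLP trend head and wavelet-attention seasonal head, and all sequence lengths are illustrative choices, not the authors' actual architecture.

```python
import numpy as np

def decompose(x, kernel=25):
    """Split a 1-D series into a trend (moving average) and a seasonal
    (residual) component, so that trend + seasonal == x exactly."""
    pad = kernel // 2
    padded = np.pad(x, pad, mode="edge")          # edge-pad so output keeps length
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = x - trend
    return trend, seasonal

rng = np.random.default_rng(0)
L, H = 96, 24                                     # input length, forecast horizon (assumed)
x = np.sin(np.arange(L) * 0.3) + 0.1 * rng.standard_normal(L)

trend, seasonal = decompose(x)

# Hypothetical stand-ins for the two prediction heads: plain linear maps here,
# where the paper uses an MLP (trend) and wavelet attention (seasonal).
W_trend = rng.standard_normal((H, L)) * 0.01
W_season = rng.standard_normal((H, L)) * 0.01

# Additive combination of the two component forecasts.
forecast = W_trend @ trend + W_season @ seasonal
print(forecast.shape)  # (24,)
```

The key property worth noting is that the decomposition is lossless (the two components sum back to the input), so modeling each component with a specialized head loses no information relative to modeling the raw series.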