Computer science
Transformer
Time series
Series (stratigraphy)
Electrical engineering
Voltage
Engineering
Machine learning
Geology
Paleontology
Authors
Xihao Piao,Zheng Chen,Taichi Murayama,Yasuko Matsubara,Yasushi Sakurai
Identifier
DOI:10.1145/3637528.3671928
Abstract
The Transformer model has shown leading performance in time series forecasting. Nevertheless, in some complex scenarios, it tends to learn low-frequency features in the data and overlook high-frequency features, exhibiting a frequency bias. This bias prevents the model from accurately capturing important high-frequency data features. In this paper, we undertake empirical analyses to understand this bias and discover that it results from the model disproportionately focusing on frequency features with higher energy. Based on our analysis, we formulate this bias and propose Fredformer, a Transformer-based framework designed to mitigate frequency bias by learning features equally across different frequency bands. This approach prevents the model from overlooking lower-amplitude features that are important for accurate forecasting. Extensive experiments show the effectiveness of our proposed approach, which outperforms other baselines on a range of real-world time series datasets. Furthermore, we introduce a lightweight variant of Fredformer with an attention matrix approximation, which achieves comparable performance with far fewer parameters and lower computation costs. The code is available at: https://github.com/chenzRG/Fredformer
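The core idea the abstract describes, treating frequency bands equally so that low-energy (often high-frequency) components are not drowned out by high-energy ones, can be illustrated with a minimal sketch. The function below is a hypothetical per-band spectrum normalization, not the authors' actual Fredformer implementation: it splits the real FFT spectrum of a series into equal-width bands and rescales each band to unit energy.

```python
import numpy as np

def band_normalize(x, num_bands=4, eps=1e-8):
    """Hypothetical per-band frequency normalization (illustrative only).

    Splits the rFFT spectrum into `num_bands` equal-width bands and
    scales each band to unit RMS energy, so weak high-frequency
    components are no longer dwarfed by strong low-frequency ones.
    """
    spec = np.fft.rfft(x)
    out = spec.copy()
    # Indices of each frequency band (roughly equal-sized chunks).
    for idx in np.array_split(np.arange(spec.shape[-1]), num_bands):
        energy = np.sqrt(np.mean(np.abs(spec[idx]) ** 2)) + eps
        out[idx] = spec[idx] / energy
    return out

# Example: a strong low-frequency sine plus a weak high-frequency one.
t = np.linspace(0, 1, 256, endpoint=False)
x = 10.0 * np.sin(2 * np.pi * 3 * t) + 0.1 * np.sin(2 * np.pi * 60 * t)
norm = band_normalize(x, num_bands=4)
```

After normalization, the band containing the weak 60 Hz component carries the same total energy as the band containing the dominant 3 Hz component, which is the equal-treatment property the paper's framework aims for.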