Computer science
Artificial intelligence
Robustness (evolution)
Time series
Machine learning
Domain adaptation
Deep learning
Data mining
Transfer learning
Domain (mathematical analysis)
Convolutional neural network
Data modeling
Classifier (UML)
Database
Mathematical analysis
Biochemistry
Chemistry
Mathematics
Gene
Authors
Yuan Li,Jingwei Li,Chengbao Liu,Jie Tan
Identifier
DOI:10.1007/978-3-031-36822-6_19
Abstract
Time series forecasting is an essential problem that arises in many fields. Recently, with the development of big data technology, deep learning methods have been widely studied and have achieved promising performance in time series forecasting tasks. However, in practice there is often only a limited number of time series, or few observations per time series. For this case, a time series forecasting model based on domain adaptation and shared attention (DA-SA) is proposed in this study. First, we employ the Transformer architecture as the basic framework of our model. Then, we design a selectively shared attention module to transfer valuable information from the data-rich domain to the data-poor domain by inducing domain-invariant latent features (queries and keys) while retaining domain-specific features (values). In addition, a convolutional neural network is introduced to incorporate local context into the self-attention mechanism and capture the short-term dependencies of the data. Finally, adversarial training is utilized to enhance the robustness of the model and improve prediction accuracy. The practicality and effectiveness of DA-SA for time series forecasting are verified on real-world datasets.
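The abstract describes attention projections that are split by role: queries and keys are shared across domains (domain-invariant), values are kept per domain (domain-specific), and convolutions inject local context. The following is a minimal sketch of that idea, not the authors' implementation; the class name, the use of causal 1-D convolutions for the query/key projections, and all hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch of a "selectively shared attention" layer as described in
# the abstract: shared (domain-invariant) query/key projections, per-domain
# (domain-specific) value projections, and causal convolutions so that local
# context enters the attention scores. Not the paper's code.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelectivelySharedAttention(nn.Module):
    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        # Shared convolutional projections for queries and keys (used by both domains).
        self.q_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size - 1)
        self.k_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size - 1)
        # Domain-specific value projections: one linear map per domain.
        self.v_proj = nn.ModuleDict({
            "source": nn.Linear(d_model, d_model),
            "target": nn.Linear(d_model, d_model),
        })
        self.out_proj = nn.Linear(d_model, d_model)
        self.d_model = d_model

    def _causal_conv(self, conv: nn.Conv1d, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model). Convolve over time and trim the right
        # padding so position t only sees inputs up to t (causal local context).
        y = conv(x.transpose(1, 2))[..., : x.size(1)]
        return y.transpose(1, 2)

    def forward(self, x: torch.Tensor, domain: str) -> torch.Tensor:
        q = self._causal_conv(self.q_conv, x)   # shared across domains
        k = self._causal_conv(self.k_conv, x)   # shared across domains
        v = self.v_proj[domain](x)              # domain-specific
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_model)
        attn = F.softmax(scores, dim=-1)
        return self.out_proj(attn @ v)


if __name__ == "__main__":
    layer = SelectivelySharedAttention(d_model=64)
    src = torch.randn(8, 48, 64)   # data-rich source-domain batch
    tgt = torch.randn(8, 48, 64)   # data-poor target-domain batch
    print(layer(src, domain="source").shape)  # torch.Size([8, 48, 64])
    print(layer(tgt, domain="target").shape)  # torch.Size([8, 48, 64])
```

In such a setup, the adversarial training mentioned in the abstract would typically act on the shared query/key path (e.g., a domain discriminator trained through a gradient-reversal layer) to push those features toward domain invariance, while the per-domain value projections absorb domain-specific variation.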