Keywords
Forgetting
Computer science
Ensemble forecasting
Artificial intelligence
Artificial neural network
Machine learning
Time series
Recurrent neural network
Training set
Training (meteorology)
Incremental learning
Ensemble learning
Philosophy
Linguistics
Physics
Meteorology
Authors
Huiju Wang, Mengxuan Li, Yue Xiao
Identifier
DOI:10.1016/j.compeleceng.2021.107156
Abstract
Long short-term memory (LSTM) is one of the most widely used recurrent neural networks. Traditionally, it is trained in an offline batch mode: to incorporate new data, the network must be retrained on the merged old and new data, which is very time-consuming and causes catastrophic forgetting. To address this issue, we propose an incremental ensemble LSTM model, IncLSTM, which fuses ensemble learning and transfer learning to update the model incrementally. Experimental results show that, on average, the proposed method decreases training time by 18.8% and improves prediction accuracy by 15.6% compared with traditional methods. More importantly, the larger the training data size, the more efficient IncLSTM becomes. While the new model is being updated, the current model keeps predicting independently and concurrently, and the switch from the current model to the new one occurs once the update completes, which significantly improves the training efficiency of the model.
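The abstract describes two mechanisms: incrementally growing an ensemble of LSTMs, with each new member warm-started from an existing one (transfer learning) and trained only on the new data chunk, and serving predictions from the current model while the update runs in the background. Below is a minimal sketch of the first idea in PyTorch. It is not the authors' implementation; the names `LSTMForecaster`, `EnsembleLSTM`, and `add_member`, the simple output averaging, and all hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch of an incrementally updated LSTM ensemble,
# in the spirit of (but not identical to) the paper's IncLSTM.
import copy
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """One ensemble member: an LSTM followed by a linear head."""
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # predict from the last time step

class EnsembleLSTM:
    """Assumed ensemble wrapper that grows as new data chunks arrive."""
    def __init__(self):
        self.members = []

    def add_member(self, new_x, new_y, base=None, epochs=20, lr=1e-3):
        # Transfer learning: warm-start from an existing member's weights
        # instead of retraining on merged old + new data from scratch.
        model = copy.deepcopy(base) if base is not None \
            else LSTMForecaster(new_x.shape[-1])
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):          # train ONLY on the new chunk
            opt.zero_grad()
            loss = loss_fn(model(new_x), new_y)   # new_y: (batch, 1)
            loss.backward()
            opt.step()
        self.members.append(model)

    @torch.no_grad()
    def predict(self, x):
        # Simple average of member outputs; the paper may weight members.
        return torch.stack([m(x) for m in self.members]).mean(dim=0)
```

Because each member is trained only on its own chunk, earlier members keep what they learned from older data, which is one plausible way such an ensemble mitigates catastrophic forgetting. The predict-while-updating behaviour could be sketched with a background training thread and an atomic model swap; again, the paper's actual concurrency mechanism is not specified here, so this is an assumption:

```python
# Hypothetical hot-swap wrapper: the current model keeps serving
# predictions while a new model trains; swap once training finishes.
import threading

class HotSwapPredictor:
    def __init__(self, model):
        self._model = model
        self._lock = threading.Lock()

    def predict(self, x):
        with self._lock:                 # current model stays live
            model = self._model
        return model(x)

    def update_async(self, train_fn, new_x, new_y):
        def worker():
            new_model = train_fn(new_x, new_y)   # train off the hot path
            with self._lock:
                self._model = new_model          # switch on completion
        threading.Thread(target=worker, daemon=True).start()
```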