Overfitting
Transfer learning
Computer science
Deep learning
Machine learning
Artificial intelligence
Term (time)
Predictive modeling
Water quality
Multi-task learning
Data mining
Task (project management)
Artificial neural network
Engineering
Quantum mechanics
Biology
Ecology
Physics
Systems engineering
Authors
Peng Lin,Huan Wu,Min Gao,Hualing Yi,Qingyu Xiong,Yanyan Yang,Shuiping Cheng
Source
Journal: Water Research
[Elsevier]
Date: 2022-09-29
Volume/Issue: 225: 119171-119171
Citations: 29
Identifier
DOI:10.1016/j.watres.2022.119171
Abstract
Long-term water quality prediction is essential to water environment management decisions. In recent years, although water quality prediction methods based on deep learning have achieved excellent performance in short-term prediction, these methods are unsuitable for long-term prediction because accumulating short-term predictions easily introduces noise. Furthermore, the long-term prediction task requires a large amount of data to train the model to obtain accurate prediction results. For monitoring stations with limited historical data, it is challenging to fully exploit the performance of deep learning models. To this end, we introduce a transfer learning framework into water quality prediction to improve prediction performance in data-constrained scenarios. We propose a deep Transfer Learning based on Transformer (TLT) model to enable time-dependency perception and facilitate long-term water quality prediction. In TLT, we introduce a recurrent fine-tuning transfer learning method, which transfers the knowledge learned from source monitoring stations to the target station while preventing the deep learning model from overfitting the source data during the pre-training phase. As a result, TLT can fully exploit the capacity of deep learning models with limited samples. We conduct experiments on data from 120 monitoring stations in major rivers and lakes in China to verify the effectiveness of TLT. The results show that TLT effectively improves the long-term prediction accuracy of four water quality indicators (pH, DO, NH3-N, and CODMn) at monitoring stations with limited samples.
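The abstract outlines a pre-train/fine-tune transfer workflow: a Transformer-based forecaster is first trained on data-rich source stations, then adapted to a data-limited target station. The sketch below illustrates that general workflow only; it is not the authors' TLT implementation, and the model size, window/horizon lengths, frozen-encoder strategy, synthetic data, and hyperparameters are all illustrative assumptions (the paper's recurrent fine-tuning procedure is not reproduced here).

```python
# Minimal sketch of pre-training on source-station data and fine-tuning on a
# target station, assuming a PyTorch Transformer encoder. All names and
# numbers below are hypothetical, not taken from the paper.
import torch
import torch.nn as nn

class WaterQualityTransformer(nn.Module):
    """Encode a window of past indicator readings and predict a future horizon."""
    def __init__(self, n_indicators=4, d_model=64, horizon=24):
        super().__init__()
        self.embed = nn.Linear(n_indicators, d_model)
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                                   batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(d_model, n_indicators * horizon)
        self.horizon = horizon
        self.n_indicators = n_indicators

    def forward(self, x):                   # x: (batch, window, n_indicators)
        h = self.encoder(self.embed(x))     # (batch, window, d_model)
        out = self.head(h[:, -1])           # summarize with the last position
        return out.view(-1, self.horizon, self.n_indicators)

def fit(model, x, y, epochs, lr):
    """Train only the parameters that currently require gradients."""
    params = [p for p in model.parameters() if p.requires_grad]
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

model = WaterQualityTransformer()

# Pre-training phase: abundant source-station samples (synthetic stand-ins).
src_x, src_y = torch.randn(256, 48, 4), torch.randn(256, 24, 4)
fit(model, src_x, src_y, epochs=20, lr=1e-3)

# Fine-tuning phase: few target-station samples. Freezing the encoder and
# adapting only the head is one common strategy; it stands in for (and
# simplifies) the paper's recurrent fine-tuning method.
for p in model.encoder.parameters():
    p.requires_grad = False
tgt_x, tgt_y = torch.randn(32, 48, 4), torch.randn(32, 24, 4)
fit(model, tgt_x, tgt_y, epochs=10, lr=1e-4)
```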