Keywords
Weighting
Surface runoff
Inverse distance weighting
Computer science
Algorithm
Convolutional neural network
Time series
Artificial intelligence
Data mining
Machine learning
Computer vision
Multivariate interpolation
Ecology
Bilinear interpolation
Radiology
Biology
Medicine
Authors
Zhiyuan Yao, Zhaocai Wang, Dangwei Wang, Tunhua Wu, Lingxuan Chen
Identifier
DOI: 10.1016/j.jhydrol.2023.129977
Abstract
Accurate prediction of river runoff is of great significance for water resources management and for flood prevention and mitigation. The causes of runoff are complex and the mechanisms behind them are difficult to grasp, so building a data-driven deep learning model for runoff prediction is an effective solution. To fuse multi-source information while achieving high prediction accuracy and wide applicability, this study proposes a hybrid model based on CNN-LSTM & GRU-ISSA. Meteorological, hydrological, and runoff data are selected, and the maximal information coefficient (MIC) is used to quantify the relationship between each variable and runoff in order to reduce the dimensionality of the data. A convolutional neural network (CNN) extracts features from the long runoff time series, a long short-term memory network (LSTM) predicts from the long series, and a gated recurrent unit (GRU) predicts from the short series. To combine the strengths of the two predictors, an adaptive weighting module (AWM) is proposed that dynamically learns to fuse their outputs into the final prediction. To select the model hyperparameters, we use an improved sparrow search algorithm (ISSA), which augments the sparrow search algorithm (SSA) with two improvements, Lévy flight and the sine cosine algorithm, so that the search converges quickly to better global optima. The proposed model was validated on watersheds with different runoff ranges; in the Bailong River watershed, for example, the NSE was 0.90 and the RMSE was 2.17. The results show that the proposed model significantly outperforms the baseline models.
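The architecture described in the abstract can be illustrated compactly. Below is a minimal sketch, assuming PyTorch and purely illustrative layer sizes and window lengths (none of these values come from the paper), of how a CNN-LSTM long-series branch, a GRU short-series branch, and an adaptive weighting module (AWM) could be wired together; it is not the authors' implementation.

```python
# Minimal sketch of the hybrid CNN-LSTM & GRU architecture with an
# adaptive weighting module (AWM). All layer sizes, window lengths,
# and the gate design are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """CNN extracts local features from a long window; LSTM models the sequence."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, long_window, n_features)
        z = self.conv(x.transpose(1, 2))  # -> (batch, 32, long_window // 2)
        out, _ = self.lstm(z.transpose(1, 2))
        return self.head(out[:, -1])      # one-step-ahead prediction: (batch, 1)

class GRUNet(nn.Module):
    """GRU over the short recent window."""
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, short_window, n_features)
        out, _ = self.gru(x)
        return self.head(out[:, -1])

class AWM(nn.Module):
    """Adaptive weighting: learn sample-wise fusion weights for the two branches."""
    def __init__(self):
        super().__init__()
        self.gate = nn.Linear(2, 2)       # weights conditioned on both predictions

    def forward(self, y_long, y_short):
        w = torch.softmax(self.gate(torch.cat([y_long, y_short], dim=1)), dim=1)
        return w[:, :1] * y_long + w[:, 1:] * y_short

class HybridRunoffModel(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        self.long_branch = CNNLSTM(n_features)
        self.short_branch = GRUNet(n_features)
        self.awm = AWM()

    def forward(self, x_long, x_short):
        return self.awm(self.long_branch(x_long), self.short_branch(x_short))

# Usage with dummy data: 8 samples, 5 input variables (after MIC-based
# feature selection), a 30-step long window and a 7-step short window.
model = HybridRunoffModel(n_features=5)
y = model(torch.randn(8, 30, 5), torch.randn(8, 7, 5))
print(y.shape)  # torch.Size([8, 1])
```

Here the gate conditions the fusion weights on both branch outputs, so the weighting varies per sample rather than being a fixed average; the softmax keeps the two weights positive and summing to one.

Likewise, the Lévy-flight perturbation that ISSA adds to SSA can be sketched with Mantegna's algorithm. The exponent beta = 1.5 and the step scale are common defaults assumed here, not values taken from the paper.

```python
# Hedged sketch of a Lévy-flight step (Mantegna's algorithm), as used to
# perturb candidate hyperparameter vectors in ISSA-style searches.
import math
import numpy as np

def levy_step(dim, beta=1.5):
    """Draw one Lévy-distributed step of dimension `dim`."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0, sigma_u, dim)
    v = np.random.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

# Perturb a candidate around the current best (hypothetical hyperparameters:
# hidden size, learning rate, window length); occasional long jumps help
# escape local optima while most steps stay small.
best = np.array([64.0, 0.001, 30.0])
candidate = best + 0.01 * levy_step(best.size) * best
```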