Mathematical optimization
Heuristic
Computer science
Local optimum
Robustness (evolution)
Algorithm
Mathematics
Biochemistry
Gene
Chemistry
Authors
Youwei Qin,Dmitri Kavetski,George Kuczera
Abstract
Model calibration using optimization algorithms is a perennial challenge in hydrological modeling. This study explores opportunities to improve the efficiency of a Newton‐type method by making it more robust against problematic features in models' objective functions, including local optima and other noise. We introduce the robust Gauss‐Newton (RGN) algorithm for least squares optimization, which employs three heuristic schemes to enhance its exploratory abilities while keeping costs low. The large sampling scale (LSS) scheme is a central difference approximation with perturbation (sampling scale) made as large as possible to capture the overall objective function shape; the best‐sampling point (BSP) scheme exploits known function values to detect better parameter locations; and the null‐space jump (NSJ) scheme attempts to escape near‐flat regions. The RGN heuristics are evaluated using a case study comprising four hydrological models and three catchments. The heuristics make synergistic contributions to overall efficiency: the LSS scheme substantially improves reliability albeit at the expense of increased costs, and scenarios where LSS on its own is ineffective are bolstered by the BSP and NSJ schemes. In 11 of 12 modeling scenarios, RGN is 1.4–18 times more efficient in finding the global optimum than the standard Gauss‐Newton algorithm; similar gains are made in finding tolerable optima. Importantly, RGN offers its largest gains when working with difficult objective functions. The empirical analysis provides insights into tradeoffs between robustness versus cost, exploration versus exploitation, and how to manage these tradeoffs to maximize optimization efficiency. In the companion paper, the RGN algorithm is benchmarked against industry standard optimization algorithms.
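The abstract's core numerical idea is a Gauss‐Newton least squares step in which the Jacobian is formed by central differences with a deliberately large perturbation (the LSS sampling scale), so that the derivative information reflects the broad shape of the objective function rather than local noise. The sketch below is a minimal Python illustration of that idea only; it is not the authors' RGN implementation, and the function names, the damping term, and the scale‐shrinking schedule are assumptions made for this example (the BSP and NSJ schemes are not shown).

```python
import numpy as np

def central_difference_jacobian(residual_fn, theta, h):
    """Central-difference Jacobian of the residual vector.

    h is the sampling scale: following the LSS idea, it is kept large
    so the finite differences average over local optima and noise.
    """
    r0 = residual_fn(theta)
    J = np.zeros((r0.size, theta.size))
    for j in range(theta.size):
        e = np.zeros_like(theta)
        e[j] = h[j]
        J[:, j] = (residual_fn(theta + e) - residual_fn(theta - e)) / (2.0 * h[j])
    return J

def gauss_newton_step(residual_fn, theta, h, damping=1e-8):
    """One damped Gauss-Newton update for F(theta) = 0.5 * ||r(theta)||^2."""
    r = residual_fn(theta)
    J = central_difference_jacobian(residual_fn, theta, h)
    H = J.T @ J + damping * np.eye(theta.size)   # Gauss-Newton Hessian approximation
    g = J.T @ r                                  # gradient of F
    return theta - np.linalg.solve(H, g)

# Toy usage (not a hydrological model): fit y = a * exp(b * x) to noisy data.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x) + 0.05 * rng.standard_normal(x.size)
residuals = lambda th: th[0] * np.exp(th[1] * x) - y

theta = np.array([1.0, 0.5])
h = np.array([0.5, 0.5])           # deliberately large sampling scale (assumed value)
for _ in range(20):
    theta = gauss_newton_step(residuals, theta, h)
    h = np.maximum(h * 0.7, 1e-3)  # assumed schedule: shrink the scale as iterations proceed
print(theta)
```

Shrinking h over the iterations in this sketch mimics the exploration versus exploitation tradeoff noted in the abstract: a large scale gives a smoothed view of the objective early on, while a small scale gives accurate local derivatives near a solution.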