Keywords
Pareto principle
Mathematical optimization
Genetic algorithm
Multi-objective optimization
Computer science
Sampling (signal processing)
Algorithm
Mathematics
Filter (signal processing)
Computer vision
Authors
Qi Luo, Jianfeng Wu, Yun Yang, Jiazhong Qian, Jichun Wu
Identifier
DOI: 10.1016/j.jhydrol.2016.01.009
Abstract
Optimal design of long-term groundwater monitoring (LTGM) networks often involves conflicting objectives and substantial uncertainty arising from insufficient hydraulic conductivity (K) data. This study develops a new multi-objective simulation–optimization model for LTGM network design with four objectives: minimization of (i) the total sampling cost for monitoring the contaminant plume, (ii) the mass estimation error, (iii) the first-moment estimation error, and (iv) the second-moment estimation error of the contaminant plume. A new probabilistic Pareto genetic algorithm (PPGA), coupled with the widely used flow and transport codes MODFLOW and MT3DMS, is then developed to search for Pareto-optimal solutions to the multi-objective LTGM problem under uncertainty in the K-fields. The PPGA integrates the niched Pareto genetic algorithm with a probabilistic Pareto sorting scheme to handle the uncertainty in the objectives caused by the uncertain K-field. An elitist selection strategy, an operation library, and a Pareto-solution-set filter are also employed to improve the diversity and reliability of the Pareto-optimal solutions. Furthermore, the sampling strategy of the noisy genetic algorithm is adopted to cope with the K-field uncertainty and to improve the computational efficiency of the PPGA. Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology in finding Pareto-optimal sampling network designs for LTGM systems through a two-dimensional hypothetical example and a three-dimensional field application in Indiana, USA. Comprehensive analysis demonstrates that the proposed PPGA can find Pareto-optimal solutions with low variability and high reliability, and that it is a promising tool for optimizing multi-objective LTGM network designs under uncertainty.
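The core mechanism the abstract describes is Pareto sorting of candidate sampling designs whose objective values are noisy, because each design is evaluated on a sample of K-field realizations (the noisy-genetic-algorithm sampling strategy). The following is a minimal Python sketch of that idea only, not the authors' code: the objective values here are synthetic stand-ins for the paper's four criteria (in the paper they come from MODFLOW/MT3DMS simulations on each K-field realization), and all function names and the placeholder "physics" are hypothetical.

```python
# Sketch: Pareto filtering of monitoring designs under noisy objectives.
# A design is a binary vector (1 = sample that monitoring well). Each
# design is evaluated on several K-field realizations and the objectives
# are averaged before dominance sorting, mimicking the noisy-GA strategy.
import random
from statistics import mean

N_OBJECTIVES = 4      # cost, mass error, 1st-moment error, 2nd-moment error
N_REALIZATIONS = 20   # K-field realizations sampled per design (assumed)

def evaluate(design, rng):
    """One noisy observation of the four objectives (placeholder physics;
    the paper obtains these from MODFLOW/MT3DMS transport runs)."""
    base = [sum(design) * w for w in (1.0, 0.8, 0.6, 0.4)]
    return [b + rng.gauss(0.0, 0.1 * (1.0 + b)) for b in base]

def sample_objectives(design, rng):
    """Average the objectives over K-field realizations to damp the noise."""
    obs = [evaluate(design, rng) for _ in range(N_REALIZATIONS)]
    return [mean(col) for col in zip(*obs)]

def dominates(a, b):
    """All-minimization Pareto dominance on the averaged objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population, rng):
    """Keep the designs whose averaged objectives are non-dominated."""
    scored = [(d, sample_objectives(d, rng)) for d in population]
    return [d for d, f in scored
            if not any(dominates(g, f) for _, g in scored if g is not f)]

rng = random.Random(0)
population = [[rng.randint(0, 1) for _ in range(10)] for _ in range(30)]
print(len(pareto_front(population, rng)), "non-dominated designs")
```

Averaging over realizations before sorting is the point of the sketch: it lowers the variance of the estimated objectives so that dominance decisions are more reliable, which is the role the paper assigns to its probabilistic Pareto sorting scheme combined with the noisy-GA sampling strategy.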