Evolutionary algorithms
Computer science
Surrogate models
Mathematical optimization
Multi-objective optimization
Optimization algorithms
Algorithms
Artificial intelligence
Machine learning
Mathematics
Authors
Jeng-Shyang Pan, An-Ning Zhang, Shu-Chuan Chu, Jia Zhao, Václav Snášel
Identifier
DOI:10.1016/j.asoc.2024.111967
Abstract
Addressing expensive many-objective optimization problems (MaOPs) is a formidable challenge owing to their intricate objective spaces and high computational demands. Surrogate-assisted evolutionary algorithms (SAEAs) have gained prominence because of their ability to tackle MaOPs efficiently. They achieve this by using surrogate models to approximate objective functions, significantly reducing reliance on costly evaluations. However, the effectiveness of many SAEAs is hampered by their reliance on various surrogate models and optimization strategies, which often results in suboptimal prediction accuracy and optimization performance. This study introduces a novel approach: an activity-level-based surrogate-assisted reference vector guided evolutionary algorithm specifically designed for expensive MaOPs. Using the Kriging model and an angle penalty distance criterion, the algorithm effectively filters the solutions that require evaluation with the original function. It employs a fixed number of training sets, which are updated via a two-stage screening strategy that leverages activity levels to refine population screening. This process ensures that the reference vectors progressively align more closely with the Pareto front, aided by the deployment of adjusted adaptive reference vectors, thereby improving screening precision. The proposed algorithm was tested against six contemporary algorithms on the DTLZ, WFG, and MaF test suites. The experimental results show that the proposed method outperforms the other algorithms on most problems. Furthermore, its application to the cloud computing task scheduling problem underscores its practical value, demonstrating notable effectiveness. These outcomes attest to the robust performance of the algorithm across both benchmark scenarios and real-world applications.
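The angle penalty distance (APD) criterion mentioned in the abstract originates from the reference-vector-guided evolutionary algorithm (RVEA): each candidate is scored against a reference vector by combining its distance to the origin with a penalty on the angle between the two, with the penalty growing over the run. Below is a minimal NumPy sketch of the standard RVEA formulation; the paper's activity-level variant may differ in details, and the function name and parameters here are illustrative, not taken from the paper.

```python
import numpy as np

def apd(objs, ref_vecs, t, t_max, alpha=2.0):
    """Standard RVEA angle-penalized distance (a sketch, not the
    paper's exact variant).

    objs:     (N, M) translated objective vectors (minimization)
    ref_vecs: (K, M) unit reference vectors
    t, t_max: current and maximum generation (penalty ramps up with t)
    Returns an (N, K) matrix: APD of each solution w.r.t. each vector.
    """
    M = objs.shape[1]
    norms = np.linalg.norm(objs, axis=1, keepdims=True)         # (N, 1)
    cos = (objs @ ref_vecs.T) / np.clip(norms, 1e-12, None)     # (N, K)
    theta = np.arccos(np.clip(cos, -1.0, 1.0))                  # angle to each vector
    # gamma: smallest angle between each reference vector and its neighbors,
    # used to normalize the angular penalty per subregion
    cos_rv = np.clip(ref_vecs @ ref_vecs.T, -1.0, 1.0)
    np.fill_diagonal(cos_rv, -1.0)                              # exclude self
    gamma = np.arccos(cos_rv.max(axis=1))                       # (K,)
    penalty = M * (t / t_max) ** alpha * theta / gamma          # (N, K)
    return (1.0 + penalty) * norms
```

In a surrogate-assisted setting such as the one described, APD can rank Kriging-predicted solutions per reference vector, so that only the best candidate in each subregion is passed to the expensive original function for evaluation.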