[TEVC 2023] Survey: Learnable Evolutionary Algorithms for Scalable Multiobjective Optimization

Multiobjective evolutionary algorithms (MOEAs) are a mainstream approach to solving multiobjective optimization problems (MOPs).
Fig. 1. The general flow of an MOEA from the perspective of ML.

From the perspective of ML (evolutionary machine learning), as shown in Fig. 1, an MOEA's generator aims to reproduce offspring of superior quality relative to their parents through evolution, while its discriminator aims to distinguish the quality of parent and offspring solutions after the evaluator has assessed them, and then selects a certain number of elites to survive into the next generation.
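The generator / evaluator / discriminator loop described above can be sketched as follows. This is a minimal illustration, not the survey's reference implementation: the toy biobjective problem, the blend-crossover generator, and the dominance-count discriminator are all illustrative assumptions.

```python
import random

def evaluate(x):
    # Evaluator: a toy biobjective problem (minimize both objectives).
    f1 = sum(v * v for v in x)
    f2 = sum((v - 1.0) ** 2 for v in x)
    return (f1, f2)

def dominates(fa, fb):
    # fa dominates fb: no worse in every objective, strictly better in one.
    return all(a <= b for a, b in zip(fa, fb)) and \
           any(a < b for a, b in zip(fa, fb))

def generate(parents, sigma=0.1):
    # Generator: blend crossover of two random parents plus Gaussian mutation.
    offspring = []
    for _ in range(len(parents)):
        p, q = random.sample(parents, 2)
        offspring.append([(a + b) / 2 + random.gauss(0.0, sigma)
                          for a, b in zip(p, q)])
    return offspring

def discriminate(pop, fits, size):
    # Discriminator: rank solutions by how many others dominate them
    # (fewer dominators = better), then keep the best `size` elites.
    ranked = sorted(range(len(pop)),
                    key=lambda i: sum(dominates(fits[j], fits[i])
                                      for j in range(len(pop))))
    return [pop[i] for i in ranked[:size]]

def moea(dim=3, pop_size=20, generations=50, seed=0):
    random.seed(seed)
    pop = [[random.uniform(-1, 2) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        union = pop + generate(pop)          # generator
        fits = [evaluate(x) for x in union]  # evaluator
        pop = discriminate(union, fits, pop_size)  # discriminator
    return pop

final = moea()
```

A "learnable" MOEA, in the survey's sense, replaces one or more of these three hand-crafted components with a model trained during or before the run.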

The survey organizes its contributions as follows:

1) General definition and taxonomy of scaling-up MOPs and learnable MOEAs;
2) Efforts on learnable evolutionary discriminators for handling MOPs with a scaling-up objective space, i.e., many-objective optimization problems (MaOPs);
3) Efforts on learnable evolutionary generators for solving MOPs with a scaling-up search space, i.e., large-scale MOPs (LMOPs);
4) Efforts on learnable evaluators to alleviate the challenges posed by MOPs with a scaling-up cost of function evaluations, i.e., expensive MOPs (EMOPs);
5) Efforts on learnable evolutionary transfer modules for finding a shortcut that can improve efficacy when optimizing a scaling-up number of different MOPs sequentially or simultaneously, i.e., sequential MOPs (SMOPs) and multitasking MOPs (MMOPs).
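To make item 4 concrete, a learnable evaluator for EMOPs can be sketched as a surrogate that screens candidates cheaply so only the most promising ones receive true (expensive) evaluations. The nearest-neighbour surrogate, the stand-in objective functions, and all settings below are illustrative assumptions, not methods prescribed by the survey.

```python
import math
import random

def expensive_evaluate(x):
    # Stand-in for a costly simulation: two objectives to minimize.
    return (sum(v * v for v in x), sum((v - 1.0) ** 2 for v in x))

class NearestNeighbourSurrogate:
    """Predict a candidate's objectives from its closest archived solution."""

    def __init__(self):
        self.archive = []  # (solution, objectives) pairs from true evaluations

    def update(self, x, f):
        self.archive.append((x, f))

    def predict(self, x):
        _, f = min(self.archive, key=lambda entry: math.dist(entry[0], x))
        return f

random.seed(1)
surrogate = NearestNeighbourSurrogate()

# Seed the surrogate with a handful of true evaluations.
for _ in range(5):
    x = [random.uniform(-1, 2) for _ in range(3)]
    surrogate.update(x, expensive_evaluate(x))

# Screen 50 candidates with cheap surrogate predictions, then spend
# true evaluations only on the 5 most promising ones.
candidates = [[random.uniform(-1, 2) for _ in range(3)] for _ in range(50)]
candidates.sort(key=lambda x: sum(surrogate.predict(x)))
for x in candidates[:5]:
    surrogate.update(x, expensive_evaluate(x))
```

Practical surrogate-assisted MOEAs use stronger models (e.g., Gaussian processes or neural networks) and more careful infill criteria, but the division of labour is the same: the learned model filters, and the expensive evaluator confirms.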



Reposted from blog.csdn.net/weixin_43135178/article/details/130928741