Paper Title

Hard Problems are Easier for Success-based Parameter Control

Paper Authors

Mario Alejandro Hevia Fajardo, Dirk Sudholt

Paper Abstract

Recent works showed that simple success-based rules for self-adjusting parameters in evolutionary algorithms (EAs) can match or outperform the best fixed parameters on discrete problems. Non-elitism in a (1,$λ$) EA combined with a self-adjusting offspring population size $λ$ outperforms common EAs on the multimodal Cliff problem. However, it was shown that this only holds if the success rate $s$ that governs self-adjustment is small enough. Otherwise, even on OneMax, the self-adjusting (1,$λ$) EA stagnates on an easy slope, where frequent successes drive down the offspring population size. We show that self-adjustment works as intended in the absence of easy slopes. We define everywhere hard functions, for which successes are never easy to find, and show that the self-adjusting (1,$λ$) EA is robust with respect to the choice of success rates $s$. We give a general fitness-level upper bound on the number of evaluations and show that the expected number of generations is at most $O(d + \log(1/p_{\min}))$, where $d$ is the number of non-optimal fitness values and $p_{\min}$ is the smallest probability of finding an improvement from a non-optimal search point. We discuss implications for the everywhere hard function LeadingOnes and a new class OneMaxBlocks of everywhere hard functions with tunable difficulty.
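To make the self-adjusting mechanism in the abstract concrete, below is a minimal Python sketch of a (1,$λ$) EA with a success-based rule: $λ$ is divided by a factor $F$ after a successful generation and multiplied by $F^{1/s}$ after an unsuccessful one, with non-elitist (comma) selection and standard bit mutation. The function names, the constant F = 1.5, and the default success rate s = 0.1 are illustrative assumptions rather than settings taken from the paper, and OneMax is used as the fitness only to keep the example self-contained.

```python
import random


def onemax(x):
    """Number of 1-bits in the bit string (classic benchmark from the abstract)."""
    return sum(x)


def self_adjusting_one_comma_lambda_ea(n, fitness, target, s=0.1, F=1.5, max_evals=10**6):
    """Sketch of a self-adjusting (1,lambda) EA with a success-based rule.

    On a success (best offspring strictly better than the parent) lambda is
    divided by F; on a failure it is multiplied by F**(1/s). The parent is
    always replaced by the best offspring (non-elitist comma selection).
    """
    parent = [random.randint(0, 1) for _ in range(n)]
    parent_fit = fitness(parent)
    lam = 1.0          # real-valued lambda; rounded when sampling offspring
    evals = 1

    while parent_fit < target and evals < max_evals:
        offspring_count = max(1, round(lam))
        best, best_fit = None, float("-inf")
        for _ in range(offspring_count):
            # Standard bit mutation: flip each bit independently with rate 1/n.
            child = [b ^ (random.random() < 1.0 / n) for b in parent]
            f = fitness(child)
            evals += 1
            if f > best_fit:
                best, best_fit = child, f

        if best_fit > parent_fit:
            lam = max(1.0, lam / F)        # success: shrink offspring population
        else:
            lam = lam * F ** (1.0 / s)     # failure: grow offspring population

        parent, parent_fit = best, best_fit  # comma selection: always accept

    return parent_fit, evals


# Example run on OneMax with n = 100 and a small success rate s.
best_fit, used_evals = self_adjusting_one_comma_lambda_ea(100, onemax, target=100, s=0.1)
```

With a small success rate such as s = 0.1, failures increase $λ$ only gently, matching the regime in which the abstract states the self-adjusting (1,$λ$) EA behaves as intended.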
