FOGA 2017: FOUNDATIONS OF GENETIC ALGORITHMS XIV
PROGRAM FOR SUNDAY, JANUARY 15TH

09:00-11:00 Session 11
09:00
Quality Gain Analysis of the Weighted Recombination Evolution Strategy on General Convex Quadratic Functions

ABSTRACT. We investigate the evolution strategy with weighted recombination on general convex quadratic functions. We derive the asymptotic quality gain as the search space dimension tends to infinity, together with the optimal recombination weights and the optimal step-size. This work extends a previous study that derived the asymptotic quality gain of the evolution strategy with weighted recombination on the infinite-dimensional sphere function. Our proof differs from that of the previous study, which relies on a geometric interpretation of the algorithmic behavior in the infinite-dimensional search space. Moreover, we derive rigorous bounds for the quality gain on a general quadratic function in a finite-dimensional search space, revealing how the quality gain depends both on the eigenvalue distribution of the Hessian matrix and on the recombination weights. As the search space dimension tends to infinity, the optimal recombination weights turn out to be independent of the Hessian matrix, i.e., the recombination weights optimal for the sphere function are optimal for convex quadratic functions.
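
For illustration, here is a minimal NumPy sketch of one iteration of an evolution strategy with weighted recombination on a convex quadratic f(x) = 0.5 x^T H x. The truncation weights, the fixed step-size decay, and the Hessian below are placeholder assumptions for the sketch, not the optimal quantities derived in the paper.

    import numpy as np

    def es_step(mean, sigma, weights, H, rng):
        # Sample lambda offspring with isotropic Gaussian mutation.
        lam, n = len(weights), len(mean)
        x = mean + sigma * rng.standard_normal((lam, n))
        # Evaluate f(x) = 0.5 * x^T H x and rank offspring, best first.
        f = 0.5 * np.einsum('ij,jk,ik->i', x, H, x)
        # Weighted recombination of the ranked offspring gives the new mean.
        return weights @ x[np.argsort(f)]

    rng = np.random.default_rng(0)
    n, lam, mu = 10, 12, 6
    # Log-linear truncation weights (an assumption; the paper derives
    # the optimal weights).
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    weights = np.concatenate([w / w.sum(), np.zeros(lam - mu)])
    H = np.diag(np.logspace(0, 3, n))   # ill-conditioned convex quadratic
    mean, sigma = np.ones(n), 0.3
    for _ in range(300):
        mean = es_step(mean, sigma, weights, H, rng)
        sigma *= 0.99                   # crude decay, not the optimal step-size
    print(0.5 * mean @ H @ mean)        # objective value after 300 iterations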

10:00
Qualitative and Quantitative Assessment of Step Size Adaptation Rules

ABSTRACT. We present a comparison of step size adaptation methods for evolution strategies, covering recent developments in the field. Following Hansen et al., we formulate a concise list of performance criteria:

a) fast convergence of the mean,
b) a near-optimal fixed point of the normalized step size dynamics, and
c) invariance under the addition of constant dimensions to the objective function.

Our results show that algorithms violating these principles tend to underestimate the step size or become unreliable when the function does not fit the algorithm's tuned hyperparameters. In contrast, we find that cumulative step size adaptation (CSA) and two-point adaptation (TSA) provide reliable estimates of the optimal step size. We further find that removing the evolution path from CSA still yields a reliable algorithm without the computational requirements of CSA.
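
For context, here is a minimal sketch of the standard CSA update in NumPy, following Hansen's formulation; the learning-rate default and the variable names (p_sigma, z_w, mu_eff) are this sketch's conventions, not necessarily the paper's.

    import numpy as np

    def csa_update(p_sigma, sigma, z_w, mu_eff, n, d_sigma=1.0):
        # z_w: weighted mean of the selected mutation vectors;
        # mu_eff: variance-effective selection mass.
        c_sigma = (mu_eff + 2) / (n + mu_eff + 5)   # common default
        # Accumulate the evolution path of successive mean shifts.
        p_sigma = ((1 - c_sigma) * p_sigma
                   + np.sqrt(c_sigma * (2 - c_sigma) * mu_eff) * z_w)
        # Expected length of an n-dimensional standard normal vector.
        chi_n = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))
        # Grow sigma if the path is longer than expected, else shrink it.
        sigma *= np.exp((c_sigma / d_sigma)
                        * (np.linalg.norm(p_sigma) / chi_n - 1))
        return p_sigma, sigma

In this formulation, removing the evolution path amounts to setting c_sigma = 1, so the rule reacts only to the latest weighted step.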

11:00-11:30 Coffee Break
11:30-12:30 Session 12
11:30
On the Statistical Learning Ability of Evolution Strategies

ABSTRACT. We explore the ability of Evolution Strategies (ESs) to statistically learn the local landscape. Specifically, we consider ESs operating only with isotropic Gaussian mutations near the optimum and investigate the covariance matrix constructed from the individuals selected by truncation. Unlike previous studies, we make no assumptions on the ES adaptation scheme; e.g., neither Derandomization nor Information Geometric Optimization is assumed. We prove that the covariance matrix statistically constructed over such winning decision vectors has the same eigenvectors as the Hessian matrix. We further prove that as the population size increases, the covariance becomes proportional to the inverse of the Hessian. We also devise and corroborate an analytic approximation of this covariance matrix. Within the framework we consider, this confirms the classical hypothesis that learning the landscape is an inherent property of standard ESs, and shows that this capability stems solely from the use of isotropic Gaussian mutations and rank-based selection.
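
A toy numerical check of this construction (the dimension, sample sizes, and diagonal Hessian below are illustrative assumptions): draw isotropic Gaussian mutations near the optimum, truncation-select the best, and compare the covariance of the winners with the inverse Hessian.

    import numpy as np

    rng = np.random.default_rng(1)
    n, lam, mu, sigma = 5, 20000, 2000, 0.1
    H = np.diag([1.0, 2.0, 4.0, 8.0, 16.0])   # assumed diagonal Hessian

    # Isotropic Gaussian mutations around the optimum (the origin here).
    x = sigma * rng.standard_normal((lam, n))
    f = 0.5 * np.einsum('ij,jk,ik->i', x, H, x)
    # Truncation selection: keep the mu best decision vectors.
    winners = x[np.argsort(f)[:mu]]
    # Covariance statistically constructed from the winners.
    C = winners.T @ winners / mu

    # With a diagonal H, the eigenvectors of C align with the coordinate
    # axes, and diag(C) * diag(H) should be roughly constant if C is
    # proportional to the inverse Hessian.
    print(np.round(np.diag(C) * np.diag(H), 5))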