FOGA 2017: FOUNDATIONS OF GENETIC ALGORITHMS XIV
PROGRAM FOR SATURDAY, JANUARY 14TH


10:00-10:30 Coffee Break
10:30-11:30 Session 7
10:30
On the Use of the Dual Formulation for Minimum Weighted Vertex Cover in Evolutionary Algorithms
SPEAKER: unknown

ABSTRACT. We consider the weighted minimum vertex cover problem and investigate how its dual formulation can be exploited to design evolutionary algorithms that provably obtain a 2-approximation. Investigating multi-valued representations, we show that variants of randomized local search and the (1+1)EA achieve this goal in expected pseudo-polynomial time. In order to speed up the process, we consider the use of step size adaptation in both algorithms and show that RLS obtains a 2-approximation in expected polynomial time while the (1+1)EA still encounters a pseudo-polynomial lower bound.
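As background for the dual formulation, here is a minimal sketch of the classical primal-dual scheme that such evolutionary algorithms build on (not the paper's algorithms themselves; function and variable names are illustrative): raising each edge's dual variable until a vertex constraint becomes tight yields a cover of weight at most twice the optimum.

```python
def dual_vertex_cover(weights, edges):
    """Primal-dual 2-approximation for weighted vertex cover (sketch).

    Raise the dual variable of each edge until the weight of one of its
    endpoints is exhausted; the vertices whose constraints become tight
    form a cover of weight at most twice the optimum.
    """
    slack = dict(weights)                 # remaining weight per vertex
    for u, v in edges:
        inc = min(slack[u], slack[v])     # largest feasible dual increase
        slack[u] -= inc
        slack[v] -= inc
    return {v for v in weights if slack[v] == 0}
```

After every edge is processed, at least one of its endpoints is tight, so the returned set is a valid cover.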

11:30-12:30 Session 8
11:30
Linearly Convergent Evolution Strategies via Augmented Lagrangian Constraint Handling
SPEAKER: unknown

ABSTRACT. We present an evolution strategy for continuous black-box optimization under multiple inequality constraints, where an augmented Lagrangian approach is used to handle the constraints. We formulate sufficient conditions for the linear convergence of the proposed algorithm in the case of linear inequality constraints on the class of functions such that the augmented Lagrangian corresponding to the problem, minus its value at the optimum of the problem and the associated Lagrange multiplier, is positive homogeneous of degree $2$. This class includes convex quadratic functions as a particular case. Our analysis relies on a Markov chain approach: we construct a homogeneous Markov chain and deduce the linear convergence assuming its stability. Simulations on the sphere function and on a moderately ill-conditioned ellipsoid function empirically validate the stability of the constructed Markov chain.
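For a single inequality constraint $g(x) \le 0$ with multiplier $\gamma \ge 0$ and penalty parameter $\omega > 0$, the standard (Rockafellar-type) augmented Lagrangian referred to in such analyses reads as follows; the paper's exact formulation for multiple constraints may differ:

$$ h(x, \gamma, \omega) = f(x) + \begin{cases} \gamma\, g(x) + \dfrac{\omega}{2}\, g(x)^2 & \text{if } \gamma + \omega\, g(x) \ge 0, \\[4pt] -\dfrac{\gamma^2}{2\omega} & \text{otherwise.} \end{cases} $$

The second case makes $h$ smooth where the constraint is strongly satisfied, so the unconstrained minimizer of $h$ approaches the constrained optimum as $\gamma$ and $\omega$ are adapted.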

12:30-14:00 Lunch Break
14:00-16:00 Session 9
14:00
Lower Bounds on the Run Time of the Univariate Marginal Distribution Algorithm on OneMax
SPEAKER: unknown

ABSTRACT. The Univariate Marginal Distribution Algorithm (UMDA), a popular estimation of distribution algorithm, is studied from a run time perspective. On the classical $\mathrm{OneMax}$ benchmark function, a lower bound of $\Omega(\mu\sqrt{n}+n\log n)$, where $\mu$ is the population size, on its expected run time is proved. This is the first direct lower bound on the run time of the UMDA. It is stronger than the bounds that follow from general black-box complexity theory and is matched by the run time of many evolutionary algorithms. The results are obtained through advanced analyses of the stochastic change of the frequencies of bit values maintained by the algorithm, including carefully designed potential functions. These techniques may prove useful in advancing the field of run time analysis for estimation of distribution algorithms in general.
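For reference, a minimal sketch of the UMDA on OneMax; the parameters $\mu$, $\lambda$ and the border restriction to $[1/n, 1-1/n]$ follow the standard formulation, though the analysed variant may differ in details:

```python
import random

def umda_onemax(n, mu, lam, max_iters=1000):
    """Univariate Marginal Distribution Algorithm on OneMax (sketch).

    Maintains a frequency vector p.  Each generation samples lam
    bitstrings from p, selects the mu best by OneMax value, and sets
    each p[i] to the frequency of ones at position i among the selected
    individuals, restricted to the borders [1/n, 1 - 1/n].
    """
    p = [0.5] * n
    for _ in range(max_iters):
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        pop.sort(key=sum, reverse=True)       # OneMax fitness: count of ones
        if sum(pop[0]) == n:                  # optimum sampled
            return pop[0]
        selected = pop[:mu]
        for i in range(n):
            freq = sum(x[i] for x in selected) / mu
            p[i] = min(max(freq, 1 / n), 1 - 1 / n)
    return pop[0]
```

The lower bound in the abstract concerns exactly the stochastic drift of the frequencies `p[i]` maintained by this loop.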

15:00
Analysis of the Clearing Diversity-Preserving Mechanism
SPEAKER: unknown

ABSTRACT. Clearing is a niching method inspired by the principle of assigning the available resources within a subpopulation to a single individual: the winner. The clearing procedure supplies these resources only to the best individual of each subpopulation. So far, its analysis has focused on experimental approaches, which have shown that clearing is a powerful diversity mechanism. We use empirical analysis to highlight some of the characteristics that make it a useful mechanism, and runtime analysis to explain how and why it is a powerful method. We prove that a $(\mu+1)$ EA with a large enough population size and a phenotypic distance function always succeeds in optimising all functions of unitation for small niches in polynomial time, while a genotypic distance function requires exponential time. Finally, we prove that a $(\mu+1)$ EA with phenotypic and genotypic distances is able to find both optima of TwoMax for large niches in expected polynomial time.
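A minimal sketch of the clearing procedure with niche capacity one, using a phenotypic distance on bitstrings; the function names and the winner-take-all capacity are illustrative assumptions:

```python
def clearing(population, fitness, dist, sigma):
    """Clearing with niche capacity 1 (sketch).

    Individuals are processed in decreasing fitness order; the first
    individual of each niche (the winner) keeps its fitness, and every
    later individual within distance sigma of a winner is cleared,
    i.e. its fitness is reset to 0.
    """
    ranked = sorted(population, key=fitness, reverse=True)
    winners, cleared_fitness = [], {}
    for x in ranked:
        if any(dist(x, w) <= sigma for w in winners):
            cleared_fitness[x] = 0.0        # resources go to the winner only
        else:
            winners.append(x)               # niche winner keeps its fitness
            cleared_fitness[x] = fitness(x)
    return cleared_fitness
```

For functions of unitation, the phenotypic distance between two bitstrings is simply the difference in their number of ones, which is what makes small niches effective in that setting.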

16:00-16:30 Coffee Break
16:30-17:30 Session 10
16:30
Resampling vs Recombination: a Statistical Run Time Estimation
SPEAKER: unknown

ABSTRACT. Noise is pervasive in real-world optimization, but there is still little understanding of the interplay between the operators of randomized search heuristics and explicit noise-handling techniques, such as statistical resampling. In this paper, we report on several statistical models and theoretical results that help to clarify this reciprocal relationship for a collection of randomized search heuristics on noisy functions.

We consider the optimization of pseudo-Boolean functions under additive posterior Gaussian noise and explore the trade-off between noise reduction and the computational cost of resampling. We first perform experiments to find the optimal parameters at a given noise intensity for a mutation-only evolutionary algorithm, a genetic algorithm employing recombination, an estimation of distribution algorithm (EDA), and an ant colony optimization algorithm. We then observe how the optimal parameter depends on the noise intensity for the different algorithms. Finally, we locate the point where statistical resampling costs more than it is worth in terms of run time. We find that the mutation-only EA requires the highest number of resamples to obtain the best speed-up, whereas crossover reduces both the run time and the number of resamples required. Most surprisingly, we find that EDA-like algorithms require no resampling and can handle noise implicitly.
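The basic trade-off the paper quantifies can be sketched as follows: averaging $k$ independent evaluations of a fitness value corrupted by additive Gaussian noise divides the noise variance by $k$, at $k$ times the evaluation cost (function names below are illustrative):

```python
import random
from statistics import fmean

def noisy_onemax(x, sigma):
    """OneMax value corrupted by additive posterior Gaussian noise."""
    return sum(x) + random.gauss(0.0, sigma)

def resampled_fitness(x, sigma, k):
    """Average of k independent noisy evaluations: the noise standard
    deviation shrinks by a factor sqrt(k), at k times the cost."""
    return fmean(noisy_onemax(x, sigma) for _ in range(k))
```

Whether this cost pays off depends on how sensitive the search heuristic is to misranked offspring, which is exactly where the algorithms compared in the abstract differ.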