FOGA 2017: FOUNDATIONS OF GENETIC ALGORITHMS XIV
PROGRAM FOR FRIDAY, JANUARY 13TH


09:30-10:30 Session 2
09:30
An Application of Stochastic Differential Equations to Evolutionary Algorithms

ABSTRACT. There has been renewed interest in modelling the behaviour of evolutionary algorithms by more traditional mathematical objects, such as ordinary differential equations or Markov chains. The advantage is that the analysis is greatly facilitated by the existence of well-established methods; however, this typically comes at the cost of disregarding information about the process. Here, we introduce the use of stochastic differential equations (SDEs) for the study of EAs. Unlike Markov chains, which yield rigorous but often unwieldy expressions about the dynamics, SDEs can produce simple analytical results for the dynamics of stochastic processes; and unlike ordinary differential equations (ODEs), they do not discard information about the stochasticity of the process. We show that SDEs are especially suitable for the analysis of fixed-budget scenarios and present analogs of the additive and multiplicative drift theorems for SDEs. We exemplify the use of these methods for two model algorithms ((1+1) EA and RLS) on two canonical problems (OneMax and LeadingOnes).
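
For context, the classical additive drift theorem states that if a non-negative process satisfies $\mathrm{E}[X_t - X_{t+1} \mid X_t] \ge \delta$ whenever $X_t > 0$, then the expected hitting time of $0$ is at most $X_0/\delta$; the paper presents SDE analogs of this and of multiplicative drift. The two model algorithms are standard and easy to state; below is a minimal Python sketch using the textbook definitions (not code from the paper):

```python
import random

def onemax(x):
    # OneMax: the number of one-bits in the bitstring.
    return sum(x)

def one_plus_one_ea(n, max_iters=100_000):
    # (1+1) EA on OneMax: flip each bit independently with probability 1/n,
    # accept the offspring if it is at least as good.
    x = [random.randint(0, 1) for _ in range(n)]
    for t in range(max_iters):
        if onemax(x) == n:
            return t  # number of iterations until the optimum was found
        y = [b ^ (random.random() < 1.0 / n) for b in x]
        if onemax(y) >= onemax(x):
            x = y
    return max_iters

def rls(n, max_iters=100_000):
    # RLS on OneMax: flip exactly one uniformly chosen bit per iteration.
    x = [random.randint(0, 1) for _ in range(n)]
    for t in range(max_iters):
        if onemax(x) == n:
            return t
        y = x[:]
        y[random.randrange(n)] ^= 1
        if onemax(y) >= onemax(x):
            x = y
    return max_iters
```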

10:30-11:00 Coffee Break
11:00-12:00 Session 3
11:00
Runtime Analysis of a Discrete Particle Swarm Optimization Algorithm on Sorting and OneMax

ABSTRACT. We present a discrete particle swarm optimization (PSO) algorithm that works on a broad class of discrete optimization problems. Assuming a black-box setting, we prove upper and lower bounds on the expected number of function evaluations required by the proposed algorithm to solve the sorting problem and the problem of maximizing the number of ones in a bitstring, i.e., the function OneMax. We show that, depending on the probability of moving towards the attractor, the expected optimization time may be polynomial or exponential. The cornerstone of our analysis is a set of Theta-bounds on the expected time it takes until the PSO returns to the attractor. We obtain these bounds by solving linear recurrence equations with constant and non-constant coefficients. We also introduce a useful indistinguishability property of states of a Markov chain in order to obtain lower bounds on the expected optimization time of our proposed PSO algorithm.
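
The abstract does not spell out the move rule, so the following Python sketch is only a guess at the kind of attractor-based step it describes; the single-particle setup and the parameter p (probability of moving towards the attractor) are assumptions:

```python
import random

def pso_step(x, attractor, p):
    # Hypothetical attractor-based move on a bitstring: with probability p,
    # copy one bit in which x differs from the attractor (move towards it);
    # otherwise flip a uniformly random bit (explore / move away).
    y = x[:]
    diff = [i for i in range(len(x)) if x[i] != attractor[i]]
    if diff and random.random() < p:
        i = random.choice(diff)
        y[i] = attractor[i]
    else:
        y[random.randrange(len(x))] ^= 1
    return y
```

In a scheme of this shape, the abstract's polynomial-versus-exponential dichotomy corresponds to how p trades off attraction against random exploration.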

12:00-13:30 Lunch Break
13:30-15:30 Session 4
13:30
Convergence of Factored Evolutionary Algorithms

ABSTRACT. Factored Evolutionary Algorithms (FEA) have been found to be an effective way to optimize single-objective functions by partitioning the variables in the function into overlapping subpopulations, or factors. While several works empirically evaluate FEA, very little literature explores FEA's theoretical properties. In this paper, we prove that the final solution returned by FEA converges to a single point. Additionally, we show that this point in the search space may be suboptimal. However, we demonstrate empirically that when specific factor architectures are used, the probability of converging to such suboptimal points approaches zero. Finally, we show how FEA is able to outperform hybrid versions of Cooperative Coevolutionary Evolutionary Algorithms.
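
The factor construction is the distinctive ingredient. Below is a hedged Python sketch of one common overlapping-window architecture and a much-simplified optimization round; the window width, the random-search stand-in for the per-factor optimizer, and the function names are illustrative assumptions, not the paper's algorithm:

```python
import random

def make_overlapping_factors(n, width=3):
    # Cover variables 0..n-1 with overlapping windows ("factors"),
    # e.g. {0,1,2}, {2,3,4}, {4,5,6}, ... so adjacent factors share a variable.
    step = width - 1
    return [list(range(i, min(i + width, n))) for i in range(0, n - 1, step)]

def fea_round(x, factors, f, trials=10):
    # One simplified round: each factor locally optimizes only its own
    # variables against the shared full solution x, and improvements are
    # folded back into x. Random search stands in for the factor optimizer.
    for factor in factors:
        best = x[:]
        for _ in range(trials):
            cand = x[:]
            for i in factor:
                cand[i] = random.randint(0, 1)
            if f(cand) > f(best):
                best = cand
        x = best
    return x
```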

14:30
Analysis of the (1+1) EA on Subclasses of Linear Functions under Uniform and Linear Constraints

ABSTRACT. Linear functions have gained a lot of attention in the area of runtime analysis of evolutionary computation methods, and the corresponding analyses have provided many effective tools for analyzing more complex problems. In this paper, we consider the behavior of the classical (1+1) Evolutionary Algorithm for linear functions under a given linear constraint. We study cases where the objective function, the constraint, or both are given by the function OneMax, and present upper and lower bounds for the different subclasses of problems. Furthermore, we also consider the LeadingOnes fitness function.
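
As a concrete instance of such a setting (one plausible reading, not the paper's exact formulation), here is a Python sketch of the (1+1) EA maximizing OneMax under a uniform constraint sum(x) <= B; the rejection-based handling of infeasible offspring is an assumption:

```python
import random

def one_plus_one_ea_constrained(n, B, max_iters=100_000):
    # Maximize OneMax(x) subject to the uniform constraint sum(x) <= B,
    # assuming B <= n. Infeasible offspring are rejected; penalty-based
    # constraint handling would be an equally plausible reading.
    x = [0] * n  # the all-zeros string is always feasible
    for t in range(max_iters):
        if sum(x) == B:
            return x, t  # optimum under the constraint
        y = [b ^ (random.random() < 1.0 / n) for b in x]
        if sum(y) <= B and sum(y) >= sum(x):
            x = y
    return x, max_iters
```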

15:30-16:00 Coffee Break
16:00-17:00 Session 5
16:00
Hypervolume Subset Selection for Triangular and Inverted Triangular Pareto Fronts of Three-Objective Problems

ABSTRACT. Hypervolume subset selection is the problem of finding a pre-specified number of solutions that maximize the hypervolume indicator. The optimal distribution of solutions on the Pareto front has been theoretically studied for two-objective problems in the literature. In this paper, we discuss hypervolume subset selection for three-objective problems with triangular and inverted triangular Pareto fronts. Our contribution is to show that the effect of the reference point for hypervolume calculation on the optimal distribution of solutions differs between triangular and inverted triangular Pareto fronts: when the reference point is far from the Pareto front, most solutions lie on the sides of an inverted triangular Pareto front, whereas they are evenly distributed over the entire triangular Pareto front. These properties seem to hold in the case of four or more objectives as well. We also show that the effect of the reference point on the optimal distribution differs between maximization and minimization problems with triangular Pareto fronts. This follows from the fact that a maximization problem with a triangular Pareto front is equivalent to a minimization problem with an inverted triangular Pareto front. The optimal distribution of solutions is also discussed for the case where the reference point is very close to the Pareto front.
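
The selection problem itself is easy to state in code. Below is a hedged Python sketch of the standard greedy heuristic for hypervolume subset selection, restricted to two objectives (minimization), where the hypervolume of a mutually non-dominated set is cheap to compute; the paper's three-objective analysis requires more machinery:

```python
def hypervolume_2d(points, ref):
    # Area dominated by a mutually non-dominated set of 2-objective
    # minimization points with respect to the reference point ref.
    pts = sorted(points)  # ascending in f1, hence descending in f2
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def greedy_hss(points, k, ref):
    # Greedy hypervolume subset selection: repeatedly add the point with
    # the largest marginal hypervolume contribution. Assumes k <= len(points)
    # and that all points are mutually non-dominated.
    chosen = []
    for _ in range(k):
        best = max((p for p in points if p not in chosen),
                   key=lambda p: hypervolume_2d(chosen + [p], ref))
        chosen.append(best)
    return chosen
```

For example, greedy_hss([(1, 4), (2, 2), (4, 1)], 2, ref=(5, 5)) returns a two-point subset whose dominated area is maximal here. Greedy selection is not optimal in general, but the hypervolume indicator is a monotone submodular set function, so the greedy subset achieves at least a (1 - 1/e) fraction of the optimal hypervolume.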