GA against Pure Chance

The empirical results so far suggest that pure chance works much better than genetic operators without fitness feedback.

As a next step we will compare a GA with fitness feedback against pure chance. The problem with this comparison is that a fitness function presupposes the existence of a 'real' set of 'good values' before it can be defined at all. How to handle this problem on a theoretical meta level is an interesting question for later; here we need concrete examples.

We will start with the introductory example from above, where the simple fitness function $ y=x^2$ had been used. This function defines a subset of the set of natural numbers $ \cal{N}$. Every time the evolving system produces as output an $ x \in \cal{N}$, the fitness function responds with the fitness value $ x^2 \in \cal{N}$. The existence of this fitness function presupposes a certain kind of special knowledge which is used for its construction; without this knowledge it would not be possible to define the fitness function.
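As a minimal sketch (in Python, which the text does not prescribe), the environment's response can be written as a single function; the name `fitness` is a hypothetical choice:

```python
# Hypothetical sketch: the environment answers every output x of the
# evolving system with the fitness value f*(x) = x^2.

def fitness(x: int) -> int:
    """Fitness feedback of the environment for f* = x^2."""
    return x * x

# Example responses of the environment for a few outputs:
responses = [fitness(x) for x in [0, 1, 2, 3]]
print(responses)  # [0, 1, 4, 9]
```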

The interesting question now is whether the mechanisms of a GA, supported by a certain fitness function, can improve to the point that they work at least with the same 'efficiency' and 'correctness' as the pure-chance method. Here we make the following general assumption: a fitness function $ f^{*}$ always belongs to an environment $ E^{*}$, and every acting system, with ($ s^{+GA}$) or without ($ s^{-GA}$) a GA, interacts with the structure $ \langle E^{*}, f^{*} \rangle$. Thus even a system without genetic operators has to deal with the fitness function $ f^{*}$, because the fitness function represents the behavior of the environment.

We will prepare the following experiment:

  1. Environment: We assume an environment structure $ \langle E^{*}, f^{*} \rangle$ which behaves according to the assumed fitness function $ f^{*}$. This behavior is the same for both competing systems. In the first experiment we assume the fitness function $ f^{*}=x^2$.
  2. System $ s^{+GA}$: Take a system with a GA using crossover, with or without mutation. At the end of each cycle, selection is done based on the fitness values provided by the environment, crossover is applied either alone or in combination with mutation, and the next cycle starts. This induces a weak overall 'memory' effect.
  3. System $ s^{-GA}$: Take a system with pure chance. This results in a sequence of populations where the members of each cycle are selected completely at random. Although the fitness function $ f^{*}$ of the environment will provide fitness values for these populations too (and therefore one can compare the 'fitness state' of both systems), these values have no special 'effect', because the individuals of the next cycle's population are in every case selected completely at random. There is no 'memory' effect.

Gerd Doeben-Henisch 2012-03-31