Genetic algorithms (GAs) are a branch of computing that uses a process similar to biological evolution to discover good solutions to problems that have many possible solutions, some better than others. GAs are generally used where the fitness function (that is to say, the function that defines how good a solution is) is far too complex to work backwards from to a good or best solution. Robocode is a Java teaching game played by programming the behaviour of simple fighting robots; it provides a good place to experiment with GAs, as there is an essentially infinite number of possible bots, whose fitness can only be measured by pitting them against each other in battles.

One of the biggest decisions was how to encode bot behaviours as genomes and then have the bots act on them. To avoid having to repeatedly compile robot behaviours, I wrote EvolveBot; it reads a text file and decodes it into a series of commands for each of the main robot methods (run, onScannedRobot, onHitByBullet etc.). To enable several different genomes to exist simultaneously, I then created lots of bots that inherit everything from EvolveBot except their name; this means that they can load genome files from different locations while using identical machinery to parse them. While this approach does work well, it's not necessarily the most efficient way to work. Battles could be made faster by compiling the bots, at the cost of compiling each bot once per generation and a loss of flexibility; since running the battles is the most time-consuming part of the program, this is a good place to look for significant speed improvements.
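The genome-decoding idea can be sketched roughly as follows. This is an illustrative sketch, not the actual EvolveBot code: the file format, class name, and method are all assumptions. The parser maps each robot event handler named in the file to its list of commands, so every EvolveBot subclass needs nothing but its own genome location.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch (not the real EvolveBot format): assume a genome file
// maps each robot event handler to a semicolon-separated command list, e.g.
//   run: ahead 100; turnRight 90
//   onScannedRobot: fire 2
public class GenomeParser {

    // Parse genome lines into handler-name -> command-list.
    public static Map<String, List<String>> parse(List<String> lines) {
        Map<String, List<String>> handlers = new HashMap<>();
        for (String line : lines) {
            int colon = line.indexOf(':');
            if (colon < 0) continue; // skip blank or malformed lines
            String handler = line.substring(0, colon).trim();
            List<String> commands = new ArrayList<>();
            for (String cmd : line.substring(colon + 1).split(";")) {
                if (!cmd.isBlank()) commands.add(cmd.trim());
            }
            handlers.put(handler, commands);
        }
        return handlers;
    }

    public static Map<String, List<String>> load(Path genomeFile) throws IOException {
        return parse(Files.readAllLines(genomeFile));
    }
}
```

The shared machinery then dispatches the decoded commands from the corresponding Robocode event methods, which is what lets the name-only subclasses behave entirely differently without recompilation.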

Once I had EvolveBot written, along with the associated GeneticCode (to handle genetic operations) and Lineage (to allow familial links to be tracked), I had to write the fitness function. This job was already mostly done for me; Robocode includes the RobocodeEngine class, which provides a simple way to run battles. RobotLeague takes an array of evolved bots plus an array of ‘yardstick bots’ against which they should be measured; it will also optionally pit each evolved bot against each other evolved bot. ScoreKeeper receives the detailed outcomes of each battle and distils them down into a ‘modified score percentage’: the bot’s score as a fraction of the total score, with one added to the top and bottom of the fraction. The reason for the added ones is that very bad bots will typically get incredibly low scores; this modification allows such bots to be differentiated, since causing opponents to score lower gives a higher value.
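A minimal sketch of that metric, assuming the fraction in question is the bot’s share of the total battle score (the class and method names here are illustrative, not the actual ScoreKeeper code):

```java
// Sketch of the 'modified score percentage' described above: a bot's share of
// the total score, with 1 added to both numerator and denominator. Even a
// zero-scoring bot is then differentiated, because holding its opponents to
// a lower total score yields a higher value.
public class ScoreMath {
    public static double modifiedScorePercentage(double botScore, double totalScore) {
        return 100.0 * (botScore + 1.0) / (totalScore + 1.0);
    }
}
```

For example, two bots that both score 0 can still be ranked: one whose opponents scored 50 in total gets 100/51 ≈ 1.96, beating one whose opponents scored 200 (100/201 ≈ 0.50).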

After putting all this machinery in place, it was fairly straightforward to write RobotBreeder; it sits in an infinite loop until it is terminated, breeding generation after generation of new bots. It enters each generation into a RobotLeague, then receives back a ranked list of bots which it uses to weight the random selection of parents for the next generation. There is some additional code to prohibit incest, which prevents the gene pool becoming too limited by inbreeding. It also records the winner of each generation for further analysis and writes a log of the modified percentage scores of both the winner of each generation and the average for the generation, as well as a few other values that can be useful.
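Fitness-weighted parent selection of this kind is usually implemented as a roulette wheel; the sketch below shows the idea under that assumption (names are illustrative, and the incest check here is simplified to “not the same bot twice”, where the real code consults Lineage):

```java
import java.util.Random;

// Sketch of roulette-wheel parent selection: each bot's chance of being
// picked is proportional to its fitness (modified score percentage).
public class ParentPicker {
    private final Random rng;

    public ParentPicker(Random rng) { this.rng = rng; }

    // Returns the index of a parent chosen with probability
    // proportional to fitnesses[i].
    public int pick(double[] fitnesses) {
        double total = 0;
        for (double f : fitnesses) total += f;
        // Spin the wheel: land in [0, total) and find the slice it falls in.
        double r = rng.nextDouble() * total;
        double cumulative = 0;
        for (int i = 0; i < fitnesses.length; i++) {
            cumulative += fitnesses[i];
            if (r < cumulative) return i;
        }
        return fitnesses.length - 1; // guard against floating-point rounding
    }

    // Re-draw the second parent until it differs from the first; a real
    // incest check would inspect ancestry (Lineage), not just indices.
    public int[] pickPair(double[] fitnesses) {
        int mother = pick(fitnesses);
        int father;
        do { father = pick(fitnesses); } while (father == mother);
        return new int[] { mother, father };
    }
}
```

Weighting by fitness rather than always breeding the top two keeps some diversity in the pool, which matters for the gene-pool behaviour discussed below.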

Running 5000 generations of 50 bots each against spinbot showed promising results [csv]:



Generation winners have plateaued at around 90% while generation averages sit at around 50%. Counter-intuitively, this substantially lower average is actually promising: it is a sign that there is probably still incompatibility in the gene pool. Even after over a thousand generations with results that can’t be improved upon, new combinations are still being tried rather than the population getting stuck in an evolutionary dead end.

The full code is here and the full results for the 5000 generation run against spinbot (including all 5000 generation winners) are in this commit.